If you're a systemd user, you can easily run canto-daemon on startup:
$ systemctl --user enable canto-daemon
Or start it manually with:
$ systemctl --user start canto-daemon
By default, user sessions start on login and end on logoff, stopping any daemons they own. This is good behavior, but if you don't want canto-daemon to stop when you log out (so it can keep gathering news), enable "lingering" for your account:
$ loginctl enable-linger
This will start one session for you on boot that will last until shutdown.
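If your distribution doesn't ship a user unit for canto-daemon, a minimal sketch of one might look like the following. The path in ExecStart is an assumption; check where your package actually installs the binary, and whether the daemon forks (in which case you may need Type=forking).

```ini
# ~/.config/systemd/user/canto-daemon.service
# Hypothetical unit file sketch; adjust ExecStart to your install path.
[Unit]
Description=Canto feed daemon

[Service]
ExecStart=/usr/bin/canto-daemon
Restart=on-failure

[Install]
WantedBy=default.target
```

After dropping the file in place, run `systemctl --user daemon-reload` before enabling it.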
I'm currently noticing that feeds are vanishing, even though keep_unread is set to True.
Or is this option gone in 0.9?
(On another note: is there somewhere all options can be seen? Tab-completing through all of them isn't very effective :D. Maybe a skeleton file or example config could be added as a starting point.)
The .canto-ng/feeds file grows and grows until it reaches the point where I can't download it over ssh anymore, and the client breaks and stops working. Then I delete the file and it all starts over again.
I guess this is because of the option I set:
:keep_unread = [True|False]
I never read news through the reader itself; I pick interesting topics and press 'g' (open in a web browser). Is that why the daemon never deletes the old stuff? Will it delete the old stuff if I set this to False? And will I miss anything if I set it to False while the daemon runs 24/7 on my Raspberry Pi server?
Btw, I am marking all the news as 'read' (they change color and disappear from the client view, which is configured with a filter).
I've hit this problem twice now; I don't know how I managed it.
When I try to start canto-curses, all I get is an empty screen.
Checking the daemon log shows the following:
02:42:43 : SHELF -> Failed to JSON load, old shelf?
02:42:43 : SHELF -> Failed to migrate old shelf: db type could not be determined
Traceback (most recent call last):
File "/usr/lib/python3.5/site-packages/canto_next/storage.py", line 48, in open
self.cache = json.load(fp)
File "/usr/lib/python3.5/json/__init__.py", line 265, in load
return loads(fp.read(),
File "/usr/lib/python3.5/gzip.py", line 274, in read
return self._buffer.read(size)
File "/usr/lib/python3.5/gzip.py", line 461, in read
if not self._read_gzip_header():
File "/usr/lib/python3.5/gzip.py", line 409, in _read_gzip_header
raise OSError('Not a gzipped file (%r)' % magic)
OSError: Not a gzipped file (b'f\xec')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/bin/canto-daemon", line 7, in <module>
c = CantoBackend()
File "/usr/lib/python3.5/site-packages/canto_next/canto_backend.py", line 123, in __init__
self.get_storage()
File "/usr/lib/python3.5/site-packages/canto_next/canto_backend.py", line 796, in get_storage
self.shelf = CantoShelf(self.feed_path)
File "/usr/lib/python3.5/site-packages/canto_next/storage.py", line 27, in __init__
self.open()
File "/usr/lib/python3.5/site-packages/canto_next/feed.py", line 102, in _fl
return fn(*args)
File "/usr/lib/python3.5/site-packages/canto_next/storage.py", line 53, in open
s = shelve.open(self.filename, "r")
File "/usr/lib/python3.5/shelve.py", line 243, in open
return DbfilenameShelf(filename, flag, protocol, writeback)
File "/usr/lib/python3.5/shelve.py", line 227, in __init__
Shelf.__init__(self, dbm.open(filename, flag), protocol, writeback)
File "/usr/lib/python3.5/dbm/__init__.py", line 88, in open
raise error[0]("db type could not be determined")
dbm.error: db type could not be determined
Removing the feeds file makes canto responsive again, but the question remains: how could it end up like this?
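For diagnosing this kind of corruption, a small sketch like the following can tell you whether a feeds file is still the gzipped JSON the log suggests canto expects, or something else entirely (the function name and classification labels are just for illustration):

```python
import gzip
import json

GZIP_MAGIC = b"\x1f\x8b"  # first two bytes of every gzip stream

def sniff_shelf(path):
    """Classify a feeds file as 'gzip-json', 'json', or 'unknown'."""
    with open(path, "rb") as fp:
        magic = fp.read(2)
    if magic == GZIP_MAGIC:
        # Looks gzipped; verify the payload actually parses as JSON.
        try:
            with gzip.open(path, "rt", encoding="utf-8") as fp:
                json.load(fp)
            return "gzip-json"
        except (OSError, ValueError):
            return "unknown"
    # Not gzipped: maybe plain JSON, maybe a dbm database, maybe corrupt.
    try:
        with open(path, "r", encoding="utf-8") as fp:
            json.load(fp)
        return "json"
    except (UnicodeDecodeError, ValueError):
        return "unknown"
```

A file that sniffs as "unknown" (like the one above, whose first bytes were b'f\xec') is neither format, which matches both the "Not a gzipped file" and the "db type could not be determined" failures in the traceback.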
Hi, I just installed canto, and I think this must be very simple to solve, but I could not find any reference to it.
This is what I get in my daemon-log:
18:50:22 : CANTO-FETCH -> Empty feed, attempt to update.
18:50:22 : CANTO-FETCH -> Empty feed, attempt to update.
18:50:22 : CANTO-FETCH -> Empty feed, attempt to update.
18:50:22 : CANTO-FETCH -> Empty feed, attempt to update.
18:50:22 : CANTO-FETCH -> ERROR: try to parse http://rss.slashdot.org/slashdot/Slashdot, got decoding str is not supported
18:50:22 : CANTO-FETCH -> ERROR: try to parse http://reddit.com/.rss, got decoding str is not supported
18:50:22 : CANTO-FETCH -> ERROR: try to parse http://codezen.org/canto-ng/feed/, got decoding str is not supported
I added one new URL and it has the same problem.
Thank you,
EDIT:
I have traced the problem to feedparser, but I could not solve it.
I first thought it was a str vs. bytes problem, so I appended .encode() to the URL. Then the error became:
17:24:06 : CANTO-FETCH -> Empty feed, attempt to update.
17:24:06 : CANTO-FETCH -> No content in http://codezen.org/canto-ng/feed/: :2:-1: Document is empty
and it's like this with every possible feed.
Then I tried to use feedparser directly, following the examples in http://pythonhosted.org/feedparser/introduction.html, but those did not work either:
example 1:
import feedparser
d = feedparser.parse('http://feedparser.org/docs/examples/atom10.xml')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib64/python3.2/site-packages/feedparser-5.1.3-py3.2.egg/feedparser.py", line 3745, in parse
saxparser.parse(source)
File "/usr/lib64/python3.2/site-packages/drv_libxml2.py", line 189, in parse
eltName = (_d(reader.NamespaceUri()),
File "/usr/lib64/python3.2/site-packages/drv_libxml2.py", line 70, in _d
return _decoder(s)[0]
File "/usr/lib64/python3.2/encodings/utf_8.py", line 16, in decode
return codecs.utf_8_decode(input, errors, True)
TypeError: 'str' does not support the buffer interface
example 2:
import feedparser
d = feedparser.parse('http://feedparser.org/docs/examples/atom10.xml'.encode())
d
{'feed': {}, 'encoding': 'utf-8', 'bozo': 1, 'version': '', 'namespaces': {}, 'entries': [], 'bozo_exception': SAXParseException('Do cument is empty\n',)}
See the difference when using bytes and str.
I have canto-next and feedparser from their git repo.
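The TypeError above is just the general Python 3 rule that decoders want bytes, not str. This unrelated stdlib snippet (not feedparser code) reproduces the same failure mode that drv_libxml2 hits in the traceback:

```python
import codecs

# Decoding bytes works: utf_8_decode expects a bytes-like object
# and returns (decoded_str, bytes_consumed).
text, consumed = codecs.utf_8_decode(b"canto", "strict", True)
assert text == "canto"

# Passing an already-decoded str raises a TypeError, like the
# traceback above (the exact message wording varies by version).
try:
    codecs.utf_8_decode("canto", "strict", True)
except TypeError:
    print("str is rejected; only bytes can be decoded")
```

This is why `.encode()` on the URL changed the symptom rather than fixing it: the type mismatch is inside the parser's decoding path, not in the argument you pass in.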
I'm getting severe delays and outright non-responsiveness when running canto-daemon on ARM. It manifests as the server eating huge resources by spawning a large number of threads, and client communication becomes slow to the point of unusability.
It would be great to have a hook that fires on pressing 'g' (i.e., when opening an article in a web browser). Then one could build one's own RSS feed from the items one reads (I'm basically posting cool articles to my friends and it takes a lot of time... it would be quite an optimisation to just post them an RSS feed :D)
And I have a question about ssh: it takes too long to get content, and everything lags (over LAN). Should it be like this?
I did on server:
canto-daemon -p 31000
I did on client:
canto-curses -a 192.168.100.33 -p 31000
(I am connecting just via LAN, not via ssh. The question is how to do it via ssh, and whether it will be faster than LAN, etc.)
UPD: canto-curses also sometimes fails with errno 111 (connection refused).
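If you want to go through ssh instead of a raw LAN socket, the usual approach (not canto-specific; the host and port here are just the ones from this post) is a local port forward:

```shell
# Forward local port 31000 to the daemon's port on the server.
ssh -L 31000:localhost:31000 user@192.168.100.33

# Then, in another terminal, point the client at the tunnel:
canto-curses -a localhost -p 31000
```

Note that this won't be faster than a plain LAN connection; ssh adds encryption overhead. Its value is that it works from outside the LAN and doesn't require exposing the daemon's port.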
So when I'm at another house with friends who use MS Windows, I need to install VirtualBox, Linux, and canto-curses just to read the news, while a friend of mine grins evilly and reads his RSS with an Android app. I want to use RSS on the go (on my smartphone or another OS), but that's impossible with canto (because it's ncurses- and Linux-only). Is it possible to implement, or write a plugin for, compatibility with other RSS readers?
Using canto-daemon 0.8.2 on Arch Linux on a laptop. When started, it uses at least one CPU at 100% for at least 5 minutes, and also while fetching feeds. I wasn't paying attention until now, when I spend more time on battery.
htop screenshot: http://gallery.zebulj.si/var/albums/Razno/screenshot-2013-07-18.jpg?m=1374167699
canto-daemon log: http://zebulj.si/ajaxp/data/public/180638.php
As in the title, I'd like to have something like default_keep in 0.7, to keep a limited number of items.
I've seen the keep (and rate, but that's not related to this issue) attributes in CantoFeed, but I don't see right away where they are used. Setting them does not seem to make a difference. Is this not implemented yet?
Also, are read items ever removed? Is there something akin to never_discard from the previous version?
1:
I have a config file like this:
https://gist.github.com/3765491
Every time I open canto-curses, it overwrites my new config file with the default config file. Really crazy.
2:
And I feel the new style of adding (writing) feeds is really bad: it gets way too long if I have many feeds. Can canto separate the feeds into a separate file?
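One workaround sketch, assuming your canto-next version ships a `canto-remote` tool with an `addfeed` command (check `canto-remote help` first; the file name here is made up): keep your URLs in a plain text file and feed them to the running daemon in a loop, instead of listing them all in the config.

```shell
# Hypothetical: one feed URL per line in ~/feeds.txt.
while read -r url; do
    canto-remote addfeed "$url"
done < ~/feeds.txt
```

This keeps the list of subscriptions in its own file while the daemon still stores them in its config internally.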
Traceback (most recent call last):
File "/usr/bin/canto-daemon", line 7, in <module>
c = CantoBackend()
File "/usr/lib/python3.10/site-packages/canto_next/canto_backend.py", line 125, in __init__
self.get_config()
File "/usr/lib/python3.10/site-packages/canto_next/canto_backend.py", line 794, in get_config
config.parse()
File "/usr/lib/python3.10/site-packages/canto_next/config.py", line 135, in parse
self.read_config()
File "/usr/lib/python3.10/site-packages/canto_next/config.py", line 152, in read_config
self.json = json.load(c)
File "/usr/lib/python3.10/json/__init__.py", line 293, in load
return loads(fp.read(),
File "/usr/lib/python3.10/json/__init__.py", line 346, in loads
return _default_decoder.decode(s)
File "/usr/lib/python3.10/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib/python3.10/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 13 column 5 (char 296)
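An "Expecting value: line 13 column 5" from the json module usually means a trailing comma, a bare word, or a comment in the config file. A small sketch like this (the function name is just for illustration) will point at the offending line using the lineno/colno attributes of JSONDecodeError:

```python
import json

def locate_json_error(path):
    """Return None if the file parses, else (lineno, colno, msg, bad_line)."""
    with open(path, "r", encoding="utf-8") as fp:
        text = fp.read()
    try:
        json.loads(text)
        return None
    except json.JSONDecodeError as e:
        lines = text.splitlines()
        # Guard against errors reported past the last line (e.g. empty file).
        bad_line = lines[e.lineno - 1] if e.lineno <= len(lines) else ""
        return (e.lineno, e.colno, e.msg, bad_line)
```

Running it on the config from this traceback should point straight at line 13, column 5; fixing that one token (most likely a trailing comma before a closing brace) lets the daemon start again.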