[BUG]: Cannot fetch some of the feeds #1413

Open
TheExceeder opened this issue May 20, 2024 · 2 comments
Labels
  • Component-Plugins-Greader
  • Status-Not-Enough-Data: Ticket creator must append more precise info to the ticket.
  • Status-Not-Our-Bug: Bug is present in some upstream libraries used by RSS Guard.
  • Type-Defect: This is BUG!!!

Comments

@TheExceeder

Brief description of the issue

RSS Guard fetches new articles from my FreshRSS instance, but there are several feeds that I'm not able to fetch - I get an error (see the log) and the name of the feed turns red. This happens with feeds that contain a large number of articles (e.g., Hackaday, TechRadar).

This also happens when I have a group of several feeds and try to fetch them as a group - the fetch fails. However, when the feeds are fetched individually, it works just fine.

I believe this issue is not related to FreshRSS, as I'm using other clients connected to it and they are able to fetch the articles without any issue.
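To rule out RSS Guard itself, the same kind of request can be issued directly against the Google-Reader-compatible API that FreshRSS exposes. Below is a minimal Python sketch, not taken from RSS Guard: the instance URL, credentials and feed ID are placeholders, and the endpoint paths and parameters follow the GReader-style API and may differ per setup.

```python
import requests

BASE = "https://freshrss.example.com/api/greader.php"  # placeholder FreshRSS API URL
USER, API_PASSWORD = "user", "api-password"            # placeholder credentials

# Obtain an auth token via the GReader-style ClientLogin endpoint.
login = requests.post(f"{BASE}/accounts/ClientLogin",
                      data={"Email": USER, "Passwd": API_PASSWORD})
login.raise_for_status()
auth = dict(line.split("=", 1) for line in login.text.splitlines())["Auth"]
headers = {"Authorization": f"GoogleLogin auth={auth}"}

# Ask for a large batch of item IDs from one of the failing feeds (e.g. feed/9).
ids = requests.get(f"{BASE}/reader/api/0/stream/items/ids",
                   headers=headers,
                   params={"s": "feed/9", "n": 1000, "output": "json"})
print("stream/items/ids ->", ids.status_code)

# Request the full contents of all those items in a single POST, which is
# roughly what the failing request in the log below does.
item_ids = [ref["id"] for ref in ids.json().get("itemRefs", [])]
contents = requests.post(f"{BASE}/reader/api/0/stream/items/contents",
                         headers=headers,
                         data=[("i", i) for i in item_ids])
print("stream/items/contents ->", contents.status_code)  # a 500 here would match the log
```

If such a direct request also fails with HTTP 500, the problem is reproducible outside RSS Guard.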

How to reproduce the bug?

  1. Connect RSS Guard to a FreshRSS server.
  2. Add the TechRadar feed.
  3. Wait for it to contain several thousand articles.
  4. RSS Guard won't be able to fetch it anymore.

What was the expected result?

Fetching of all articles without an error. Alternatively, more detailed information on what failed and how to fix it.

What actually happened?

See the description of the issue.

Debug log

time="326411.388" type="debug" -> feed-downloader: Starting feed updates from worker in thread '0x64bc'.
time="326411.388" type="debug" -> feed-downloader: Synchronizing cache back to server on thread '0x64bc'.
time="326411.388" type="debug" -> feed-downloader: All caches synchronized.
time="326411.388" type="debug" -> database: SQLite connection 'db_connection_25788' is already active.
time="326411.388" type="debug" -> database: SQLite database connection 'db_connection_25788' to file 'file::memory:' seems to be established.
time="326411.698" type="debug" -> core: Delaying scheduled feed auto-download for some time since window is focused and updates while focused are disabled by the user and all account caches are empty.
time="326411.730" type="debug" -> greader: Percentage of feeds for fetching: '1.51515'.
time="326411.730" type="debug" -> network: Settings of BaseNetworkAccessManager loaded.
time="326411.990" type="debug" -> network: Destroying Downloader instance.
time="326411.990" type="debug" -> network: Destroying SilentNetworkAccessManager instance.
time="326411.995" type="warning" -> greader: Performing feed-based contents fetching.
time="326411.996" type="debug" -> network: Settings of BaseNetworkAccessManager loaded.
time="326412.404" type="debug" -> network: Destroying Downloader instance.
time="326412.404" type="debug" -> network: Destroying SilentNetworkAccessManager instance.
time="326412.452" type="debug" -> feed-downloader: Downloading new messages for feed ID 'feed/9' URL: 'https://hackaday.com/' title: 'Blog – Hackaday' in thread '25956'.
time="326412.452" type="debug" -> database: SQLite connection 'db_connection_25956' is already active.
time="326412.452" type="debug" -> database: SQLite database connection 'db_connection_25956' to file 'file::memory:' seems to be established.
time="326412.452" type="debug" -> network: Settings of BaseNetworkAccessManager loaded.
time="326413.866" type="debug" -> network: Destroying Downloader instance.
time="326413.866" type="debug" -> network: Destroying SilentNetworkAccessManager instance.
time="326413.878" type="debug" -> network: Settings of BaseNetworkAccessManager loaded.
time="326413.946" type="debug" -> network: Destroying Downloader instance.
time="326413.946" type="debug" -> network: Destroying SilentNetworkAccessManager instance.
time="326413.972" type="debug" -> network: Settings of BaseNetworkAccessManager loaded.
time="326418.618" type="debug" -> network: Destroying Downloader instance.
time="326418.618" type="debug" -> network: Destroying SilentNetworkAccessManager instance.
time="326418.625" type="critical" -> greader: Cannot download messages for QList(tag:google.com,2005:reader/item/0005ef11407c7ed4, tag:google.com,2005:reader/item/0005feed71c8003d, tag:google.com,2005:reader/item/0005dd946c7ae1eb, tag:google.com,2005:reader/item/00060bb395017d65, ... (excluding several hundreds of list items) ), network error: 'QNetworkReply::InternalServerError'.
time="326418.627" type="critical" -> network: Unknown error when fetching feed:message: 'unknown error (InternalServerError)'.
time="326418.627" type="debug" -> feed-downloader: Made progress in feed updates, total feeds count 1/1 (id of feed is 23).
time="326418.788" type="debug" -> feed-downloader: Finished feed updates in thread '0x64bc'.
time="326418.803" type="debug" -> CTRL is NOT pressed while sorting articles - sorting with standard mode.
time="326418.813" type="debug" -> message-model: Repopulated model, SQL statement is now:
'SELECT Messages.id, Messages.is_read, Messages.is_important, Messages.is_deleted, Messages.is_pdeleted, Messages.feed, Messages.title, Messages.url, Messages.author, Messages.date_created, Messages.contents, Messages.enclosures, Messages.score, Messages.account_id, Messages.custom_id, Messages.custom_hash, Feeds.title, Feeds.is_rtl, CASE WHEN LENGTH(Messages.enclosures) > 10 THEN 'true' ELSE 'false' END AS has_enclosures, (SELECT GROUP_CONCAT(Labels.name) FROM Labels WHERE Messages.labels LIKE "%." || Labels.custom_id || ".%") as msg_labels, Messages.labels FROM Messages LEFT JOIN Feeds ON Messages.feed = Feeds.custom_id AND Messages.account_id = Feeds.account_id WHERE Feeds.custom_id IN ('feed/9') AND Messages.is_deleted = 0 AND Messages.is_pdeleted = 0 AND Messages.account_id = 1 ORDER BY Messages.date_created DESC, Messages.id DESC;'.
time="326418.813" type="debug" -> gui: Reloading of msg selections took 11 miliseconds.
time="326421.690" type="debug" -> core: Delaying scheduled feed auto-download for some time since window is focused and updates while focused are disabled by the user and all account caches are empty.
time="326431.691" type="debug" -> core: Delaying scheduled feed auto-download for some time since window is focused and updates while focused are disabled by the user and all account caches are empty.

Operating system and version

  • OS: Windows 11 Version 23H2 (but the issue was also present on various builds of Windows 10)
  • RSS Guard version: 4.7.0 (built on Windows/AMD64) (the issue has been present for quite some time - probably since I started using this app ~a year ago)
@martinrotter
Owner

martinrotter commented May 20, 2024

Your steps to reproduce are waaaay too generic.

Add a precise (VERY precise) set of steps to reproduce - exactly what button is clicked, where, how the feed is added, all of the steps, please. Also post a screenshot of what your FreshRSS account setup in the RSS Guard dialog looks like.

@martinrotter
Owner

Follow-up: there are HTTP 500 errors. You have to deal with this upstream with the FreshRSS devs, as their API is returning the error - it will be visible in their logs.

martinrotter added the Status-Not-Enough-Data, Status-Not-Our-Bug and Component-Plugins-Greader labels on May 28, 2024