Maximum entries in Bookpedia?

Hi!
I read in the FAQs that people have been using Bookpedia with more than 13,000-ish books in the system and it works fine.
But I wonder if there is an upper limit?
Will it handle 250,000 or 2,500,000 entries? Is there any limitation in SQLite? Or does it get sluggish at, say, 200,000 entries?
Re: Maximum entries in Bookpedia?
SQLite has a limit of 18,446,744,073,709,551,616 (2^64) rows in total. Even though we don't store the image in the database, each row (book) holds enough data that you would hit the maximum database size before the row-count limit. That maximum works out to about 140,000 gigabytes, so you're likely to run out of available disk space before you ever reach it. That is the theory; usability is another story.
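To put those figures in perspective, here is a quick back-of-the-envelope sketch in Python of where they come from, using SQLite's documented maximum page size and the older maximum page count (these are SQLite constants, not anything specific to Bookpedia):

# Back-of-the-envelope check on the SQLite limits mentioned above.
# These constants are SQLite's documented maximums (the ~140,000 GB figure
# corresponds to the older 2^31 - 1 page ceiling); they are not Bookpedia settings.
MAX_PAGE_SIZE = 65536        # largest page size SQLite allows, in bytes
MAX_PAGE_COUNT = 2**31 - 1   # maximum number of pages in one database file
MAX_ROWS = 2**64             # theoretical row limit

max_db_bytes = MAX_PAGE_SIZE * MAX_PAGE_COUNT
print(f"max database size ~ {max_db_bytes / 10**9:,.0f} GB")  # ~ 140,737 GB
print(f"max rows = {MAX_ROWS:,}")                             # 18,446,744,073,709,551,616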
Depending on how you're using Bookpedia, you will notice it getting sluggish at different sizes. We offload most of the work to SQL, but there are certain things we have to bring into memory to handle. Prices are the main example: SQL knows nothing about decimals, so we do those calculations in memory. Even with 10,000 books, though, that is still only a few tens of megabytes. The operations that take the longest are smart collections (they have to look at every record to find out which ones match the collection rules), so that feature is where you will notice the slowdown first (along with gathering statistics for the entire library). With many smart collections I would place the "still happy and having fun with Bookpedia" limit at about 20,000 books, and without them at about 50,000. I would like to think you could go beyond that, but no one ever has. If someone has, they have not returned to tell the tale of what lies beyond in the hundreds of thousands.
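Just to illustrate why smart collections scale with library size, here is a rough sketch of the kind of per-record filtering described above. The fields and rules are invented for the example; they are not Bookpedia's actual schema or code:

from dataclasses import dataclass

# Illustrative only: invented fields and rules, not Bookpedia's real data model.
@dataclass
class Book:
    title: str
    price: float  # prices are handled in memory, since SQLite has no decimal type

def smart_collection(books, rules):
    """A book belongs to the collection if it matches every rule,
    so every record in the library has to be examined."""
    return [b for b in books if all(rule(b) for rule in rules)]

library = [Book(f"Book {i}", i * 0.50) for i in range(50_000)]
under_ten = smart_collection(library, [lambda b: b.price < 10.00])
print(len(under_ten))  # 20 -- but all 50,000 records were scanned to find them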
Actually, there is one limit hard-coded by us: the moderator collections' internal IDs start at 1,000,000,000, to keep them apart from regular entries. The billion-and-first book added would look to Bookpedia as though it were in a Doghouse moderator collection.