The question may seem odd at first, but having worked with SWI-Prolog for some time, I know that currently all of the data used by Prolog queries needs to be loaded into memory. I have a need that could eventually require access to more data than can be loaded into memory on a standard PC with 16GB of RAM; even going to 128GB may not be enough. I am not concerned with how long the queries take, so paying the price in time to access the data from hard drives is not an issue. Does Logica work like a normal SQL database, where the data can reside on hard drives and is only accessed as needed? Or does all of the data need to be loaded into memory before a query can be run? Or something else?
Thanks for your question! That's correct: Logica delegates execution of the program to a database engine, which means the data can be stored on disk and is accessed as needed.

BigQuery, which is currently the only fully supported Logica back-end, is Google's data warehouse, capable of executing heavy queries over datasets of hundreds of terabytes. In that case the data is stored in a distributed file system. A reasonably complex logic program over a dataset of under 1 terabyte is likely to run in under a minute. If you want to try Logica on datasets of a few gigabytes, then Google BigQuery's free quota of 1 TB would be sufficient for quite a few experimental runs.

A lot of functionality is already supported for PostgreSQL, which makes it possible to run queries on a standard PC. In that case the data would be stored on the local disk and accessed as needed as well. Please let me know if you have further questions.
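To make the delegation concrete, here is a minimal sketch of what such a program might look like. The `@Engine` annotation selects the back-end; the table name `employee` and its columns are hypothetical, assumed to already exist in the database, and the exact predicate syntax should be checked against the Logica documentation:

```
# Sketch only: assumes a PostgreSQL database containing a table
# employee(name, salary). The engine scans the table from disk as
# needed; Logica itself does not load the data into Python memory.
@Engine("psql");

# Rows of `employee` with salary above a threshold.
HighEarner(name:) :-
  employee(name:, salary:),
  salary > 100000;
```

The key point is that this compiles to SQL executed by PostgreSQL (or BigQuery), so memory behavior is whatever the database engine provides, not an in-memory fact store as in SWI-Prolog.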