Add station names to codes mapping #18
Conversation
Sorry for the comment, I didn't notice it actually. Will test and tell you.
But won't making such a large dict/JSON cause a performance issue?
There are 3 common persistence solutions in Python that can be used: json, pickle, and sqlite3. I browsed a number of posts and checked the performance statistics over the years. This StackOverflow answer is the latest one for the question of pickle vs. json performance. It seems that json used to be faster, but more recent pickle protocols appear to have closed the gap. Now, whether to use json/pickle or sqlite3 depends on how the data will be queried. If whatever "fuzzy search" is planned to be implemented can be done with a simple SQL query, then I can update the PR with a suitable file, depending on your choice.
I think sqlite3 will be the better choice; otherwise, if the entire dict is loaded as an object, it will affect performance, I think. Also, can you join this group for more discussions with the other members? https://t.me/RailgadiBotDiscussions
Add a SQLite database with a table `station_codes`
Refers #17
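For context, a minimal sketch of how such a database could be built from the mapping; the file name, column names, and the two sample entries are assumptions, not necessarily what the PR uses:

```python
import sqlite3

# Tiny excerpt of the 711-entry mapping, for illustration only.
STATIONS = {"LKO": "Lucknow", "NDLS": "New Delhi"}

conn = sqlite3.connect("stations.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS station_codes (code TEXT PRIMARY KEY, name TEXT)"
)
conn.executemany(
    "INSERT OR REPLACE INTO station_codes (code, name) VALUES (?, ?)",
    STATIONS.items(),
)
conn.commit()
conn.close()
```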
Updated the PR. Switched from the JSON file to using a SQLite database. Documentation for Python 3's sqlite3 module: https://docs.python.org/3/library/sqlite3.html
The added advantage of using a SQLite database is that we will be able to filter the station names based on the current user's input, using wildcard characters and the LIKE operator. For example, if the user is typing "Lucknow", a prefix query returns every matching station (see the sketch below). This will be useful when there are many stations within the same city, as is the case with a lot of the larger ones.
@prinzpiuz can this be merged now?
It needs to be tested, and the bot also needs to be made inline; only then, I think, will this feature work. Everybody in the group is busy, I think.
Let me make the inline commands.
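A rough sketch of what those inline commands might look like with python-telegram-bot (a v13-style synchronous API is assumed here; the token, database file name, and helper function are placeholders, not the bot's actual code):

```python
import sqlite3

from telegram import InlineQueryResultArticle, InputTextMessageContent
from telegram.ext import InlineQueryHandler, Updater


def look_up_stations(text, limit=10):
    # Prefix match against the station_codes table sketched earlier.
    conn = sqlite3.connect("stations.db")
    rows = conn.execute(
        "SELECT code, name FROM station_codes WHERE name LIKE ? LIMIT ?",
        (text + "%", limit),
    ).fetchall()
    conn.close()
    return rows


def inline_station_search(update, context):
    query = update.inline_query.query.strip()
    if not query:
        return
    # Offer each matching station as an inline result; selecting one
    # sends its station code as the message.
    results = [
        InlineQueryResultArticle(
            id=code,
            title=f"{name} ({code})",
            input_message_content=InputTextMessageContent(code),
        )
        for code, name in look_up_stations(query)
    ]
    update.inline_query.answer(results)


updater = Updater("BOT_TOKEN")  # placeholder token
updater.dispatcher.add_handler(InlineQueryHandler(inline_station_search))
updater.start_polling()
```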
Refers #17
Station name to code mapping added, as per an official Indian Railways document here. It contains 711 stations in total.
While other sources list up to 8000 stations, those also seem to include goods stations where passenger trains don't usually stop. Nonetheless, I will scrape them and update the file in the future.