SQLite

Questions & Answers about Helium Scraper 3
BalooBear
Posts: 4
Joined: Thu Sep 12, 2013 1:28 pm

SQLite

Post by BalooBear » Fri May 15, 2020 10:04 am

Finally getting to grips with the new program and I think it's great. :D

My scrape process can take a long time, and during that time I can encounter OS/Internet errors outside the control of the running HSS script.

Sometimes, if the script crashes, I lose all the data scraped so far (ouch :x ). I also want to process some of the data while the scrape is still running, but when I examine the SQLite file, the latest data has not been committed yet.

Is there a way to "flush" the data at regular intervals or programmatically? Currently I have to stop the scrape and save the file, and only then does the journal file commit the data (I could be wrong on this).
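
For what it's worth, rows that have already been committed to the .db file can be read from a separate process without stopping the scrape; it only misses whatever is still sitting in the journal. A minimal sketch in Python, assuming the project's SQLite file is called scrape_project.db and the extracted rows live in a table named "Results.Items" (both names are made up for illustration), might look like this:

import sqlite3

# Open the database read-only so we never interfere with the scraper's
# own writes (mode=ro requires the URI form of connect).
conn = sqlite3.connect("file:scrape_project.db?mode=ro", uri=True)

# Double-quote the dotted table name so SQLite treats it as one identifier.
for row in conn.execute('SELECT * FROM "Results.Items"'):
    print(row)

conn.close()

This only shows data that has actually been committed, which is exactly the limitation described above.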

Also, another very minor problem you may know a workaround for: Access doesn't like "." in table names, so I cannot directly link to or import from them. My current workaround is to create another table in SQLite with a name Access can handle, but it would be great if I could control the naming convention of the tables at creation time or by some other means.
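
As an illustration of that workaround, a rough sketch (again in Python, using the same made-up file and table names as above) that copies the dotted table into one Access can link to directly:

import sqlite3

conn = sqlite3.connect("scrape_project.db")

# Make (or refresh) a copy of the dotted table under a dot-free name,
# which Access can then link to or import without complaining.
conn.executescript('''
    DROP TABLE IF EXISTS Results_Items;
    CREATE TABLE Results_Items AS SELECT * FROM "Results.Items";
''')
conn.close()

Running this while the scrape is writing to the same file could hit a lock, so it is best done between runs or on a copy of the database.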

Looking forward to your response.

webmaster
Site Admin
Posts: 521
Joined: Mon Dec 06, 2010 8:39 am

Re: SQLite

Post by webmaster » Tue May 19, 2020 10:31 pm

There's no way to automatically flush the data, but you can save the project with File -> Save while it's still running without having to stop the extraction.

Regarding the dot, I'm not 100% sure about this, but I think you can use brackets like "[Some.Thing]" in Access. In any case, it would make sense to be able to use a different separator. I'll think about it and see how feasible this option is.
Juan Soldi
The Helium Scraper Team

BalooBear
Posts: 4
Joined: Thu Sep 12, 2013 1:28 pm

Re: SQLite

Post by BalooBear » Wed May 20, 2020 9:59 am

Thanks...
