This library aims to be a drop-in replacement for the Python scraperwiki library
for use locally. That is, functions will work the same way, and data will go
into a local SQLite database; a targeted bombing of ScraperWiki's servers
will not stop this local library from working, unless you happen to be running
it on one of ScraperWiki's servers.
This will soon be on PyPI, but for now you can install it from the git repository.
Read the standard ScraperWiki Python library's documentation, then look below for some quirks about the local version.
The local library aims to be a drop-in replacement. In reality, the local version sometimes works better, though not all of the features have been implemented.
The local scraperwiki.sqlite is powered by
DumpTruck, so some things
work a bit differently.
Data is stored to a local sqlite database named scraperwiki.sqlite.
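Since the datastore is an ordinary SQLite file, you can open it directly with the standard library's sqlite3 module. A sketch (the swdata table is created by hand here, purely to show the file is plain SQLite):

```python
import sqlite3

# The local library writes to this file in the current working directory.
conn = sqlite3.connect('scraperwiki.sqlite')
conn.execute('CREATE TABLE IF NOT EXISTS swdata (id INTEGER, name TEXT)')
conn.commit()

# List the tables stored so far.
tables = [row[0] for row in
          conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")]
conn.close()
```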
Bizarre table and column names are supported.
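Quoting is what makes this possible: SQLite accepts nearly any identifier once it is double-quoted, as a sketch with the standard sqlite3 module shows:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
# Double-quoted identifiers may contain spaces, punctuation, even SQL comments.
conn.execute('CREATE TABLE "strange table!" ("column; with -- junk" TEXT)')
conn.execute('INSERT INTO "strange table!" VALUES (?)', ('works',))
value = conn.execute(
    'SELECT "column; with -- junk" FROM "strange table!"').fetchone()[0]
conn.close()
```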
scraperwiki.sqlite.execute returns an empty list of keys when a select
statement matches no rows.
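A sketch of that shape with the standard sqlite3 module (the {'keys': ..., 'data': ...} mapping follows the ScraperWiki execute convention; here it is built by hand):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE swdata (id INTEGER)')

cur = conn.execute('SELECT * FROM swdata WHERE 1 = 0')  # matches nothing
rows = cur.fetchall()
# sqlite3 still reports column names via cursor.description, but the
# local execute gives back no keys when there are no rows.
keys = [d[0] for d in cur.description] if rows else []
result = {'keys': keys, 'data': rows}
conn.close()
```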
scraperwiki.sqlite.attach downloads the whole datastore from ScraperWiki,
so you might not want to use it too often on large databases.
In general, features that have not been implemented raise a NotImplementedError.
scraperwiki.sqlite is missing the following features:
- All of the verbose keyword arguments (these control what is printed in the ScraperWiki code editor)
The UK geocoding helpers (scraperwiki.geo) documented on scraperwiki.com have been implemented. They partially depend on scraperwiki.com being available.
scraperwiki.utils is implemented, as well as the following functions.
- scraperwiki.log
- scraperwiki.scrape
- scraperwiki.pdftoxml
- scraperwiki.swimport
The following submodules are deprecated and thus will not be implemented:
- scraperwiki.apiwrapper
- scraperwiki.datastore
- scraperwiki.jsqlite
- scraperwiki.metadata
- scraperwiki.newsql
Run tests with ./runtests; this small wrapper cleans up after itself.