# raspar

A simple-to-use, Promise-based web scraper with local caching.
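
A minimal usage sketch, assuming the package is required as `raspar` and exposes `fetch(url, options)` as documented below (the export shape itself isn't shown in these docs, so treat it as an assumption):

```javascript
// Minimal sketch: fetch a page and cache it locally. The require() shape
// and option names follow the fetch() documentation below; adjust to the
// package's actual exports if they differ.
const raspar = require('raspar');

raspar
    .fetch('https://example.com', {
        cacheDirectory: 'temp/cache/', // where cached responses are stored
        ttl: 1800                      // cache entries expire after 30 minutes
    })
    .then((contents) => {
        console.log(contents);
    })
    .catch((err) => {
        console.error(err);
    });
```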

# fetch(url, options)

Request content from a URL, caching the response locally.

Parameters

| Name | Types | Description |
| --- | --- | --- |
| url | String | A URL string. |
| options | Object | Options object. |
| options.cacheDirectory | String | Directory to store cache. Default is `temp/cache/`. |
| options.requestOptions | Object | Custom request options object. Default is `{}`. |
| options.ttl | Number | TTL (time to live) in seconds. Default is `1800`. |

Returns

Object

Contents of request.
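
As a hedged example of the remaining option, `options.requestOptions` is a custom options object passed along with the request; the exact fields it accepts (such as the `headers` shown below) depend on the underlying HTTP client and are an assumption here:

```javascript
const raspar = require('raspar');

raspar
    .fetch('https://example.com/page.html', {
        requestOptions: {
            // Assumed to be forwarded as-is to the underlying HTTP client.
            headers: { 'User-Agent': 'raspar-example' }
        }
    })
    .then((contents) => {
        // A second call for the same URL within the default 1800 second TTL
        // should be answered from temp/cache/ instead of the network.
        console.log(contents);
    });
```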

# checkCacheExpiry(path, ttl)

Check whether a file has expired based on the given TTL.

Parameters

| Name | Types | Description |
| --- | --- | --- |
| path | String | File path to check. |
| ttl | Number | TTL (time to live) in seconds. |

Returns

Object

Promise
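
One way the described behaviour could be implemented (a sketch, not the package's actual source): compare the file's modification time against the TTL, resolving while the file is still fresh and rejecting once it has expired. The resolve/reject semantics are an assumption based on the description above.

```javascript
const fs = require('fs');

// Sketch of a TTL check: resolve if the file is younger than `ttl` seconds,
// reject if it is missing or has expired.
const checkCacheExpiry = (path, ttl) =>
    new Promise((resolve, reject) => {
        fs.stat(path, (err, stats) => {
            if (err) {
                return reject(err);
            }
            const ageInSeconds = (Date.now() - stats.mtimeMs) / 1000;
            return ageInSeconds <= ttl ? resolve(path) : reject(new Error('Cache expired.'));
        });
    });
```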

# generateUUID(content)

Generate a unique identifier from a string.

Parameters

| Name | Types | Description |
| --- | --- | --- |
| content | String | String from which to generate the unique identifier. |

Returns

String
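
One plausible way to derive such an identifier (shown for illustration only; the package may use a different scheme internally) is to hash the string, which gives a stable cache filename for each URL:

```javascript
const crypto = require('crypto');

// Derive a deterministic identifier from a string by hashing it.
const generateUUID = (content) =>
    crypto.createHash('sha1').update(content).digest('hex');

// e.g. a cache filename derived from the requested URL
console.log(generateUUID('https://example.com'));
```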

# readCache(path, ttl)

Read the cache from a file, but only if the cache hasn't expired.

Parameters

| Name | Types | Description |
| --- | --- | --- |
| path | String | File path to read cache from. |
| ttl | Number | TTL (time to live) in seconds. |

Returns

Object

Promise
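
A self-contained sketch of that behaviour (again, not the package's source): inline the same TTL check that `checkCacheExpiry` describes, and only read the file once it passes.

```javascript
const fs = require('fs');

// Sketch: read the cached file only while it is younger than `ttl` seconds.
const readCache = (path, ttl) =>
    new Promise((resolve, reject) => {
        fs.stat(path, (statErr, stats) => {
            if (statErr) {
                return reject(statErr);
            }
            // Treat the file as expired once it is older than the TTL.
            if ((Date.now() - stats.mtimeMs) / 1000 > ttl) {
                return reject(new Error('Cache expired.'));
            }
            fs.readFile(path, 'utf8', (readErr, contents) =>
                readErr ? reject(readErr) : resolve(contents)
            );
        });
    });
```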

# writeCache(path, content)

Write cache contents to a file, creating any directories that don't exist.

Parameters

| Name | Types | Description |
| --- | --- | --- |
| path | String | File path to store cache. |
| content | String | Contents of cache. |

Returns

Object

Promise
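
A sketch of how the described behaviour can be implemented (not necessarily the package's own code): create the parent directory tree with `fs.mkdir` and `recursive: true` (Node 10.12+), then write the contents. The parameter is named `filePath` here only to avoid shadowing the `path` module.

```javascript
const fs = require('fs');
const path = require('path');

// Sketch: ensure the cache directory exists, then write the contents.
const writeCache = (filePath, content) =>
    new Promise((resolve, reject) => {
        fs.mkdir(path.dirname(filePath), { recursive: true }, (mkdirErr) => {
            if (mkdirErr) {
                return reject(mkdirErr);
            }
            fs.writeFile(filePath, content, 'utf8', (writeErr) =>
                writeErr ? reject(writeErr) : resolve(content)
            );
        });
    });
```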