The program builder runs some checks to make sure that graphs are "valid":
https://github.com/ChrisCummins/ProGraML/blob/db8ae633ec02deabaa70e21b88b843dce3d05ba1/programl/graph/program_graph_builder.cc#L147-L164
These are nice for debugging and sanity-checking graphs that we have control over, but they aren't suitable for converting programs from "untrusted" sources. E.g., a compiler may choose