These scripts can be used to inspect matchings generated by the map matching plugin for OSRM.
There are two ways to install the server-side component: either build osrm-backend and node-osrm from source, or use binary packages of a special node-osrm version.
npm install osrm
This downloads a binary package if one is available for your platform.
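A quick sanity check that the bindings can be loaded (the log message is just an example):
node -e "require('osrm'); console.log('node-osrm loaded')"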
Compile OSRM with debug support:
git clone https://github.com/Project-OSRM/osrm-backend.git
cd osrm-backend
mkdir -p build
cd build
cmake -DENABLE_JSON_LOGGING=1 ..
make && sudo make install
Run:
npm install osrm --build-from-source
to download and compile node-osrm from source.
Run:
bower install && make
to install the front-end components.
To import traces in GPX or CSV format from a folder named data into the labeling database, run:
node bin/server.js data
This creates a file data/classification_db.sqlite that contains a list of all traces and their classification.
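To peek at the imported data you can open the database with the standard sqlite3 client; the table layout is defined by bin/server.js, so list it instead of assuming column names:
sqlite3 data/classification_db.sqlite '.tables'
sqlite3 data/classification_db.sqlite '.schema'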
Assuming your GPX traces are contained in a folder named data in the repository root, run the server locally with node-osrm:
node bin/server.js data path/to/dataset.osrm
Alternatively, if you want to use osrm-routed instead of node-osrm, just run:
node bin/server.js data
This expects an osrm-routed server listening on http://127.0.0.1:5000.
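If you still need a preprocessed dataset and a running server, a minimal sketch looks like this; the extract name map.osm.pbf and the car profile path are assumptions, and depending on your OSRM version the second step is called osrm-prepare or osrm-contract:
# produces map.osrm next to the extract
osrm-extract map.osm.pbf -p profiles/car.lua
# contraction step; called osrm-contract in newer OSRM releases
osrm-prepare map.osrm
# serves HTTP on port 5000 by default
osrm-routed map.osrm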
Now you can view the frontend at http://127.0.0.1:8337 in your browser. It shows an interactive trellis diagram of the matching; select a state pair to view the transition probabilities and Viterbi values.
You can use the left and right arrow keys to cycle through the traces.
Opening http://127.0.0.1:8337/classify.html will display a minimal interface for easy classification.
Pressing 0 classifies the current trace as unknown, 1 as valid, and 2 as invalid.
The labels are saved in classification_db.sqlite, which bin/test_classification.js uses to verify the classifier implemented inside the OSRM plugin.
bin/test_classification.js will also generate tested_db.json, which bin/calibrate_classification.py needs to derive better classification values.
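The exact arguments are defined by the scripts themselves; the invocation below is only an assumed sketch, so check each script's source before relying on it:
# assumed invocations; check the script sources for the actual arguments
node bin/test_classification.js data
python bin/calibrate_classification.py tested_db.json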
You can batch match a dataset using bin/traces2geojson.js data > matched.geojson, which produces a GeoJSON file containing the following features:
For every sub-matching:
{
  "type": "Feature",
  "geometry": {
    "type": "LineString",
    "coordinates": [[lon, lat], ...]
  },
  "properties": {
    "type": "matching",
    "file": "path/to/file",
    "confidence": 0.5  # in [0, 1] -> 1 means very confident, 0 means no confidence
  }
}
For every trace fragment:
{
  "type": "Feature",
  "geometry": {
    "type": "LineString",
    "coordinates": [[lon, lat], ...]
  },
  "properties": {
    "type": "trace",
    "file": "path/to/file",
    "confidence": 0.5  # confidence of the corresponding sub-trace
  }
}
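As a small example of working with this output, the sketch below reads matched.geojson with plain Node.js and prints low-confidence matchings; it assumes the file is a standard FeatureCollection, and the 0.5 threshold is an arbitrary example value:
// list matchings whose confidence falls below an example threshold
const fs = require('fs');

const threshold = 0.5;
const collection = JSON.parse(fs.readFileSync('matched.geojson', 'utf8'));

collection.features
  .filter((feature) => feature.properties.type === 'matching' &&
                       feature.properties.confidence < threshold)
  .forEach((feature) => {
    console.log(feature.properties.file, feature.properties.confidence);
  });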