Aqqu
Description
Question answering from Freebase as described in the CIKM 2015 publication (http://ad-publications.informatik.uni-freiburg.de/CIKM_freebase_qa_BH_2015.pdf). The code below also contains some improvements (neural network, performance) that came after the publication. The public code contains a README that describes how to download, install, train and run the system. The sections below describe how to set up the demo (for which the code is not public).
Code
Public GitHub repository: https://github.com/elmar-haussmann/aqqu .
Internal git repository (contains work after publication, mainly neural net and performance improvements): https://bitbucket.org/elmar-haussmann/aqqu .
Internal git repository for the web-UI (we didn't put that public): https://bitbucket.org/elmar-haussmann/aqqu-webserver .
Demo
Aqqu instance
2016-06-30: runs under http://metropolis.informatik.uni-freiburg.de:5455
Start as follows on metropolis:
{{{
ssh metropolis
sudo su haussmae
cd /home/haussmae/demos/aqqu-demo
source activate aqqu
PYTHONPATH=$(pwd):$PYTHONPATH python webserver/translation_webserver.py
}}}
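To quickly check that the demo is up, a plain HTTP GET against the URL above should return the web UI page. A minimal sketch using only the Python standard library (just a reachability check; the question-answering endpoints of the web UI are not documented here):
{{{
# Check that the Aqqu demo webserver answers HTTP requests.
import urllib.request

BASE_URL = "http://metropolis.informatik.uni-freiburg.de:5455"

with urllib.request.urlopen(BASE_URL, timeout=10) as response:
    # A 200 status and a non-empty body indicate the demo is running.
    print(response.status, len(response.read()), "bytes")
}}}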
Virtuoso instance
2016-06-30: The Virtuoso instance for Aqqu runs under http://metropolis.informatik.uni-freiburg.de:9000/sparql .
Start Virtuoso as follows. The server listens on port 8999. (Alternatively, follow the instructions in QUICKSTART.md to download the instance, install Virtuoso and start it.)
{{{
ssh metropolis
sudo su haussmae
cd /home/haussmae/keyword-translation
make start-virtuoso-mini-cai
}}}
Start the HTTP proxy as follows (kill any old process manually; it listens on port 9000):
{{{
ssh metropolis
sudo su haussmae
cd /home/haussmae/keyword-translation
make start-varnish
}}}
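A quick way to verify the endpoint from Python is to send a small SPARQL query over HTTP. The sketch below uses the standard SPARQL protocol (query string in the query parameter, JSON results via the Accept header); the example query itself is only an illustration:
{{{
# Send a small test query to the Virtuoso SPARQL endpoint behind the proxy on port 9000.
import json
import urllib.parse
import urllib.request

ENDPOINT = "http://metropolis.informatik.uni-freiburg.de:9000/sparql"
QUERY = """
PREFIX fb: <http://rdf.freebase.com/ns/>
SELECT ?name WHERE {
  ?f fb:type.object.type fb:film.film .
  ?f fb:type.object.name ?name .
  FILTER(lang(?name) = 'en')
} LIMIT 5
"""

url = ENDPOINT + "?" + urllib.parse.urlencode({"query": QUERY})
request = urllib.request.Request(url, headers={"Accept": "application/sparql-results+json"})
with urllib.request.urlopen(request, timeout=30) as response:
    results = json.load(response)

for binding in results["results"]["bindings"]:
    print(binding["name"]["value"])
}}}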
Parser
Aqqu uses a parser to get part-of-speech tags of query words. The parser is accessed via HTTP API calls. To start the parser server:
{{{
ssh metropolis
sudo su haussmae
cd /home/haussmae/keyword-translation
make start-parser
}}}
The port is configured in corenlp-frontent/build.xml. The API can be accessed like this: http://metropolis.informatik.uni-freiburg.de:4000/parse/?text=This%20is%20a%20test%20sentence.
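From Python, the same API call can be made as follows (a minimal sketch; the response format is whatever the CoreNLP frontend returns, so it is simply printed verbatim):
{{{
# Call the part-of-speech parser server via its HTTP API.
import urllib.parse
import urllib.request

PARSER_URL = "http://metropolis.informatik.uni-freiburg.de:4000/parse/"

sentence = "This is a test sentence."
url = PARSER_URL + "?" + urllib.parse.urlencode({"text": sentence})
with urllib.request.urlopen(url, timeout=10) as response:
    # The exact response format depends on the CoreNLP frontend; print it as-is.
    print(response.read().decode("utf-8"))
}}}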
Run the new Aqqu version (with NN)
Start on titan (requires GPU):
{{{
ssh titan
sudo su haussmae
cd /home/haussmae/aqqu-bitbucket
source activate aqqu
PYTHONPATH=$(pwd):$PYTHONPATH python webserver/translation_webserver.py
}}}
It now runs on port 5454 on titan. However, titan is not reachable from outside the university network.
To start a port-forwarding from metropolis:
{{{
ssh metropolis
sudo su haussmae
cd /home/haussmae/temp/nc/python-port-forwardt
python2 port-forward.py
}}}
The service is then available at metropolis:5454.
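The port-forward.py script itself is not shown here. As an illustration only (not the actual script, and written for Python 3), a minimal forwarder that listens on metropolis:5454 and relays each connection to titan:5454 could look like this:
{{{
# Minimal TCP port forwarder: accept connections on LISTEN_PORT and relay
# all traffic to the Aqqu server on titan. Illustration only, not the real script.
import socket
import threading

LISTEN_PORT = 5454
TARGET_HOST, TARGET_PORT = "titan", 5454

def pipe(src, dst):
    # Copy bytes from src to dst until one side closes the connection.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def handle(client):
    server = socket.create_connection((TARGET_HOST, TARGET_PORT))
    threading.Thread(target=pipe, args=(client, server), daemon=True).start()
    threading.Thread(target=pipe, args=(server, client), daemon=True).start()

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("", LISTEN_PORT))
listener.listen(5)
while True:
    conn, _ = listener.accept()
    handle(conn)
}}}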
How to update (any) Virtuoso with custom data
Grant access rights via the ISQL tool as follows:
{{{
data/virtuoso/install/bin/isql localhost:1112 dba dba
grant execute on SPARQL_INSERT_DICT_CONTENT to SPARQL_UPDATE;
grant execute on SPARQL_INSERT_DICT_CONTENT to "SPARQL";
grant execute on SPARQL_DELETE_DICT_CONTENT to SPARQL_UPDATE;
grant execute on SPARQL_DELETE_DICT_CONTENT to "SPARQL";
}}}
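Once these rights are granted, custom triples can be inserted through the SPARQL endpoint with a SPARQL/Update INSERT. A minimal sketch; the target graph IRI and the triple below are placeholders, and whether the endpoint accepts unauthenticated updates depends on the Virtuoso configuration:
{{{
# Insert a placeholder triple via SPARQL/Update (assumes updates are allowed
# through the endpoint after the grants above).
import urllib.parse
import urllib.request

ENDPOINT = "http://metropolis.informatik.uni-freiburg.de:9000/sparql"
UPDATE = """
PREFIX ex: <http://example.org/>
INSERT DATA {
  GRAPH <http://example.org/custom> { ex:subject ex:predicate "custom value" }
}
"""

data = urllib.parse.urlencode({"query": UPDATE}).encode("utf-8")
request = urllib.request.Request(ENDPOINT, data=data)
with urllib.request.urlopen(request, timeout=30) as response:
    print(response.status, response.read().decode("utf-8")[:200])
}}}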
Complex example SPARQL query from "Programmieren in C++, SS 2016, Ü10" (all action or animation movies with their release date, genre, director, production company, and rating):
{{{
PREFIX fb: <http://rdf.freebase.com/ns/>
SELECT DISTINCT ?fn, ?y, ?gn, ?dn, ?pn, ?rn where {
  ?f fb:type.object.type fb:film.film .
  ?f fb:film.film.initial_release_date ?y .
  ?f fb:film.film.genre ?g .
  ?f fb:film.film.directed_by ?d .
  ?f fb:film.film.production_companies ?p .
  ?f fb:film.film.rating ?r .
  ?f fb:type.object.name ?fn .
  ?g fb:type.object.name ?gn .
  ?d fb:type.object.name ?dn .
  ?p fb:type.object.name ?pn .
  ?r fb:type.object.name ?rn
  FILTER(lang(?fn)='en')
  FILTER(lang(?gn)='en')
  FILTER(lang(?dn)='en')
  FILTER(lang(?pn)='en')
  FILTER(lang(?rn)='en')
  FILTER(?gn='Action Film'@en OR ?gn='Animation'@en)
}
}}}
Data
All of the required data to run Aqqu is part of the materials (see above). It is located in the data subfolder. The scripts to create this data are part of the (old) keyword-translation repository: https://bitbucket.org/onekonek/keyword-translation