Oonai was first conceived as an extension to the chora data manager layer.
It can, however, also be used as standalone software, providing text analysis and classification services to your applications over HTTP(S).
Feed Oonai some text and it will allow you to:
To achieve this, Oonai relies on several popular packages: TextBlob (naive Bayes), spaCy (neural network architectures, see https://spacy.io/api/#nn-models), the Natural Language Toolkit (NLTK), Stanford CoreNLP and the excellent dbacl project (a digramic Bayesian classifier).
Some of Oonai's functions require temporary data files (for audio output, text samples and models) and will consume disk space. Some of these files can be generated from chora by loading the collections/setup_servetext method, which generates the dictionaries of your collections and datatypes (preferably without stop words).
Oonai services can be accessed through http (or https if you provide certificate files) with a very simple API reminiscent of RESTful:
—Parameters are passed with GET/POST and should be properly URL-encoded.
—Responses are returned as JSON.
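As a minimal client sketch: the snippet below URL-encodes a parameter and builds a request URL. The method name "classify", the "text" parameter and the port 8880 are illustrative assumptions; substitute a real method from the Oonai functions document and your own host/port.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# "classify" and the "text" parameter are placeholders, not confirmed
# Oonai method names; check the Oonai functions document.
base_url = "http://localhost:8880/classify"
query = urlencode({"text": "Feed Oonai with some text"})  # URL-encodes the parameters
url = base_url + "?" + query

# Uncomment once an Oonai instance is actually listening:
# response = json.loads(urlopen(url).read().decode("utf-8"))
print(url)
```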
It is usually easier to install Oonai on the same system that hosts chora or your own web application. If you want to use another host, such as dedicated hardware (an Nvidia Jetson Nano, for instance), you'll need to synchronize the collections/data files generated from chora into Oonai's installation directory. This can be accomplished through SSH, rsync or FTP, for instance; we can provide such tools. Of course, if you plan to use Oonai as standalone software, you'll need to generate those files by your own means (have a look at the provided samples to see how your data must be formatted).
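One way to script the synchronization step mentioned above is to wrap rsync. The source and destination paths below are illustrative assumptions, not the actual chora or Oonai layouts; adjust them to your installations.

```python
import subprocess

# Hypothetical paths; adapt to your chora and Oonai installation directories.
src = "/usr/local/www/chora/collections/"
dest = "user@jetson:/usr/local/www/webapp/binaries/servetext/collections/"

# -a preserves attributes, -z compresses in transit, --delete mirrors removals.
cmd = ["rsync", "-az", "--delete", src, dest]
# subprocess.run(cmd, check=True)  # requires SSH access to the target host
print(" ".join(cmd))
```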
Install Python 3.6+ with support for sqlite3, pyaudio and pocketsphinx.
For instance, on a FreeBSD system you may use:
pkg install python36 py36-sqlite3 py36-pyaudio pocketsphinx
For the rest of the dependencies, you'll need pip (more precisely, pip3 for Python 3.x).
python3 -m pip install -r requirements.txt
Note: Oonai has its own standalone HTTP layer to serve files: web.py 0.40.dev1 (or the latest release for Python 3) will be installed.
Optionally, compile servetext.py for quick launch:
python3 -m py_compile /usr/local/www/webapp/binaries/servetext/servetext.py
Download, compile and install the binaries of the dbacl project (see the included documentation).
Generate the appropriate data files from chora by loading http://path/to/chora/collections/setup_servetext in your browser.
For detailed instructions, check the README file.
Start Oonai in a terminal with:
/usr/local/www/webapp/binaries/servetext/servetext.py 8880 (use the full path!)
or, to detach Oonai from the terminal and keep it running in the background, you may use screen:
screen -d -m /usr/local/www/webapp/binaries/servetext/servetext.py 8880
Calling one of the available methods is as simple as pointing your browser to the appropriate URL.
Here are some samples:
For a comprehensive list of Oonai functions, have a look at the Oonai functions document.
Simply kill the Oonai parent process (retrieve its PID with pgrep -n -f servetext.py) or execute the provided script:
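The provided script is not reproduced here; as a rough sketch of the same pgrep/kill sequence (the helper name newest_pid is mine, not part of Oonai):

```python
import os
import signal
import subprocess

def newest_pid(pattern="servetext.py"):
    """Return the PID of the newest process matching pattern, or None.

    Mirrors: kill $(pgrep -n -f servetext.py)
    """
    out = subprocess.run(["pgrep", "-n", "-f", pattern],
                         stdout=subprocess.PIPE).stdout.decode().strip()
    return int(out) if out else None

# pid = newest_pid()
# if pid is not None:
#     os.kill(pid, signal.SIGTERM)  # ask Oonai to shut down
```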
This document was published on 2019-07-18 12:18:39. (Last updated: 2020-02-29 08:08:39.)