Lucida: The Truly Personal Assistant
About a year back I wrote a small bit about Sirius, the open source intelligent personal assistant. The Sirius project has evolved and undergone a name change to Lucida, no doubt due to the extreme similarity to the name of a certain fruit-based company’s personal assistant. The major draw of the Lucida / Sirius project is the opportunity to have an AI assistant that doesn’t rely on “the cloud,” and thereby avoids the intrinsic requirement of entrusting a slew of your information (geographic location, date, time, search data, contextual and related searches, etc.) to a third party.
I’ve long been, ironically, reluctant to entrust the kind of information these services require to an outside agency that stands to profit from selling said information to advertisers, or from using it internally to figure out how to extract yet more data from me or sell me its own goods and services. I say it’s ironic for the express reason that I work in the IT field and advocate for advancements in AI technology whenever I can.
If you happen to listen to the Citizens of Tech podcast (which I co-host with my good friend Ethan Banks) you have no doubt heard us extol the awesomeness of the latest advances in artificial intelligence.
Where I diverge from the “all the AI, any way we can get it!” mentality is that I would much prefer to use my own private cloud to power the back end of said artificial intelligence, and this is where Lucida opens doors of opportunity for just such advancements. You may or may not recall that I went so far as to set up my own private cloud storage service quite some time ago – it still meets all my needs and keeps my data strictly, well, mine. I am a major advocate of owning your own data whenever that is feasible. If that costs me some initial setup overhead, I’m A-OK with that.
Roll your own Lucida instance
You can find the new site at lucida.ai, where there are now instructions on setting up Lucida inside a VirtualBox VM – simply download VirtualBox, download the Djinn vbox file, configure the virtual network ports, and fire up the Docker session.
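For the curious, those steps roughly map to a handful of commands. This is just a hypothetical sketch: the VM name, appliance filename, image name, and port numbers below are placeholders of my own, not from the Lucida docs (the script only echoes each command so you can see the shape of the workflow without actually running VirtualBox or Docker).

```shell
#!/usr/bin/env sh
# Sketch of the VirtualBox + Docker workflow described above.
# All names and ports are illustrative placeholders.
VM_NAME="lucida"
APPLIANCE="$HOME/Downloads/lucida.ova"   # the downloaded Djinn vbox/appliance file
WEB_PORT=3000

# Dry-run helper: print each step instead of executing it.
run() { echo "+ $*"; }

# 1. Import the downloaded appliance into VirtualBox
run VBoxManage import "$APPLIANCE" --vsys 0 --vmname "$VM_NAME"

# 2. Forward a host port to the guest so the web front end is reachable
run VBoxManage modifyvm "$VM_NAME" --natpf1 "lucida-web,tcp,,$WEB_PORT,,$WEB_PORT"

# 3. Boot the VM headless
run VBoxManage startvm "$VM_NAME" --type headless

# 4. Inside the guest, start the Docker session that hosts the services
run docker run -d -p "$WEB_PORT:$WEB_PORT" lucida/lucida
```

Drop the `run` wrapper to execute the commands for real, assuming VirtualBox’s `VBoxManage` CLI and Docker are installed on the respective machines.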
This certainly makes taking the AI “for a spin” much simpler than having to compile the code and set up all the software prerequisites on a dedicated Linux box (or VM).
This has definitely been added to my to-do project list, and when I do get around to it I’ll be certain to post my findings here. In the meantime, if you beat me to it, please let me know how it went!