So back home we have something called Virtual Intelligences. Some of them handle search queries over the extranet, or act as tour guides. They even have personalities, but sadly aren't sentient. There's another version tied to armor upgrades, which basically lets your combat suit optimize your combat performance, or dispense medi-gel when it's needed. That tech obviously isn't here yet, but one of its precursors happens to be something called a conversational user interface (or, more easily pronounced: CUI).
Now you may ask, what exactly is that? Well, you might know it as one of two things: chatbots or voice assistants. That's right, every time you ask your phone something, you're one step closer to being able to ask a Virtual Assistant where a particular store is, or hey, why aren't there any real fish in the Citadel waters. Now, how a CUI actually works is really interesting. To understand the questions you give it, a CUI uses something called natural language processing (NLP), which more or less allows a bunch of code to understand, and create meaning from, human language. We all take asking our devices things for granted, but that's actually pretty cool. There are still a few hiccups, though. Human language is really ambiguous, so sometimes the CUI has trouble understanding us. But! It will get there.
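To get a feel for the idea, here's a toy sketch of the very first step a CUI has to take: turning your raw question into an "intent" the rest of the program can act on. Real NLP systems use statistical models; this is just keyword matching, and every intent name and trigger word here is made up for illustration.

```python
# Toy, rule-based sketch of intent classification -- NOT real NLP.
# All intent names and keyword sets below are invented examples.
INTENTS = {
    "find_store": {"where", "store", "shop", "buy"},
    "ask_population": {"population", "people", "live"},
    "small_talk": {"hello", "hi", "thanks"},
}

def tokenize(text):
    """Lowercase the text and strip punctuation into word tokens."""
    return [word.strip("?!.,'\"") for word in text.lower().split()]

def classify_intent(text):
    """Pick the intent whose keyword set overlaps the query the most."""
    tokens = set(tokenize(text))
    best, best_score = "unknown", 0
    for intent, keywords in INTENTS.items():
        score = len(tokens & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best

print(classify_intent("Where is the nearest grocery store?"))  # find_store
print(classify_intent("How many people live there?"))          # ask_population
```

The hiccup mentioned above shows up immediately: a query like "Where do most people shop?" matches keywords from two intents at once, which is exactly the kind of ambiguity that trips a CUI up.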
This is already happening: there's a shift towards natural-language understanding (NLU), which gives a CUI things like sentiment analysis and lets it follow a line of questioning, so it can figure out context. For instance, if you asked your device for the population of a particular place, and then followed up by asking who was in charge there, the NLU could carry that context forward and give you the right answer.
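The population-then-leader example above can be sketched as a tiny dialogue manager that remembers the last place you mentioned, so a follow-up like "who is in charge there?" can resolve "there". The place names, facts, and resolution rules are all invented for illustration; real NLU systems do this statistically, not with hand-written lookups.

```python
# Toy sketch of context carrying in a conversation -- the "knowledge
# base" and the resolution logic are made up for this example.
FACTS = {
    "springfield": {"population": "30,000", "leader": "Mayor Smith"},
}

class Assistant:
    def __init__(self):
        self.last_place = None  # conversational context

    def ask(self, question):
        q = question.lower()
        # If the question names a known place, remember it as context.
        for place in FACTS:
            if place in q:
                self.last_place = place
        if self.last_place is None:
            return "Which place do you mean?"
        facts = FACTS[self.last_place]
        if "population" in q:
            return facts["population"]
        if "charge" in q or "leader" in q:
            return facts["leader"]
        return "I don't know that yet."

bot = Assistant()
print(bot.ask("What is the population of Springfield?"))  # 30,000
print(bot.ask("And who is in charge there?"))             # Mayor Smith
```

The follow-up question never says "Springfield", but the assistant answers anyway because the context was carried over from the previous turn. That's the whole trick NLU is reaching for.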
Neat, huh!