You do realize that a programmable entity dependent on a central power source is unlikely to gain consciousness the way a human does, right? No matter how advanced the AI gets, it's still algorithms picking up common themes and producing similar responses.
And I am not sticking with Matthew if the world goes to Hell. Matt will take off with his military buddies, and I am not built for that life. I will stay in a central hub with the intellectuals, trying to salvage information and education to redistribute to whoever survives the first two years.
I have a feeling that just puts eyes on Sonya. And she needs to make it through college before Mother starts marrying her off like something out of a Jane Austen novel.