This is a great, easy read by an experienced author. And for anyone eagerly watching what's happening with technology, and AI in particular, it's really thought-provoking.

As the author says, it's human nature to build trust with other human beings. And our default as a species is to look for honesty, empathy, loyalty and great service. How far can AI go in emulating what is instinctive and important to us? Is there a risk that, over time, we could simply be 'programmed' to evolve beyond our instinctive attachment to these values and needs?

I'd like to think that there's room for both man and machine in our future, and there's plenty of evidence already that this is a reasonable expectation. After all, in response to just about every technological innovation, from the Industrial Revolution through modern history, man has flourished, and jobs and economies have evolved and thrived. Adaptation is something that, as a species, we're pretty good at. So it seems a little melodramatic to anticipate that AI, as just the latest and no doubt not the last of the technological innovations for which mankind is responsible, should suddenly send us into an economic and existential meltdown.