
This book would have been incomprehensible in the 20th century. While science fiction frequently depicted surveillance and government overreach, in those stories the abuses of power were the sole prerogative of states. Yet we find ourselves in a different predicament today, the one Laila Lalami’s book points to: a world of privatized surveillance and data-collection systems, revenue-generating “prisons,” botched evacuations in the wake of climate catastrophes, and racial profiling in airports.

In other words, the book is hardly “science fiction” at all: it hits too close to home; it is too real.

Science fiction is never about the future, although this is how we most often think about it. Rather, it is about the present, especially humanity’s relationship with science and technology. It reflects our anxieties and fears about the world.

The Dream Hotel is no exception. Laila Lalami’s novel tells the story of a young mother who is stopped at an airport and brought to a “temporary retention center” because an algorithm that predicts the likelihood of future crimes has labeled her a high risk. In theory, the “retained” are supposed to stay at the center for three weeks and then be released. In practice, bureaucratic inefficiencies, force majeure, and petty abuses of power ensure that most “residents” stay much longer: our protagonist is held for eleven months.

You could swap the fictional names for real ones, and the book would not read any differently. The company that mines people’s dreams (and is attempting to insert ads into them) may as well be Google, the company that manages the “retention centers” might be CoreCivic, and it isn’t hard to see Facebook in the company that runs the communication systems.

The world Lalami depicts is the logical conclusion of surveillance capitalism: it doesn’t matter whether we have committed crimes; it doesn’t even matter whether we have our own private thoughts. Whatever individuality we might have can be boiled down to a few thousand data points with predictive capacity. Our private drives and disputes can be used against us, and the system looks only for data that confirms its pre-existing conclusions.

Is this reality? I suspect that it is not, but it is the vision of tech bros everywhere.

Such a realization, more than anything else, is chilling.