Every time you shop online, every time you sign up for a newsletter, or register on a website, or enquire about a new car, or fill out a warranty card, or buy a new home, or register to vote - you are unwittingly handing over a small clue as to who you are and how you behave.
— Hannah Fry
History is littered with examples of objects and inventions with a power beyond their professed purpose. Sometimes it's deliberately and maliciously factored into their design, but at other times, it's a result of thoughtless omissions.
A century ago the Spanish flu confounded scientists and devastated whole regions, but while today's society has air travel and an enormous, heterogeneous population, we also have antibiotics, fantastic communication networks and, perhaps most crucially, more data than ever.
I'm a mathematician. I can trade in facts about false positives and absolute truths about accuracy and statistics with complete confidence.
Our long-range predictions - especially those which anticipate extreme-weather events - rely on an assumption that the future will be similar to the past. Lose that, and we lose the tools that have allowed us to prepare for such eventualities.
Wind depends on temperature. Temperature depends on pressure. And pressure depends on wind. It's an intricate mathematical tapestry that is far too intertwined to unpick by hand.
All around us, algorithms provide a kind of convenient source of authority: an easy way to delegate responsibility, a short cut we take without thinking.
You can't assess the value of an innovation in isolation; you have to consider whose hands it's in.
When it comes to love, making long-term decisions is a risky business. Sooner or later, most of us decide to leave our carefree bachelor or bachelorette days behind us and settle down.
We literally hand over our most private data, our DNA, but we're not just consenting for ourselves, we are consenting for our children, and our children's children. Maybe we don't live in a world where people are genetically discriminated against now, but who's to say in 100 years that we won't?
Because for me, equations and symbols aren't just a thing. They're a voice that speaks out about the incredible richness of nature and the startling simplicity in the patterns that twist and turn and warp and evolve all around us, from how the world works to how we behave.
Human emotion isn't neatly ordered and rational and easily predictable.
We have this imbalance where the people who are making algorithms aren't talking to the people who are using them. And the people who are using them aren't talking to the people who are having decisions made about their lives by them.
I certainly think there are some skills we'll lose as we hand things over to automation. I can barely remember my own phone number now, let alone the long list of numbers I used to know, and my handwriting has completely gone to pot.
I do think we're at a point in our history where almost all of the big, grand challenges faced by the human race are those that demand a scientific solution: climate change; access to clean water; over-crowding; plastic waste.
I spend quite a lot of time thinking about how curated our information is. What we watch, what we read, what we buy, often who we talk to, is all shaped and influenced by some kind of a mathematical algorithm.
And anytime a programmer makes a decision about how to deal with data, how to average it or clean it, you're imparting more of your own bias on it.
As the law catches up and the battle between corporate profits and social good plays out, we need to be careful not to be lulled into a false sense of privacy.
At some point in the future - possibly the very near future - Britain will be hit by a deadly pandemic, and its impact could be utterly devastating.
In every community, there are a number of 'social super-spreaders' among us. Long-suspected and emphatically confirmed by our data, these are people who - by dint of their job, or lifestyle, or perhaps even genetic makeup - would be more dangerous in the instance of a pandemic than the average person.
On average, the higher the novelty score a film had, the better it did at the box office. But only up to a point. Push past that novelty threshold, and there's a precipice; the revenue earned by a film fell off a cliff.
The weather doesn't respect political or geographic boundaries: we're all living under the same sky. And so weather prediction has been a marvel not only of technology but also of international cooperation.
Because, ultimately, we can't just think of algorithms in isolation. We have to think of the failings of the people who design them - and the danger to those they are supposedly designed to serve.
In our urge to automate, in our eagerness to adopt the latest innovations, we appear to have developed a habit of unthinkingly handing over power to machines.
Technology on its own isn't good or evil.
One of the problems maths struggles with is that it's invisible. We haven't got explosions on our side.
In medicine, you learn about ethics from day one. In mathematics, it's a bolt-on at best. It has to be there from day one and at the forefront of your mind in every step you take.
The best couples, or the most successful couples, are the ones with a really low negativity threshold. These are the couples that don't let anything go unnoticed and allow each other some room to complain.
I don't like the algorithms that don't do what they claim they do.
You can harvest any data that you want, on anybody. You can infer any data that you like, and you can use it to manipulate them in any way that you choose. And you can roll out an algorithm that genuinely makes massive differences to people's lives, both good and bad, without any checks and balances.
Every criminal-justice system has to find some kind of balance between protecting the rights of innocent people falsely accused of crimes and protecting the victims of crimes.
When you don't have diversity in the creative process, you inevitably end up with a single, narrow perspective in the output.
Algorithms and data should support the human decision, not replace it.
When designing algorithms as a business owner, your incentive is your profit - something for your business - not to maximise something for the individual.
The invisible pieces of code that form the gears and cogs of the modern machine age, algorithms have given the world everything from social media feeds to search engines and satellite navigation to music recommendation systems.
But the threat of a pandemic is different from that of a nerve agent, in that a disease can spread uncontrollably, long after the first carrier has succumbed.
But for me, true art can't be created by accident. There are boundaries to the reach of algorithms. Limits to what can be quantified. Among all of the staggeringly impressive, mindboggling things that data and statistics can tell me, how it feels to be human isn't one of them.
Whenever we haven't got enough information to make decisions for ourselves, we have a habit of copying the behaviour of those around us.
No weather forecaster can tell you for sure when to wear a rain slicker, stock up on canned goods, or evacuate a city that's in a cyclone's path. All forecasters can offer is their best guess at the atmosphere of the future, whispered by the simulated blue marble and wrapped up in uncertainty.
If we permit flawed machines to make life-changing decisions on our behalf - by allowing them to pinpoint a murder suspect, to diagnose a condition or to take over the wheel of a car - we have to think carefully about what happens when things go wrong.
We should actively be thinking about what our inventions would look like if exploited by someone with less of a moral compass and decide if the world would really be better off with them in it.
We're living in an age where new technology offers gigantic upsides - artificial intelligence has the potential to diagnose cancer, catch serial killers and reduce prison populations.
The future doesn't just happen. We are building it, and we are building it all the time.
It's true that you can't take an individual rain droplet and say where it's come from or where it's going to end up. But you can say with pretty good certainty whether it will be cloudy tomorrow.
So my favorite online dating website is OkCupid, not least because it was started by a group of mathematicians.
But as soon as Facebook decided that they wanted to become purveyors of news, suddenly you have these highly personalized newsfeeds where everything is based on what your friends like, what you like, things that you've read in the past.
People are often quite lazy. We like taking the easy way out - we like handing over responsibility, we like being offered shortcuts that mean we don't have to think.
Once people see the problems that algorithms can introduce, they can be quick to want to throw them away altogether and to think the situation would be resolved by sticking to human decisions until the algorithms are better.
What seems obvious to one person wouldn't occur to another. Your perspective is hard coded into the work that you create.
Imagine life without any algorithms at all: you wouldn't be able to do anything. This is already completely encompassing. We have a habit of over-trusting what mathematicians or computer scientists tell us to do, without questioning it - too much faith in the magical power of analysis.