Daily news reports about how the markets have reacted to specific events are now commonplace. The fake tweet saying Obama had been hurt in an explosion shook the markets, as did the Boston bombing. The reaction was instant: millions were lost. The trouble is, most trades on the stock market are now conducted by algorithms running on super-computers. It’s not human beings making the decisions any more; A.I. is being used to both read and write the news, which then feeds and directs the A.I. economy, over which we have little control.
The A.I. algorithms are constantly mining the Web, to analyse social media, blogs, and real-time news feeds, to predict the markets; in fact, they price the news.
The computers have been programmed to interpret the news, as quickly as possible, in order to get the edge on trading. The race is on – to find the latest technology to gain a microsecond advantage over competitors. Drones and algorithms are now used to source and write the news reports, but could it go even further? Could some of our daily news be covering events that were planned to happen, just to get that edge?
The algorithmic ecosystem
Wall Street has changed a lot: the shouting commotion of the stock market has been replaced with the eerie hum of hundreds of computers, housed in large, temperature-controlled rooms. Each computer is running proprietary algorithms, trading at lightning-fast speed.
The algorithms these computers run are so fast that no human can follow what they do in real time. We’re talking thousands of trades per minute.
We’re talking very clever computers that can understand the news and communicate with each other. It’s a game: bids are made, and then retracted, hundreds of times per second, in an effort to fool the other algorithms.
They also react to each other, so there are algorithms designed to fool other algorithms (and create noise in the system), and yet others, called sharks, which hunt them down. Then of course there are still more algorithms that look for the sharks, and so it goes on. It’s wild in there. The law can’t touch it, because it’s too complicated.
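The bid-and-retract game can be sketched as a toy model. This is an illustrative simulation only, not any real exchange’s behaviour: orders are placed on a book and cancelled before anyone can hit them, leaving nothing but noise.

```python
import itertools


class Book:
    """A toy limit-order book holding only resting bids."""

    def __init__(self):
        self.bids = {}                 # order id -> (price, quantity)
        self._ids = itertools.count()  # monotonically increasing order ids

    def place(self, price, qty):
        oid = next(self._ids)
        self.bids[oid] = (price, qty)
        return oid

    def cancel(self, oid):
        self.bids.pop(oid, None)


def flicker(book, price, qty, rounds):
    """Place and immediately cancel a bid, over and over.

    Mimics the pattern described above: hundreds of bids per second
    that are never intended to trade. Returns the number of bids
    actually left resting on the book (zero, by construction).
    """
    for _ in range(rounds):
        oid = book.place(price, qty)
        book.cancel(oid)
    return len(book.bids)
```

Run `flicker(Book(), 100.0, 10, 500)` and the book ends empty: five hundred bids were shown to the market, none of them real.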
The very best physicists and mathematicians are the only ones who understand the algorithmic programs they’ve devised (they’re known as ‘quants’). The ones with the most money have the best chance of coming up with the most advanced, and capable, algorithms which trump the rest, and of employing specialist “high frequency traders on news (HFTNs) … to collect, process and exploit news faster than other market participants.”
According to the New York Times,
Math-loving traders are using powerful computers to speed-read news reports, editorials, company Web sites, blog posts and even Twitter messages — and then letting the machines decide what it all means for the markets.
The development goes far beyond standard digital fare like most-read and e-mailed lists. In some cases, the computers are actually parsing writers’ words, sentence structure, even the odd emoticon. A wink and a smile — 😉 — for instance, just might mean things are looking up for the markets. Then, often without human intervention, the programs are interpreting that news and trading on it.
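The kind of machine reading the New York Times describes can be sketched in a few lines. The word lists, weights, and the mapping to a trading signal below are invented for illustration; real systems use far richer lexicons and models.

```python
# Toy news-sentiment trader: score a headline by counting positive and
# negative tokens (including the odd emoticon), then map the score to a
# trading signal. All vocabulary here is an illustrative assumption.

POSITIVE = {"rally", "surge", "beat", "upgrade", ";)", ":-)"}
NEGATIVE = {"panic", "rumour", "explosion", "downgrade", "miss"}


def score_headline(headline: str) -> int:
    """Return +1 (net positive), -1 (net negative) or 0 (neutral)."""
    tokens = headline.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return (score > 0) - (score < 0)


def decide(headline: str) -> str:
    """Turn the sentiment score into an order decision."""
    return {1: "BUY", -1: "SELL", 0: "HOLD"}[score_headline(headline)]
```

So a wink in a headline really can tip the balance: `decide("Markets surge after earnings beat ;)")` comes back `"BUY"`, while `decide("Explosion rumour sparks panic")` comes back `"SELL"`, with no human anywhere in the loop.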
Trading on the news
The key issues with high frequency trading are speed, specialist knowledge, and the money to pay for super computers and privileged services.
It takes 500,000 microseconds for a person just to click a mouse, but in this game traders are paying through the nose to gain even a microsecond advantage over competitors, because being only five microseconds behind the leaders means losing. This has forced traders to ‘co-locate’ to be near the exchange; the shorter the cable, the better. It’s all about, “the time it takes to transmit the trade orders to the electronic exchange. Speed to market is a major competitive advantage. The HFT that gets a trade to the exchange a microsecond faster than a competitor wins.”
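The arithmetic behind the obsession with cable length is simple physics: light in optical fibre travels slower than in free space, so every extra kilometre of cable costs microseconds. A quick sketch (the refractive index of ~1.47 is a typical figure for fibre, assumed here):

```python
C = 299_792_458            # speed of light in vacuum, metres per second
FIBRE_INDEX = 1.47         # typical refractive index of optical fibre (assumption)


def one_way_latency_us(distance_km: float, n: float = FIBRE_INDEX) -> float:
    """Microseconds for a signal to traverse `distance_km` of fibre.

    Signal speed in the medium is C / n, so latency = distance / (C / n).
    """
    return distance_km * 1000 / (C / n) * 1e6


# Each kilometre of fibre costs roughly 4.9 microseconds one way,
# which is why co-locating inside the exchange itself is worth millions.
per_km = one_way_latency_us(1.0)
```

Microwave and laser links (as offered by Anova) matter because signals through air travel at close to `n = 1`, shaving roughly a third off the fibre time over the same distance.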
Millions have been spent on acquiring sites which are close to the NYSE, but now the leaders of the pack are actually housed in the stock exchange itself. The latest thing is to employ new technology to cover the distance between stock exchanges.
Getting the news
Another way to acquire a competitive edge in the markets is to be the one with access to the primary source for the news, which comes from private data providers, and includes press releases, government statistics, and business news. A two-tiered system has emerged, where certain companies can pay to get their news faster than (almost) everyone else.
The potential beneficiaries include news delivery services such as Thomson Reuters, Bloomberg and News Corp. unit Dow Jones & Co., the publisher of Dow Jones Newswires and The Wall Street Journal. Also on the winning side: data center operators such as AT&T Co. and the trading firms themselves.
A company called Anova Technologies provides the latest technology to push the speed of news transmission towards its physical limit; aspiring to supply “the end-game route”, Anova uses millimetre-wave microwave links to bring the distance covered down to “just 2.1 kilometres longer than the theoretical possible set by the curvature of the earth.”
The company has now teamed with AOptix to offer a hybrid service which incorporates laser technology, to ensure the fastest speeds possible, whatever the weather.
Reporting the news
Drones are also being exploited for news; there are plans to use them as microwave relay stations to span the Atlantic, as well as to acquire news in the first place. Mini spies will buzz in the skies, to gather and transmit data before others can. Drone journalism is now being taught at the University of Missouri’s School of Journalism, where students learn how to fly drones to carry out investigative journalism. The university has partnered with local NPR affiliate KBIA, which contributed $25,000 to build the drones for the course.
There is also a Drone Journalism Lab at the University of Nebraska-Lincoln, which, “teaches students how to operate unmanned autonomous vehicles (UAVs) and how to interpret the footage and the ethics of using the military robots, as well as FAA regulations”. The idea, as with all drone use, is that it is safer than sending in a human.
It’s also cheaper for the news agencies than flying helicopters and paying staff. Besides, according to Harley Geiger, a policy attorney with the Center for Democracy and Technology in Washington D.C.,
Some of the privacy issues that we see with drones are very different than the sort of surveillance that can be conducted with a helicopter. Drones can quietly watch an entire town without refueling. They can conduct a pervasive and secret surveillance that helicopters cannot match.
(Watch out for the nano humming birds, funded by DARPA! They can fly inside or outside, forwards, backwards, or hover.)
Understanding the news
Another important aspect to creating the news is understanding the sentiment of the market; advertisers, researchers, governments, corporations, law enforcers, and the military, regularly mine social media and news articles to gauge the overall mood of the public, and therefore how they will react to a given stimulus. Thomson Reuters News Analytics uses Lexalytics’ natural language processing system to process and score text for:
- Author sentiment
- Volume analysis
- Headline analysis
Sentiment analysis is fundamental to understanding the markets; it has always been the mood of the markets which drives trade on the stock exchanges. But now, sentiment analytics can be used to track and predict the Dow Jones on a daily basis with 88% accuracy; in other words, there is a strong correlation between the two.
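What “a correlation between the two” means can be made concrete with the standard Pearson coefficient. The series below are toy numbers invented purely to show the calculation; the published studies behind the 88% figure use far more sophisticated pipelines.

```python
from math import sqrt


def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series.

    Returns a value in [-1, 1]: +1 means the series move perfectly
    together, -1 perfectly opposite, 0 no linear relationship.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Toy illustration: a daily aggregate sentiment score against the
# next day's index move. Values are made up to show the arithmetic.
sentiment = [0.2, 0.5, -0.1, 0.8, -0.4, 0.3]
index_move = [0.1, 0.4, -0.2, 0.6, -0.5, 0.2]
r = pearson(sentiment, index_move)  # close to 1.0: strongly correlated
```

When `r` sits near 1.0 day after day, the sentiment feed becomes a predictive signal, which is exactly what the trading algorithms are buying.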
Companies such as Lexalytics can analyse data for content and meaning, looking for keywords in headlines, such as ‘rumour’, or ‘panic’, and especially for information released by the government, police, and corporations. However, texts can be ambiguous, so programs are written to cope with sarcasm and other nuances. They’re even able to analyse public statements by corporate bosses, and check them for honesty! Thomson Reuters MarketPsych Indices (TRMIs) track and quantify attitudes, tones, themes, and other psychological aspects.
Plans are now afoot to write programs which can “automatically digest broadcast and closed-caption television.”
Of course, these techniques may not necessarily be used in isolation – Lexalytics was put to use to track activists at the NATO summit, building identity profiles of individuals by combining “realtime social media monitoring with sentiment analysis”.
Crafting the news
When attributes of an identity are gathered through multiple forms of surveillance, they begin to tell a story about that person; in the same way, elements of news data can be reformulated as a story, or news article. Meaning is extracted, and crafted to form a new narrative. This is the art of storytelling, turned into a science by the likes of DARPA, which has been very busy uncovering the universal properties of narratives, with the aim of promoting preferred ideologies within given cultural populations. They’re interested in how people react psycho-chemically and behaviourally to the news they hear, the stories they’re told, and how these can be ‘shaped’ so as to control societies or individuals.
Stories have been found to influence people’s emotions and cognition; their decision making, memory, and reasoning, and the formation of identity. The perspective from which a story is narrated is also crucial. Stories can trigger neuro-chemical responses, which then influence the behaviour of an individual.
The release of dopamine, for instance, can be triggered by a story, and is linked to the reward mechanism in the brain. Oxytocin is another commonly triggered neurochemical, linked to the modulation of trust. Telling a story right could mean ‘winning hearts and minds’:
There is some emerging work being done on how Tweets can create a rise in oxytocin release based on message content. This kind of work could be critically important to understanding how groups form and how we can influence their formation.
The logical structure (i.e. plot) of narratives has been found to have universal patterns, typically following Freytag’s triangle, of beginning, middle, and end. Scientists are able to map these patterns as ‘narrative networks’ on computers, and manipulate their elements in such a way as to exploit their effects, such as carrying a message or moral. The reactions of the viewer are mostly subliminal; you don’t even know it’s happening.
News articles are just one type of ‘story’, as are the stories we tell each day to recount our experiences, and to make sense of the world. The art of storytelling is also being used by marketers to bring life to brands, and by policy makers seeking to influence opinion.
Narrative networks can be modelled using universal plot lines and even characterful expressions, which can then be used to generate reports from news data fed to them in real-time. So not only are there programs which can read the news, there are programs which can actually write the news. No human involved, just data being processed and re-hashed as articles for humans to read, and traders and algorithms to react to.
Narrative Science is a company which can generate content in article format from large data sets; in fact, the company offers the service as a way for corporations to make sense of the mass of information now available. Re-formulated as a story, the data is manageable and therefore rendered as actionable intelligence, rather than a big confusing mess.
Narrative Science is able to manufacture tailored stories from any given data set:
Imagine if you could create stories with amazing quality and at a scale and speed currently not attainable using people alone, then imagine creating multiple versions of the same story, with each story’s content customized for different audiences and tailored to fit a particular voice, style and tone. Our system provides publishers with an innovative and cost-effective solution for creating high-quality, timely stories. Our technology frees up existing editorial resources to focus on stories that either require deeper investigation, or to create stories that would not otherwise be written.
We can create content on just about any topic including financial, sports, real estate, politics and more.
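Under the hood, data-to-text generation can begin as something as simple as filling a narrative template from structured data. The sketch below is an invented toy, not Narrative Science’s actual (proprietary) system; the function name and wording are assumptions.

```python
def earnings_story(company: str, revenue_m: float, prior_m: float) -> str:
    """Turn two data points into a one-sentence 'news article'.

    A trivial template-based generator: compute the quarter-on-quarter
    change, pick a verb, and slot the numbers into a sentence.
    """
    change = (revenue_m - prior_m) / prior_m * 100
    direction = "rose" if change >= 0 else "fell"
    return (f"{company} reported revenue of ${revenue_m:.0f}M, "
            f"which {direction} {abs(change):.1f}% from the prior quarter.")
```

Feed it `("Acme", 125, 100)` and out comes a publishable sentence reporting a 25.0% rise. Swap the template and the same data yields a different “voice, style and tone”, which is precisely the customisation the company advertises; and each generated sentence is itself fresh input for the headline-reading algorithms upstream.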
When asked what percentage of news would be written by computers in 15 years, Kristian Hammond from Narrative Science sighed and said, “More than 90 percent.”
(“When you control the narrative, you control the conversation”, Rick Horowitz)
Making the news?
‘Buy the rumour, sell the news’ sums up a well-known trading strategy, handed down from Richard Wyckoff in the 1930s, and refers to the ability to profit from ‘rumour’ headlines and timely action in the stock market.
Meanwhile, the algorithms themselves “… evolve and rebuild themselves. They test new strategies and if specific profit metrics are met and the strategy stays within the designated risk boundaries, an algo made last year could be completely unrecognizable today.”
Like the other global ecosystems, it has gotten too complex, and that makes it weak.
Now that “flash crashes are only a tweet away”, news management has reached its most critical stage.
A.I. news and the A.I. economy make up a global, digital, trackable ecosystem which can be manipulated (whatever the currency) using the tools of complexity science, but no-one can watch the algorithms, and, as long as there’s competition, the game is on. The possibility therefore remains that some might want to go one step further, and actually create the news in the first place. After all, you can’t beat that, can you?
FURTHER INFORMATION AVAILABLE IN THESE VIDEOS:
TEDxNewWallStreet – Sean Gourley – High frequency trading and the new algorithmic ecosystem: http://www.youtube.com/watch?v=V43a-KxLFcg
Kevin Slavin: How algorithms shape our world:
TEDxConcordia – Yan Ohayon – The Impact of Algorithmic Trading:
This article first appeared at Get Mind Smart
Julie Beal is a UK-based independent researcher who has been studying the globalist agenda for more than 20 years. Please visit her website, Get Mind Smart, for a wide range of information about Agenda 21, Communitarianism, Ethics, Bioscience, and much more.