- Daily Update from Securities Docket
Investors Now Using AI to Analyze Executives' Speech Rate, Tone, Pitch During Earnings Calls
Plus the SEC says companies should disclose AI plans, risks and investments.
Good morning! Here’s what’s up.
Contest Winner
Yesterday’s contest was to identify the person depicted in the AI-generated photo below, which appeared in a recent article. As a clue, I noted that the person was relevant to the securities enforcement world and had been mentioned in this newsletter before. The first to email me with the correct answer won their choice of “scandal ware” mugs from our collection.
Six people emailed me a guess, and all six got it right: John Reed Stark. The first correct answer came from Andrew Feller, so congrats to Andrew—your Madoff mug is on the way!
Here is the article that “AI John Reed Stark” appeared in:
By the way, here is a real photo of Stark from his Twitter bio:
People
Russell Fecteau, a former senior enforcement attorney at FINRA, has joined Davis Wright Tremaine LLP as Of Counsel in Washington, D.C.
Clips ✂️
Investors use AI to glean signals behind executives’ soothing words
A contentious US$8 billion (A$12.6 billion) takeover of cancer screening business Grail had prompted a campaign by activist investor Carl Icahn, fights with competition authorities on both sides of the Atlantic, and criticism from Grail’s founding directors.
Mr deSouza told analysts the drama was only affecting “a very small part of the company”.
But each time he was asked about Grail, there were shifts in his speech rate, pitch and volume, according to Speech Craft Analytics, which uses artificial intelligence to analyse audio recordings. There was also an increase in filler words like “um” and “ah” and even an audible gulp.
***
Mr deSouza resigned less than two months later.
The idea that audio recordings could provide tips on executives’ true emotions has caught the attention of some of the world’s largest investors.
Many funds already use algorithms to trawl through transcripts of earnings calls and company presentations to glean signals from executives’ choice of words – a field known as “natural language processing” (NLP). Now they are trying to find further messages in the way those words are spoken.
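To make the transcript side of this concrete, here is a toy sketch of one signal the article mentions: the rate of filler words (“um,” “ah”) in an executive’s answers. This is an illustration only, not any fund’s actual model, and the sample answers are invented.

```python
import re

# Single-word fillers of the kind the article flags ("um", "ah").
FILLERS = {"um", "uh", "ah", "er"}

def filler_rate(answer: str) -> float:
    """Return filler words per 100 words of a transcript answer."""
    words = re.findall(r"[a-z']+", answer.lower())
    if not words:
        return 0.0
    fillers = sum(1 for w in words if w in FILLERS)
    return 100.0 * fillers / len(words)

# Hypothetical answers: a routine one versus one on a sensitive topic.
baseline = filler_rate("Revenue grew nine percent and margins expanded.")
stressed = filler_rate("Um, the, ah, situation is, um, affecting a very small part of the company.")
print(baseline < stressed)  # a rise in filler rate is the kind of shift flagged
```

Real systems layer many such features (pitch, volume, speech rate from the audio itself) and compare each executive against their own baseline rather than a fixed threshold.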
👉 Matt Levine observes here that AI robots have been reviewing earnings call transcripts for a while now, but that the “listening” element is new. His solution for companies in this cat-and-mouse game with AI:
“One move is to get acting training. But clearly the better move, for executives, is:

1. Get a chatbot to write your earnings presentation, and responses to analyst questions.
2. Get a robot to read it for you in a soothing and convincing way.
3. Then the investors’ robots will listen to it and be like “oh yes this CEO is very good, very confident, I like the cut of this CEO’s jib.”

Just cut out the human element everywhere.”
AI Plans, Risks Should Be Clear to Investors, SEC Official Says
The SEC is open to issuing disclosure guidance specific to artificial intelligence, but for now companies should give investors a window into how the business relies on or invests in the emerging technology, an agency official said Monday.
“It’s powerful I know to sometimes just use buzzwords, but it’s even more powerful to give the information to investors so they can understand and appreciate how it’s being used,” said Lindsay McCord, chief accountant of the Division of Corporation Finance at the Securities and Exchange Commission, in remarks during a Financial Executives International online conference.
👉 McCord also reportedly stated in her remarks that “SEC officials are looking for robust disclosures from companies this annual report filing season detailing the threat to their businesses posed by interest rates that remain at historic highs.”
Artificial Intelligence, the SEC, and What the Future May Hold
Given the current and proposed regulatory framework, it is vital for broker-dealers and investment advisers to have a firm understanding of the AI tools they use and then implement appropriate policies and procedures for those AI tools. Firms should not wait to assess their use of AI, including future use, and put guardrails in place to ensure customers are protected and the firms satisfy all regulatory expectations.
Firms should begin by assessing what AI technology they are actually using or plan to use. After this is complete, assessing whether such use presents any conflicts of interest, potential customer harm, or violation of applicable rules and regulations is recommended. Firms should also consider keeping an inventory of all the AI applications they use, the risks posed by each AI application, and mitigating controls to address each AI-related risk.
Next, firms should implement and periodically review their written policies and procedures to address AI governance and the regulatory risks posed by AI. Any existing policies and procedures may be similarly enhanced to address conflicts of interest related to AI, potential customer harm, and potential regulatory violations. For example, firms may determine to be deliberate and intentional about their use of any new AI systems, explicitly requiring review and assessment of such AI before personnel are permitted to use it. Further, supervision by cross-function teams and periodic testing is also helpful to understand how the AI systems are performing.
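The inventory the authors recommend can be as simple as a structured record per AI application, pairing each identified risk with its mitigating control so gaps are visible. A minimal sketch, with entirely hypothetical tool and risk names:

```python
from dataclasses import dataclass, field

@dataclass
class AIApplication:
    """One row in a firm's AI inventory: the tool, its use, and its risks."""
    name: str
    use_case: str
    # Maps each identified risk to its mitigating control ("" = none yet).
    risks: dict[str, str] = field(default_factory=dict)

    def unmitigated(self) -> list[str]:
        """Risks recorded without a mitigating control in place."""
        return [risk for risk, control in self.risks.items() if not control]

inventory = [
    AIApplication(
        name="TradeChat-Summarizer",  # hypothetical tool
        use_case="summarizing client communications",
        risks={
            "hallucinated client instructions": "human review before action",
            "PII leakage to vendor": "",  # control not yet in place
        },
    ),
]

for app in inventory:
    print(app.name, "open risks:", app.unmitigated())
```

Periodic review of the inventory then becomes a concrete supervisory task: any application with open risks is flagged for the cross-functional team the article describes.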
If you created a bitcoin wallet before 2016, your money may be at risk
After a tech entrepreneur and investor lost his password for retrieving $100,000 in bitcoin and hired experts to break open the wallet where he kept it, they failed to help him. But in the process, they discovered a way to crack enough other software wallets to steal $1 billion or more.
On Tuesday, the team is releasing information about how they did it. They hope it’s enough data that the owners of millions of wallets will realize they are at risk and move their money, but not so much data that criminals can figure out how to pull off what would be one of the largest heists of all time.
Are firms prepared for wave of messaging fines headed for Britain?
The last few years have seen market regulators dole out record numbers of fines to financial institutions over their failure to adequately monitor employee communications on instant messaging applications like WhatsApp. According to Securities and Exchange Commission (SEC) enforcement division chief Gurbir Grewal, the regulator has filed charges against 40 financial firms and imposed more than $1.5bn (£1.2bn) in fines for such failures since December 2021.
While there is no denying certain regulators have cracked down hard on firms for their messaging practices, this may only be a precursor to much more significant action over the coming years – particularly for firms headquartered in the UK. So far, regulators in Britain have paid little attention to the behaviour of traders conducting business through text and similar platforms that evade regulatory oversight. This is despite the use of instant messaging apps growing increasingly common among staff following the outbreak of Covid-19, when traders were forced to work and communicate via different channels. However, it seems this may be about to change, and firms operating in the UK must ensure they are well prepared for a tidal wave of investigations as soon as 2024.
Guest Post: CISO Liability in Focus: SEC Enforcement, Insurance, and [Personal] Risk Mitigation
After years of relatively gentle guidance when it came to disclosing cyber risk and cyber breaches, the SEC signaled that the kid gloves had come off when it proposed and ultimately adopted its new cyber disclosure rules.
The SEC had also already been signaling its changing enforcement posture, including a 2021 penalty of $500,000 it imposed on American Title Company and a $1 million penalty on Pearson plc for disclosure issues related to cyber events.
After all of this, no one should be surprised that the SEC is now making enforcement personal. It’s typical for the SEC to look for a particularly strong case to make its point. SolarWinds fits the bill.
***
Being a CISO is hard enough; these folks need to be able to sleep at night. Indeed, companies that take steps to protect their CISOs will, in the long run, have the most effective CISOs. Training a CISO on relevant corporate governance issues, making sure you have appropriate cyber insurance, and especially providing a CISO with an indemnification agreement and protection under the company’s D&O insurance program will increasingly become table stakes for talented CISOs. And these are, after all, exactly the people companies need to lead the charge when it comes to avoiding and mitigating devastating cyber catastrophes in the first place.
The so-called "moderate position" on crypto, that there are scams but there is an underlying technology with value, has been completely discredited by everyone from Nobel laureates to software engineers.
There is no moderate position anymore. It's asbestos of finance. Ban it.
— Stephen Diehl (@smdiehl)
7:17 AM • Nov 14, 2023
GOP riders to stop @SECGov rules face difficult legislative path @joejava210 @LxrPolicy @IAA_Today
— Mark Schoeff Jr. (@MarkSchoeff)
10:39 PM • Nov 9, 2023
Prediction on what the future of the office will look like, from a 1979 Xerox commercial.
— Historic Vids (@historyinmemes)
6:01 PM • Nov 13, 2023
Every finance and crypto podcast moving on from SBF trial
— Kyle S. Gibson (e/cult) (@KyleSGibson)
1:35 PM • Nov 14, 2023