Quant Macro Investing

Risk Taking Disciplined

Analyst Recommendations

From the Financial Times (March 14, 2010); click for the full article

…It found that when an average stock burdened with a consensus “sell” recommendation is given a “buy” rating, the underperforming price turns round and, after 100 days, the stock can be expected to outperform the market by about 2 per cent. And when a large broker issues the recommendation, the effect is almost half as much again.

Editing Assistant: Katherine Xu

 

March 17, 2010 Posted by | Uncategorized | Leave a comment

Four Ways of Looking at Twitter

Data visualization is cool. It’s also becoming ever more useful, as the vibrant online community of data visualizers (programmers, designers, artists, and statisticians — sometimes all in one person) grows and the tools to execute their visions improve.

Jeff Clark is part of this community. He, like many data visualization enthusiasts, fell into it after being inspired by pioneer Martin Wattenberg’s landmark treemap that visualized the stock market.

Clark’s latest work shows much promise. He’s built four engines that visualize that giant pile of data known as Twitter. All four basically search for words used in tweets, then look for relationships to other words or to other tweeters. They function in near real time.

“Twitter is an obvious data source for lots of text information,” says Clark. “It’s actually proven to be a great playground for testing out data visualization ideas.” Clark readily admits not all the visualizations are the product of his design genius. It’s his programming skills that allow him to build engines that drive the visualizations. “I spend a fair amount of time looking at what’s out there. I’ll take what someone did visually and use a different data source. Twitter Spectrum was based on things people search for on Google. Chris Harrison did interesting work that looks really great and I thought, I can do something like that that’s based on live data. So I brought it to Twitter.”

His tools are definitely in their early stages, but even now, it’s easy to imagine where they could be taken.

Take TwitterVenn. You enter three search terms and the app returns a Venn diagram showing the frequency of use of each term and the frequency of overlap of the terms in a single tweet. As a bonus, it shows a small word map of the most common terms related to each search term; tweets per day for each term by itself and each combination of terms; and a recent tweet. I entered “apple, google, microsoft.” Here’s what I got:

twittervenn.jpg

Right away I see Apple tweets are dominating, not surprisingly. But notice the high frequency of unexpected words like “win,” “free,” and “capacitive” used with the term “apple.” That suggests marketing (spam?) of Apple products via Twitter, i.e. “Win a free iPad…”.

I was shocked at the relative infrequency of “google” tweets. In fact there were on average more tweets that included both “microsoft” and “google” than ones that just mentioned “google.”
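The counting behind a TwitterVenn-style diagram is just term frequency plus co-occurrence frequency. Here is a minimal sketch in Python, using a hypothetical handful of tweets in place of the live Twitter feed (Clark’s actual implementation isn’t public, so this is only an illustration of the idea):

```python
from itertools import combinations

# Hypothetical sample of tweets; a real version would pull these
# from the Twitter search API.
tweets = [
    "win a free apple ipad today",
    "google buys another startup",
    "microsoft and google argue over search",
    "apple event rumors",
    "ugh microsoft update broke google sync again",
]

terms = ["apple", "google", "microsoft"]

# Count how many tweets contain each term on its own...
singles = {t: sum(t in tw for tw in tweets) for t in terms}

# ...and how many contain each pair of terms together (the Venn overlaps).
pairs = {
    (a, b): sum(a in tw and b in tw for tw in tweets)
    for a, b in combinations(terms, 2)
}

print(singles)  # circle sizes
print(pairs)    # overlap sizes
```

The single-term counts size the three circles and the pair counts size the overlapping regions; the real tool additionally tracks per-day rates and the most common neighboring words.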

So then I went to Twitter Spectrum, a similar tool that compares two search terms and shows which words are most commonly associated with each term and which words are most commonly used in tweets with both terms. Here’s the “google, microsoft” Twitter Spectrum:

twitterspectrum.jpg

I love that the word “ugh” is dead center between Google and Microsoft. But the prominence of social media terms on the blue side versus search terms on the red side is fascinating. It looks like two armies marching at each other ready to fight different wars.

Clark has also created TwitArcs. This one, I feel, is still a work in progress, and Clark says “visually I like it but it might be the least useful so far.” In this case, you type in a tweeter’s handle and it returns a stream of that person’s tweets with arcs that link common words between tweets (on the right) and common retweeters (on the left). Rolling your mouse over an arc highlights the last tweet in the arc. Here’s a TwitArc of @timoreilly:

twitarc.jpg

Finally, the Stream Graph. Enter a search term and Clark’s engine returns the frequency of the most common words found with your search term for the last 1,000 tweets. You see a literal flow of conversation. You can also highlight one term to see how its frequency changed over time and you’ll see the most recent tweets that include both your search term and that highlighted term.

Sometimes 1,000 tweets with your term may span weeks. For my search term, “Tiger Woods,” which I entered yesterday afternoon right after the news broke that he’d speak publicly, 1,000 tweets covered about 20 minutes. Here’s the “Tiger Woods” stream graph with “silence” highlighted:

streamgraph.jpg

It isn’t hard to imagine how this may be applicable to business. I can already see eager marketers watching the stream flow by as their commercial debuts during next year’s Super Bowl.

Clark, like many data visualizers, believes we’re on the front end of a revolution in information presentation. “There’s a lot of work done called scientific visualization or business intelligence graphics,” he says. “And it’s pragmatic, trying to solve practical problems. It’s all standard, a bar chart or pie. But those standard ways are not adequate when you’re trying to mine a richer data space. The world is full of complex data and we’re just starting to get the tools to make sense of it. We’re looking for new ways of presenting data.”

For the original file, please click here.

February 23, 2010 Posted by | Uncategorized | Leave a comment

Margin Debt & Stock Market Returns

I ran a little experiment to see how the growth or decline of NYSE margin debt correlates with stock market returns. Before conducting the experiment, I expected that high rates of margin debt growth would mark periods of speculative excess, and therefore result in low future stock market returns.

The average 1yr rate of margin debt growth on the NYSE since 1959 is 11.18%.

For my experiment, I calculated rolling forward 2yr cumulative returns on the S&P 500 for all periods since 1959. Next, I divided the periods since 1959 into ‘above average’ and ‘below average’ margin debt growth groups.
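The grouping described above can be sketched in a few lines of Python. The numbers here are made up for illustration; the post’s actual inputs are NYSE margin debt and S&P 500 history since 1959:

```python
# Each entry: (1yr margin debt growth at time t, forward 2yr S&P return from t).
# These figures are hypothetical placeholders, not the real series.
periods = [
    (0.05, 0.30),
    (0.25, 0.10),
    (0.10, 0.25),
    (0.40, 0.05),
    (0.02, 0.28),
]

def mean(xs):
    return sum(xs) / len(xs)

# Split periods into below- and above-average margin debt growth,
# then compare the average forward 2yr return of each group.
avg_growth = mean([g for g, _ in periods])
below = [r for g, r in periods if g <= avg_growth]
above = [r for g, r in periods if g > avg_growth]

print(f"average 1yr margin debt growth: {avg_growth:.2%}")
print(f"avg 2yr return after below-average growth: {mean(below):.2%}")
print(f"avg 2yr return after above-average growth: {mean(above):.2%}")
```

With the real data, the same split produces the three averages reported below.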

Here are the results:

1. Average 2yr stock market return after all periods: 22.97%

2. Average 2yr return after periods with below average margin debt growth: 23.27%

3. Average 2yr return after periods with above average margin debt growth: 18.72%

Bottom Line: the results illustrate a moderate-to-weak relationship between above average margin debt growth and below average future stock returns.

Incidentally, 2yr returns after margin debt grew by 40% and 60% were 6.97% and 6.7% respectively. This supports the thesis that above average margin debt growth leads to below average stock market returns. However, it also shows that the relationship is not linear since the stock market returns stopped declining by a meaningful degree after margin debt growth surpassed 40%. Further clouding the relationship, there also were periods (e.g. 1983) with very high margin debt growth and double-digit stock market returns.

How fast is margin debt growing today? For the 12 months ending December 2009, NYSE margin debt grew by 23.66%. But given the results of my experiment, I wouldn’t rely on margin debt growth to anticipate future market returns.

This is not advice. None of this information is guaranteed to be accurate and should not be relied on. Investing involves risk and you could lose all your money. Consult a financial advisor before making any investing decisions.

For the original file, please click here

February 23, 2010 Posted by | Uncategorized | Leave a comment

Copper/Gold Ratio

[New+Picture+(26).png]

For the link, please click here.

February 23, 2010 Posted by | Uncategorized | Leave a comment

Ultimate Guide To Becoming A Quant By Mark Joshi


December 10, 2009

Very interesting overview of the quant world; if nothing else, it will give you an overview of quant jobs and the lunacy (or brilliance, depending on your view) of Wall St.

Click Here To Read:  Ultimate Guide To Becoming A Quant

What sorts of quant are there?

(1) Front office/desk quant
(2) Model validating quant
(3) Research quant
(4) Quant developer
(5) Statistical arbitrage quant
(6) Capital quant

A desk quant implements pricing models directly used by traders. Pluses: close to the money, and opportunities to move into trading. Minuses: can be stressful and, depending on the outfit, may not involve much research.

A model validation quant independently implements pricing models in order to check that front office models are correct. Pluses: more relaxed, less stressful. Minuses: model validation teams can be uninspired and far from the money.

A research quant tries to invent new pricing approaches and sometimes carries out blue-sky research. Pluses: it’s interesting and you learn a lot more. Minuses: sometimes hard to justify your existence.

A quant developer is a glorified programmer, but well paid, and it’s easier to find a job. This sort of job can vary a lot: it could be coding scripts quickly all the time, or working on a large system debugging someone else’s code.

A statistical arbitrage quant works on finding patterns in data to suggest automated trades. The techniques are quite different from those in derivatives pricing. This sort of job is most commonly found in hedge funds. The return on this type of position is highly volatile!

A capital quant works on modelling the bank’s credit exposures and capital requirements. This is less sexy than derivatives pricing but is becoming more and more important with the advent of the Basel II banking accord. You can expect decent (but not great) pay, less stress and more sensible hours. There is currently a drive to mathematically model the chance of operational losses through fraud etc, with mixed degrees of success.

People do banking for the money, and you tend to get paid more the closer you are to where the money is being made. This translates into a sort of snobbery where those close to the money look down on those who aren’t. As a general rule, moving away from the money is easy, moving towards it is hard.

December 11, 2009 Posted by | Uncategorized | Leave a comment

Here’s How High Frequency Traders Dominate The Markets


Vince Veneziani|Dec. 7, 2009, 3:24 PM |

An interesting PDF concerning latency arbitrage in the world of HFT has been released by Themis Trading. In it, they explain just how HFT works, with computers doing all the heavy lifting, though that quote about 60% of daily volume is way off. I’ve heard reports of 3 to 5 percent:

Themis Trading: Here’s an example of how an HFT trading computer takes advantage of a typical institutional algo VWAP order to buy ABC stock:

1. The market for ABC is $25.53 bid / offered at $25.54.

2. Due to latency arbitrage, an HFT computer knows that there is an order that in a moment will move the NBBO quote higher, to $25.54 bid / offered at $25.56.

3. The HFT speeds ahead, scraping dark and visible pools, buying all available ABC shares at $25.54 and cheaper.

4. The institutional algo gets nothing done at $25.54 (as there is no stock available at this price) and the market moves up to $25.54 bid / offered at $25.56 (as anticipated by the HFT).

5. The HFT turns around and offers ABC at $25.55 or $25.56.

6. Because it is following a volume driven formula, the institutional algo is forced to buy available shares from  the HFT at $25.55 or $25.56.

7. The HFT makes $0.01-$0.02 per share at the expense of the institution.
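The arithmetic in the seven steps above is straightforward; a sketch per share, with a hypothetical order size (the share count is an assumption, not from the Themis example):

```python
# Per-share arithmetic of the latency-arbitrage example above.
hft_buy = 25.54      # HFT buys everything available at the old offer (step 3)
hft_sell = 25.55     # then re-offers inside the new 25.54/25.56 quote (step 5)
shares = 10_000      # hypothetical size of the institutional VWAP slice

per_share_profit = hft_sell - hft_buy
total = per_share_profit * shares
print(f"HFT profit: ${per_share_profit:.2f}/share, ${total:,.2f} total")
```

A penny per share looks trivial, but repeated across thousands of institutional orders a day it compounds quickly, which is the whole point of the strategy.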

It is currently estimated that HFT accounts for 60% of all share volume.

December 9, 2009 Posted by | Hi Freq Trading (HFT), Uncategorized | Leave a comment

Goldman Claims Momentum And Value Quant Strategies Now Overcrowded, Future Returns Negligible


Even as momentum buyers keep driving the market to new 2009 highs today on worse-than-expected ISM numbers (more Obamoney coming), none other than Goldman Sachs head of quant strategies Robert Litterman says that with everyone on the same side of the trade in momentum and value quant strats, the returns to these strategies are rapidly becoming negligible due to overcrowding. Of course, what happens when the crowds disperse is anyone’s guess, although if Obama had anything to say about it, the exit would be cool, calm and collected. Obviously it will be anything but. And very limited upside also means very unlimited downside. Yet let he who wants to fight the Marriner Eccles lunatics cast the first short.

More from Reuters:

Computer-driven hedge funds must hunt for new areas to exploit as some areas of making money have become so overcrowded they may no longer be profitable, according to Goldman Sachs Asset Management. Robert Litterman, managing director and head of quantitative resources, said strategies such as those which focus on price rises in cheaply-valued stocks, which latch onto market momentum or which trade currencies, had become very crowded.
Instead he said opportunities could come in areas such as event-driven strategies — which focus on special events such as mergers or restructuring — and catastrophe reinsurance, although he added they can just as quickly disappear.

He also pointed to credit, emerging markets, volatility trading and commodities.

That’s all we need: computers trading on event situations, where the first three letters of the headline will be sufficient to throw any thinly traded stock into a parabolic rise or drop. Whatever happened to good old fashioned humanitarian trading?

Yet isn’t it ironic that none other than Goldman, which most recently lost billions in August 2007 when the quants went haywire, should be warning about the dangers of overcrowded groupthink?

“You have to adapt your process,” Litterman said at the Quant Invest 2009 conference. “What we’re going to have to do to be successful is to be more dynamic and more opportunistic and focus especially on more proprietary forecasting signals … and exploit shorter-term opportunistic and event-driven types of phenomenon.” Computer-driven or quantitative hedge funds attempt to make money by quickly exploiting trends or anomalies in markets such as equities, government bonds or currencies.
However, some funds such as Goldman’s controlled a large share of some markets in summer 2007 and many were caught in a vicious circle of selling. “I think the world has fundamentally changed for quants,” he said, adding that his funds now allocate a greater share of assets to newer strategies since that crisis.
“We’re putting together data that’s not machine-readable, finding databases that haven’t been explored nearly as well as others, identifying linkages across companies and industries and finding patterns in the data that are not as well known.”

Yeah right, things are so different than August 2007, when insane parabolic melt ups each and every day were so unique and so completely different to those from… today. The poetic justice of Goldman’s trading P&L imploding after several “unique” haywire strategies end up losing the firm a cool couple of billion will be unsurpassed. Not only that, but the statistical garbage that is GS’ VaR will finally be exposed for the sham mathematical artefact it is. Until then we can all wait and hope.

December 7, 2009 Posted by | Uncategorized | Leave a comment

Barron’s Red Flags: Do They Actually Work?


Tim Loughran
University of Notre Dame

Bill McDonald
University of Notre Dame

November 20, 2009

Abstract:
Investors are often concerned that managers might hide negative information in the maze of mandated SEC filings. With advances in textual analysis and the availability of documents on EDGAR, individuals can quite easily search for phrases that might be red flags indicating aggressive accounting practices or poorly monitored management. We examine the impact of 13 suspicious corporate phrases identified by a recent Barron’s article in a sample of 50,115 10-Ks during 1994-2008. There is evidence that red flag phrases like “related party” and “unbilled receivables” signal a firm may subsequently be accused of fraud. At the 10-K filing date, phrases like “substantial doubt” are linked with significantly lower filing date excess stock returns, higher stock return volatility, and greater analyst earnings forecast dispersion.
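The kind of phrase search the abstract describes is easy to reproduce on a small scale. A sketch using a hypothetical 10-K excerpt and three of the red-flag phrases (the authors scan full filings downloaded from EDGAR; the text below is invented for illustration):

```python
import re

# Hypothetical 10-K excerpt; Loughran and McDonald scan full filings from EDGAR.
filing_text = """
The Company entered into a related party transaction with an affiliate.
There is substantial doubt about the Company's ability to continue as a
going concern. Unbilled receivables increased during the period.
"""

red_flags = ["related party", "unbilled receivables", "substantial doubt"]

# Case-insensitive count of each red-flag phrase in the filing text.
counts = {
    phrase: len(re.findall(re.escape(phrase), filing_text, flags=re.IGNORECASE))
    for phrase in red_flags
}
print(counts)
```

Run across thousands of filings, counts like these become the explanatory variables that the paper relates to subsequent fraud accusations and filing-date returns.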


http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1510188

December 7, 2009 Posted by | Uncategorized | Leave a comment

Thomson Reuters woos algo traders with machine-readable company events


01 December, 2009 – 11:01

Reflecting the growing influence of computer-driven trading strategies, Thomson Reuters has expanded its machine-readable news offering to include real-time analysis of company events.

The news and information group says the automated process will scan and automatically extract critical pieces of information from corporate announcements for clients to use in a machine readable format.

The information will be delivered in an XML format via Thomson Reuters NewsScope Direct, the firm’s low-latency news distribution platform. In addition, clients will have access to nearly seven years of historical data, enabling them to back-test their trading and investment strategies.

Initially, NewsScope Company Events will focus on US newswires before expanding to Canada and Europe.

The provision of market-moving news and information in a machine-readable format has become a key battleground for market data firms as increasing volumes of stocks are executed by computer algorithms.

Late last month, Deutsche Börse acquired machine-readable data specialist Need to Know News, in an effort to raise the profile of its market data and analytical offerings with algorithmic traders.

Thomson Reuters to offer company events data in real time

By Davis D. Janowksi
December 1, 2009

Advisers using the Thomson Reuters’ Quantitative and Event Driven Trading solution set will soon have one more morsel of data coming their way to help them gain an edge in their trading strategies.

Thomson Reuters today announced that it has expanded its machine-readable news offering to include real-time analysis of company events.

Within the company’s wide array of market data offerings, users will find these particular new points of information in the feeds from NewsScope Direct, which is delivered in XML format.

The company events data are being culled from U.S. company press releases, then scanned by computer.

Initial focus will be on US newswires. Thomson Reuters plans to expand its coverage to Canadian and European company news in coming weeks and months.

Pricing and availability specifics were unavailable at press time.

http://thomsonreuters.com/products_services/financial/financial_products/quantitave_event_driven_trading/

http://thomsonreuters.com/products_services/financial/financial_products/quantitave_event_driven_trading/high_frequency

December 7, 2009 Posted by | Uncategorized | 2 Comments

Google Zeitgeist Global Overview

Some pretty interesting graphs here:

• Google Zeitgeist Global Overview

• US Overview

Goog zeitgeist

December 2, 2009 Posted by | Uncategorized | Leave a comment