Are Internet Stocks Fairly Valued?

A White Paper

Introduction

        The purpose of this paper is to discuss the current theories surrounding the debate over whether Internet stocks are fairly priced or overvalued. We first provide background on the hypothesis that underlies capital market and valuation theories: the efficient market hypothesis. We then discuss the relative merits of linear valuation methods and hybrids thereof, with specific attention to the unprecedented use of inputs such as customer acquisition cost and number of page views. We also describe findings from non-linear modeling of securities markets that call into question not only traditional notions of what is reflected in shareholder value but also the very foundation of capital market theory, the efficient market hypothesis. Finally, we analyze what are fast becoming the new rules of the information economy, which may support prospects for future growth in a specific class of Internet companies that, by the standards of non-Internet companies, seem implausible.

Background

        One of the most pressing questions in analyzing capital markets today is: how does one explain the explosive growth in Internet stock prices? Theorists have modeled financial market behavior to predict stock price fluctuations using linear models such as stock and debt analysis, cash flow analysis, the direct comparison approach, and adjusted book value. These linear models rest on the efficient market hypothesis, which says that in equating supply and demand, the market aggregates the opinions of all investors, weighted by the amount they are willing to pay, to arrive at an average opinion (Cornell, 1993). But the efficient market hypothesis has been criticized for its failure to describe market behaviors such as patterns in stock prices, excess volatility in stock returns, and mispricing of securities. Moreover, these models were developed to analyze traditional industrial companies with real assets and real liabilities, and they do not completely capture the value of a company in markets where its main assets are intangible (i.e., intellectual capital). The gap between what appraisers say a company's stock price should be and the price at which the stock actually trades suggests either:

  1. The market is fairly predicting the value of these companies and the traditional models are failing to capture the true value of Internet companies, or
  2. The market is overvaluing these companies.

        The first response to the inconsistencies in the efficient market hypothesis can be traced to the 1970s (Sharpe, 1970; Fama and Miller, 1972), when theoretical dissension was signaled by the introduction of non-linear dynamics: the study of turbulence. Non-linear dynamics has existed since the early 1900s, starting with the work of the mathematician Henri Poincaré (1854-1912), who posited that "if a system consisted of a few parts that interacted strongly, it could exhibit unpredictable behavior." It was not until the 1980s, however, that it was applied to the quantitative analysis of capital markets (Lorenz, Feigenbaum, Ruelle). Models based on non-linear dynamics, such as fractals, fuzzy logic, and genetic algorithms, are starting to be used to describe and predict short- and medium-term market behavior (Trippi, 1995).

        Capital market theory based on the efficient market hypothesis attempts to make financial analysis neater by making the following assumptions:

  1. Investors are rational and risk averse, and they act on information as it is received, in a linear fashion.
  2. Returns are (approximately) normally distributed and independent: today's return is unrelated to yesterday's.
  3. Absent exogenous shocks, markets settle into a natural equilibrium between supply and demand.

        Dynamic systems analysis has shown inconsistencies in these assumptions, leading some researchers to believe that the efficient market hypothesis oversimplifies market dynamics. In their view, free markets are in reality "dynamic systems that exhibit aperiodic fluctuations" (Chorafas, 1994) and have far more dimensions of complexity than linear models recognize. The hyperactivity of Internet stocks, and the gaps in analysts' ability to explain the unusual prices, could be a signal that markets function on principles the efficient market hypothesis cannot reconcile. Thus, non-linear dynamics may call into question the underlying principles of contemporary capital market theory.


 
 

A Look at Conventional Valuation Methods

Adjusted Book Value

        One of the most straightforward methods for calculating the market value of a company is to examine its book value. The book value method relies on the assumption that the value of a company is the sum of the values of all the claims investors have on the firm (Cornell, 1993). Using the method involves consulting the balance sheet, summing the company's assets, and subtracting its liabilities. The book value method is attractive because of its simplicity relative to other methods; indeed, it is frequently cited in the press. But it also has many pitfalls. In this section, those deficiencies are described, along with methods for overcoming them.

        There are two applications of the book value method: the unadjusted book value and the adjusted book value. The unadjusted book value method accepts the book values of assets and liabilities as reported on the balance sheet. However, relying solely on the unadjusted book value is dangerous because it fails to take into consideration several important factors of the real economy: inflation, obsolescence, and organizational capital. First, in the case of inflation, the decline in the value of the U.S. dollar causes the assets on the unadjusted balance sheet to be undervalued. Second, in the case of obsolescence, an asset whose market value is less than its reported book value has often been supplanted (made obsolete) by next-generation technology, which can leave the company's market value below its book value. Third, and probably most important when considering Internet companies, the additional value that arises from the intangible assets of the company, its organizational capital, is not represented on the balance sheet. This includes such things as employees and the synergistic relations among them.

        Because the unadjusted book value method has proven deficient in accounting for market changes, methods exist for adjusting the historical book value of a company so that it more accurately reflects market value. The application of these adjustments is known as the adjusted book value method. Two common adjustments are replacement cost and liquidation value. Again, it should be noted that neither adjustment incorporates organizational capital.

        Adjusting the book value to reflect replacement cost substitutes current-day replacement costs for the book values of assets. The difficulty with this method is that direct replacements rarely exist. For example, a next-generation computer that supplants an older generation represents both a replacement and an upgrade, so the substitution cost includes the cost associated with the upgrade. Careful judgment must therefore be exercised when separating the components of the substitution cost.

        Adjusting the book value to reflect liquidation value restates the company's worth as the sum of the market values of its individual assets. The problem here is the same as for obsolescence: some assets of the company may not have a direct secondary market. The managers of the company must take care when pricing assets whose value cannot be measured directly from the market.
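        To make the mechanics concrete, the following sketch (in Python, using invented balance-sheet figures) computes an unadjusted book value and then applies hypothetical replacement-cost and liquidation adjustments. Note that none of the three resulting figures captures organizational capital.

    # Hypothetical balance-sheet figures in $ millions; illustration only.
    assets = {"cash": 120.0, "equipment": 300.0, "inventory": 80.0}
    liabilities = {"debt": 250.0, "payables": 60.0}

    # Unadjusted book value: assets minus liabilities, as reported.
    unadjusted_book_value = sum(assets.values()) - sum(liabilities.values())

    # Replacement-cost adjustment: substitute assumed current-day
    # replacement costs for the reported asset values.
    replacement_costs = {"cash": 120.0, "equipment": 220.0, "inventory": 95.0}
    replacement_value = sum(replacement_costs.values()) - sum(liabilities.values())

    # Liquidation adjustment: what the assets would fetch on a secondary
    # market, typically below replacement cost.
    liquidation_prices = {"cash": 120.0, "equipment": 150.0, "inventory": 55.0}
    liquidation_value = sum(liquidation_prices.values()) - sum(liabilities.values())

    print(unadjusted_book_value, replacement_value, liquidation_value)
    # -> 190.0 125.0 15.0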

        Given the deficiencies that exist in both the unadjusted and the adjusted book value methods, when should the approach be applied? The answer is in industries where the value of the company lies largely in its tangible assets; this commonly includes regulated industries such as water, gas, and electric utilities. One should therefore conclude that the book value method (both unadjusted and adjusted) is not generally a valid indicator of the value of a company's stock. In particular, it should not be used to value companies, such as Internet companies, where a large portion of market value derives from organizational capital.

Direct Comparison Approach

        The Direct Comparison Approach (DCA) is used to establish a price for privately held companies, either at the time of acquisition or at the time of an initial public offering. The principle of the DCA is that "similar assets should sell at similar prices" (Cornell, 56).

        The DCA estimates the value of a target company using available data on observable variables from both the target company and a sample of publicly traded comparable companies; in addition, a value indicator (here, share price) is needed from the latter. The comparison is made by taking the ratio of the value indicator to the observable variables of the comparable companies and multiplying it by the observable variables of the company being appraised. The critical assumption is that these ratios are approximately equal for the comparable companies and the target company.
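        A minimal sketch of the DCA arithmetic, using invented figures and revenue per share as the observable variable:

    # Hypothetical figures; illustration only.
    comparables = [(40.0, 8.0), (33.0, 6.0), (27.0, 5.4)]  # (share price, revenue per share)
    target_revenue_per_share = 5.0

    # Critical DCA assumption: the price-to-revenue ratio observed for the
    # comparable companies carries over to the target company.
    ratios = [price / revenue for price, revenue in comparables]
    average_ratio = sum(ratios) / len(ratios)               # about 5.2x

    target_value_estimate = average_ratio * target_revenue_per_share
    print(round(target_value_estimate, 2))                  # about $25.83 per share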

        By this method, if the stock prices of the sample of comparable companies are overvalued, the outcome will be an overvalued stock price for the target company as well; the same holds if the comparables are fairly valued. The method therefore provides no support to either side of the question "are Internet stocks overvalued?" because its underlying assumption already presumes one of the two answers, and it is consequently biased. The Direct Comparison Approach does not help to resolve the problem in question.

Discounted Cash-flow approach

        On October 1, 1974, The Wall Street Journal published an editorial lamenting the fact that many people did not use discounted cash flow (DCF) analysis to value a company. The DCF methodology estimates the positive and negative cash flows that a company will generate in the future and discounts these future cash flows to today's value. Traditionally, DCF is based on expectations of future sales and a future discount rate.
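        A minimal sketch of the mechanics, with invented cash flows and an assumed 12 percent discount rate:

    # Hypothetical projected free cash flows in $ millions for years 1
    # through 5, and an assumed discount rate; illustration only.
    cash_flows = [-50.0, 10.0, 40.0, 70.0, 90.0]
    discount_rate = 0.12

    # Discount each future cash flow back to today's value; the negative
    # first-year flow models a company still in its pre-profit stage.
    present_value = sum(
        cf / (1 + discount_rate) ** year
        for year, cf in enumerate(cash_flows, start=1)
    )
    print(round(present_value, 2))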

        Given the amount of debate that surrounds how Internet companies are valued, it is worthwhile to look at a hybrid of the DCF model. For example, an Internet company like AOL can be valued by placing a monetary value of $15,000 on each subscriber (WSJ, April 16, 1999), suggesting that the stock price will increase as more subscribers sign up. But if AOL increases market share by slashing subscriber fees, value might decline rather than increase.

        The essential problem with the dollars-per-subscriber approach used by many industry analysts is that it does not value what is directly important to investors. Investors cannot buy a house or a car with subscribers, nor can they use subscribers to make additional investments. Only the payouts generated by the business's cash flow can be distributed to shareholders or reinvested. The price-per-subscriber approach is useful only when the number of subscribers is a good proxy for cash flow, and this is not the case for Internet companies, where the cash flow generated by each customer changes almost continuously, even within the same company.
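        The point can be illustrated with a sketch that keeps the $15,000-per-subscriber figure cited above but invents two fee scenarios. The headline valuation rises with the subscriber count even as the cash flow behind each subscriber collapses:

    # $15,000 per subscriber is the WSJ figure cited above; subscriber
    # counts and cash flows are invented for illustration.
    value_per_subscriber = 15_000.0

    scenarios = {
        "high fee":    {"subscribers": 1_000_000, "cash_flow_per_sub": 120.0},
        "slashed fee": {"subscribers": 1_500_000, "cash_flow_per_sub": 20.0},
    }
    for name, s in scenarios.items():
        headline_value = s["subscribers"] * value_per_subscriber
        annual_cash_flow = s["subscribers"] * s["cash_flow_per_sub"]
        print(name, headline_value, annual_cash_flow)
    # The per-subscriber metric awards the "slashed fee" scenario a higher
    # value ($22.5 billion versus $15 billion) even though its total
    # annual cash flow is one quarter as large.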

        The previous discussion gives rise to the following question: what valuation tool is most consistent with the goal of long-term value creation? Of the valuation tools described above, the DCF is the most inclusive, specifically because it incorporates the expert opinion of appraisers on the company's prospects for growth and the degree of risk associated with the company, as reflected in the future discount rate.

        As an example of DCF analysis comparing Internet companies to a non-Internet company, we constructed a pseudo-discounted cash flow model (see Appendix 1). The key assumption of our evaluation is that forecasts for the growth rates are those available through Yahoo! Finance.

        In Table 1, the actual price per share as valued in the market ranges from 9.7 times (in the case of At Home) to 26.2 times (in the case of Ebay) the price predicted by the pseudo-DCF method.
 
Company                     Market Value   Pseudo-DCF Value   Delta      Market Value / Pseudo-DCF Value
America Online              $139.75        $14.12             $125.63    9.9x
At Home                     $144.94        $14.94             $129.99    9.7x
Compaq (control variable)   $23.63         $23.98             ($0.35)    1.0x
Ebay                        $176.00        $6.71              $169.29    26.2x
Yahoo                       $189.19        $18.45             $170.74    10.3x

Table 1: Comparison of Internet stock valuation

        We tested our results against a non-Internet company, Compaq Computer, and found that the value obtained with the pseudo-DCF model is almost the same as the one given by the market. This suggests that, judged by the DCF model, Internet stocks are overvalued.
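        The ratios in Table 1 can be reproduced directly from its per-share figures:

    # Per-share values taken from Table 1: (market value, pseudo-DCF value).
    prices = {
        "America Online": (139.75, 14.12),
        "At Home":        (144.94, 14.94),
        "Compaq":         (23.63, 23.98),   # non-Internet control
        "Ebay":           (176.00, 6.71),
        "Yahoo":          (189.19, 18.45),
    }
    for company, (market, dcf) in prices.items():
        delta = market - dcf
        multiple = market / dcf
        print(f"{company}: delta ${delta:.2f}, {multiple:.1f}x")
    # Compaq comes out near 1.0x, while the Internet stocks trade at
    # roughly 10x to 26x their pseudo-DCF values.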
 
 

New Metrics for Measuring the Value of Internet Companies

        In today's Internet stock market, some analysts believe that many of the conventions used in valuation have been made obsolete by the innovative business models of Internet companies. Metrics such as the price-to-earnings ratio and discounted cash flow do not apply to the new way of doing business. In many cases the companies are in a pre-profit stage, as with iVillage, which lost $43.7 million on revenues of $15 million, or TheStreet.com, which reported 1998 losses of $16.3 million on revenues of $4.6 million (NYT, March 29, 1999). In other cases, sales depend on the rate of expansion of Internet access and use. Consequently, new methods are being devised to explain the phenomenon of unusually high Internet stock prices. Such methods are discussed in an article by Steve Harmon, "The Metrics for Evaluating Internet Companies" (http://www.internetnews.com/stocks/column/article/0,1087,41_71961,00.html), and are briefly analyzed in this section.

        For example, one innovative method measures market capitalization divided by number of page views. This ratio expresses the value investors assign to a company relative to how many people viewed its web pages; the assumption is that the more people view a company's web pages, the higher the value of that company. Similarly, a ratio of market capitalization per ad view assigns a higher value to active pages than to inactive pages.

        Another metric that has recently gained tremendous popularity among analysts is the market capitalization to revenue ratio. In some respects the market cap/revenue relationship is similar to the P/E ratio; however, analysts feel that because many Internet companies are in pre-profit stages, it may be more descriptive for the unique market for Internet stocks.

        Customer acquisition cost (the cost of acquiring a new subscriber or buyer) and revenue per subscriber may be more useful ways of valuing shares for companies such as ISPs, e-retailers, and auctioneers. Though these new ideas on valuation are gaining momentum in the Internet analyst community, it is also true that they evolved to address the need to value early-stage companies operating in a nascent and, not surprisingly, unstable market.
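        A sketch of these metrics, computed from invented operating figures for a hypothetical young Internet company:

    # Hypothetical operating figures; illustration only.
    market_cap = 5_000_000_000.0      # $5 billion
    monthly_page_views = 200_000_000
    annual_revenue = 150_000_000.0
    subscribers = 2_000_000
    marketing_spend = 40_000_000.0
    new_subscribers_acquired = 800_000

    cap_per_page_view = market_cap / monthly_page_views            # $25 per monthly view
    cap_to_revenue = market_cap / annual_revenue                   # about 33x revenue
    acquisition_cost = marketing_spend / new_subscribers_acquired  # $50 per new subscriber
    revenue_per_subscriber = annual_revenue / subscribers          # $75 per subscriber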
 
 

Is the Efficient Market Hypothesis Sound Footing for Capital Market Theory?

        Critics of the current paradigm for financial analysis point to flaws in the very hypothesis upon which capital market theory and valuation theory rest: the efficient market hypothesis. Since the 1960s, quantitative capital market theory, including statistical analysis, modern portfolio theory, and corporate valuation, has been predicated on the efficient market hypothesis (EMH). The EMH states that prices reflect "all that is knowable" (Fama, 1965). There are strong, semi-strong, and weak versions of the theory, based on the interpretation of "all that is knowable": the strong version includes all public and private information, the semi-strong version includes all public information (Peters, 19), and the weak version includes all past trading information, including price and volume data (Cornell, 38). Efficient markets are priced at a discount on the assumption that today's changes in price are caused only by today's unexpected news (Trippi). Past information does not affect market activity once it is generally known.

        Therefore, today's return is unrelated to yesterday's return, and if returns are independent, they occur randomly. Fundamental analysts and individual investors formulate value based on information that is available to all investors; in the aggregate, their independent estimates result in fair value, and so analysts and individual investors become the reason markets are efficient. The sheer fact that there is a market for securities suggests that when a stock's price falls, the opinion of the market is that the stock was overpriced. As we have seen, linear valuation methods based on the EMH overwhelmingly show that Internet stock prices are overvalued. However, under the EMH, prices cannot be "unusually high" in the sense we observe in the market for Internet stocks, because they always reflect current information. Linear appraisers would explain the gap between their opinion and that of the market as evidence of market inefficiencies that cannot be adjusted for with any certainty. Therefore, absent scientific evidence explaining large gaps between a linear appraiser's valuation and the market's value of a security (e.g., Internet stock prices), the conclusion is that the security is over- (or under-) valued by the market.

        The efficient market hypothesis is one of the foundations of valuation and modern portfolio theories, and it makes several oversimplifying assumptions in order to make free markets easier to model. First, the assumption that investors are rational and risk averse and will act on all information as it is received (in a linear fashion) assumes away many of the ways in which individuals actually react to information. Individuals also react to series of events: they may delay reacting until trends are in place and then react cumulatively to all the information that was earlier ignored, or they may not react to information in rational ways at all. Consider the gambler who, after losing, becomes increasingly risk seeking rather than risk averse. The assumption that investors react linearly implies that returns should be (approximately) normally distributed and independent (Peters). However, serial dependencies, such as inter-market or inter-stock dependencies and turn-of-the-year effects, are also characteristic of securities markets. The EMH casts these market realities aside because they complicate the linear models on which econometrics relies.
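        The independence assumption is directly testable. A minimal sketch: estimate the lag-one autocorrelation of a return series, which under the EMH should be near zero (synthetic returns stand in for real market data here):

    import random

    # Synthetic daily returns stand in for a real price history.
    random.seed(1)
    returns = [random.gauss(0.0005, 0.01) for _ in range(2500)]

    def lag_one_autocorrelation(xs):
        """Correlation between each return and the one before it."""
        mean = sum(xs) / len(xs)
        num = sum((xs[i] - mean) * (xs[i - 1] - mean) for i in range(1, len(xs)))
        den = sum((x - mean) ** 2 for x in xs)
        return num / den

    # Near zero for independent returns; the serial dependencies described
    # above would show up as a clearly nonzero value on real data.
    print(lag_one_autocorrelation(returns))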

        Second, the EMH assumes that unless there is an exogenous shock to the system, there is a natural balance between supply and demand. This rests on an assumption borrowed from Newtonian physics: that systems left alone will seek equilibrium (Peters). It should be noted that there is an important distinction between how an economist views equilibrium and how a natural scientist views it. To an economist, a system is at rest (in equilibrium) when there are no outside or exogenous influences; to a natural scientist, equilibrium represents the death of a system. If a species in an ecosystem is to survive, it must evolve.

        Capital market theory is based on assumptions of finite variance and normality, yet empirical evidence consistently shows that the theories are flawed. Variance is stable and finite for normal distributions, but in 1990 Turner and Weigel showed that daily returns of the S&P from 1928 to 1990 were not normally distributed: they were negatively skewed, with high concentrations around the mean interspersed with very large or very small returns. The seminal work on volatility is Shiller's, published in his book Market Volatility (1989). He examined the amount of volatility that should be expected in a rational market's framework, noting that a rational investor's valuation of a stock would be based on the expected dividends from owning it. His finding was that prices were too volatile to be explained by changes in expected dividends, even when adjusted for inflation. One explanation is that alongside value investors there are "noise" traders who trade according to trends and fashion (Peters, 1996).
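        Sample skewness and excess kurtosis make these departures from normality measurable; both statistics are approximately zero for a normal distribution. A sketch, again with synthetic data standing in for real returns:

    import random

    def skew_and_excess_kurtosis(xs):
        """Both statistics are ~0 for normally distributed data."""
        n = len(xs)
        mean = sum(xs) / n
        var = sum((x - mean) ** 2 for x in xs) / n
        skew = sum((x - mean) ** 3 for x in xs) / (n * var ** 1.5)
        kurt = sum((x - mean) ** 4 for x in xs) / (n * var ** 2) - 3.0
        return skew, kurt

    random.seed(2)
    normal_like = [random.gauss(0.0, 0.01) for _ in range(5000)]
    print(skew_and_excess_kurtosis(normal_like))   # both close to zero
    # Turner and Weigel's finding of negative skew and fat tails in real
    # daily returns corresponds to a negative first value and a positive
    # second value.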

        Moreover, theorists and appraisers who rely on the foundation of the EMH have no way to represent the market inefficiencies and inconsistencies that exist in free markets, what economists refer to as non-market behaviors, such as:

  1. Serial dependencies in returns, such as inter-market and inter-stock dependencies and turn-of-the-year effects.
  2. Excess volatility in stock returns relative to changes in expected dividends.
  3. Apparent patterns in stock prices and the mispricing of securities.

        Linear models based on the EMH will prove successful only to the extent that the systems they describe are linear. Non-market behaviors add dimensionality to the valuation problem that the modeling tools based on the EMH cannot handle.
 
 

Non-Linear Dynamic Systems Perspective

        The starting point for non-linear theorists is the limitations of Newtonian physics. Newtonian physics can explain with a single solution how two bodies interact; for problems with more than two bodies, however, there is no single solution. Even though free markets are evolving, multi-body systems, they have been treated as two-body problems in which many important dynamics of the system are assumed away. Dynamic systems analysis is the study of turbulence in non-linear dynamic systems. Applied to capital market theory, it is the study of non-linear, multi-body systems, such as free markets, moving from a state of stability to turbulence. The non-linear dynamic systems perspective views capital markets as systems of simultaneous equations in which the current value of each variable is a transformation of past values. This is radically different from saying that prices are set on the basis of new information alone. One very compelling discovery of non-linear dynamic systems has been that "the equilibrium level (attractor) of a non-linear system can be a fractal set within which the system's state can flit endlessly in a chaotic, seemingly random manner" (Trippi, 5). Equilibrium from a non-linear dynamics perspective can be described in at least two ways: (1) as the level at which all states of a system reach a single point or value (a point attractor), and (2) as a level with periodic cycles or orbits (a limit cycle).

        This expanded view of equilibrium accounts for market behaviors such as serial dependencies and excess volatility that traditional capital market theory defines as non-market behaviors.

        Dynamic modeling is concerned with systems in which the components can be correlated (Peters) but the relationships can be influenced by outside forces. Unlike linear systems, non-linear dynamic systems are feedback systems. Much like compound interest, what comes out (the price) goes back in (to the decision process for setting the new price), gets transformed (combined with new information), and comes back out (as a new price). The transformation is exponential, so any difference in initial values grows exponentially.
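        This sensitivity can be demonstrated with the logistic map, a standard one-line non-linear feedback system (used here purely as an illustration, not as a market model). Two trajectories that start a millionth apart become unrelated within fifty iterations:

    # Logistic map in its chaotic regime (r = 4): x' = r * x * (1 - x).
    r = 4.0
    x, y = 0.400000, 0.400001   # nearly identical initial states

    for _ in range(50):
        x = r * x * (1 - x)
        y = r * y * (1 - y)

    # After fifty feedback cycles the initial difference of 1e-6 has grown
    # to order one: the two trajectories are effectively unrelated.
    print(abs(x - y))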

        Capital market theory, for example, works with percent changes in prices (returns) rather than with prices themselves. Measures of percent change are not appropriate for linear regression when there are serial dependencies in prices, that is, when each price is related to the price before it. The EMH, again, says that today's prices reflect only today's unexpected news and are thus independent of all other prices: based on all known information the market collectively finds the equilibrium price, prices are set randomly, and the large number of investors ensures that prices are fair. In the natural sciences, by contrast, the object of study itself is analyzed rather than merely the rate of change of its state.

        Prices, as opposed to rates of change in prices, cannot be used in linear regressions, because the model relies on the assumption that prices are set randomly, reflecting only unexpected news. Prices that are not randomly set imply that the market is not efficient, and an inefficient market undermines the very foundation on which capital market and valuation theory rest. The application of non-linear dynamic systems analysis to financial analysis is still very new, and it is only at the beginning stages of finding new ways to model market dynamics and value. However, to the extent that it brings to our attention serious flaws in the hypothesis serving as the foundation of conventional capital market theory, it calls into question the analytical tools being applied to the valuation of Internet companies.
 
 

Value and the power of information

        During the last three years, with the rise of the Internet as a mass communication medium, we have witnessed the birth of many new businesses providing products and services through the Internet. Many economists even speak of a New Economy whose rules are somewhat different from those of the "Old Economy." In this New Economy it is important to consider the following factors:

  1. The rapid turnover of information and the rise of "smart markets."
  2. The value of the information a company holds about its customers.
  3. Network externalities and lock-in effects that favor a small number of winning platforms.

        The Internet has made possible the full realization of the information age and the rise of what some researchers call smart markets: markets with a rapid turnover of information. A company's ability to capitalize on the information it has about its customers is what will give the new Internet companies levels of return unimaginable in the industrial age.

        One statistic that could usefully be applied in valuing an Internet company is the lifetime value (LTV) of its customers, defined as how much the company will be able to sell to its customers thanks to its use of the information it holds about them. A company like Amazon.com is strategically positioned to have perhaps the highest customer LTV among Internet companies. Amazon.com carries a high stock valuation, with values for traditional financial metrics (such as market capitalization over revenues or market capitalization over total assets) that seem absurd, precisely because of the potential of the information it has about its customers.
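        A minimal sketch of a customer lifetime value calculation, with invented retention, margin, and discount figures:

    # Hypothetical per-customer figures; illustration only.
    annual_margin_per_customer = 30.0   # dollars of margin per year
    retention_rate = 0.80               # chance the customer is still active each year
    discount_rate = 0.12
    horizon_years = 10

    # Expected discounted margin from one customer over the horizon.
    lifetime_value = sum(
        annual_margin_per_customer * retention_rate ** t / (1 + discount_rate) ** t
        for t in range(1, horizon_years + 1)
    )
    print(round(lifetime_value, 2))
    # Better use of customer information (recommendations, lock-in) raises
    # the retention rate, and the lifetime value rises with it.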

        Amazon systematically collects data describing its customers' patterns of behavior, such as their searches for books, videos, or CDs. Based on this information, it can make recommendations by cross-referencing one customer's behavior with that of customers who behave similarly. This capability creates a lock-in effect that makes an Amazon customer immediately more valuable than a traditional bookstore customer, because the traditional bookstore has comparatively little memory of its customers. Amazon has another remarkable source of information about its clients: their credit card numbers. This source permits the company to trace essentially any purchase its customers make, and it will drive the company into new markets where the Amazon distribution model can be applied. The potential for growth is tremendous: if everyone is connected to the same information superhighway, the Internet, then over time a small number of global selling platforms will emerge as the major "virtual supermarkets" of the information age.

        An article in The Wall Street Journal of April 19, 1999 mentions that the level of profitability of Amazon.com or AOL at maturity would have to be about $4.5 billion to justify their valuations. Looking at the Fortune 500 ranking of the biggest American companies, we find in fourth position Wal-Mart, with profits of $4.4 billion. We feel comfortable supporting the thesis that if companies like Amazon or AOL can manage the lifetime value of their customers, they will be able to deliver such a performance. At the beginning of its history, no one understood that Wal-Mart was building the shopping and reselling platform of the last part of this century. Today, many do not understand that companies like Amazon, AOL, and Yahoo are building the core transaction platforms for the economy of the information age, the economy of the next millennium. The market for these transaction platforms will be global, and network externalities and lock-in effects will give the winning platforms the biggest share of the market.

        The book industry alone had revenues in 1997 of $50 billion, the music and video industries had revenues of $200 billion, and all other retail sectors had revenues of $1 trillion. Assuming that a company like Amazon.com could capture 75 percent of this market while discounting up to 50 percent off today's prices, the calculation takes the potential revenues of Amazon to $468.75 billion.
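        The arithmetic behind that figure:

    # Revenue figures from the text, in $ billions.
    books = 50.0
    music_and_video = 200.0
    other_retail = 1000.0
    addressable_market = books + music_and_video + other_retail  # $1.25 trillion

    captured_share = 0.75    # assumed share of the market captured
    price_discount = 0.50    # assumed discount off today's prices

    potential_revenue = addressable_market * captured_share * (1 - price_discount)
    print(potential_revenue)   # 468.75, i.e. $468.75 billion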

        These results would be more than enough to justify the high levels of valuations that Internet stocks have today.
 
 

Conclusions

        Non-linear dynamics has shown us that the conditions necessary to justify the use of traditional capital market and valuation theories have never truly existed, calling into question the theories used for valuation and for predicting market behavior. Although the field of non-linear modeling of capital markets is still young enough that the tools it offers are not yet widely used, it has raised some important questions. A hybrid approach to discounted cash flow analysis, in which the new rules of the information economy suggest new inputs to the more accepted systems of financial analysis, may be the most appropriate way to determine value.

        Our analysis showed how Internet stocks appear overvalued when traditional cash flow analysis is applied to Internet companies. We also showed that it is possible to reconcile this traditional cash flow analysis with the high valuation of Internet stocks by recognizing that some of the new Internet companies will achieve levels of growth unimaginable for traditional companies.

        We believe that, in general, Internet stocks are overvalued, with some specific exceptions. What is happening in the market is that the high potential of a few leaders drives the valuation of most Internet companies. At some point in the future the market will start distinguishing among these companies, and many that enjoy high returns today will suffer significant losses as the market separates the good from the bad.
 
 

References:

Chorafas, Dimitris N. Chaos Theory in the Financial Markets. Irwin Professional Publishing, 1994.

Cornell, Bradford. Corporate Valuation: Tools for Effective Appraisal and Decision Making. Irwin Professional Publishing, 1993.

Fama, E.F., and Miller, M.H. The Theory of Finance. Holt, Rinehart and Winston, 1972.

Feigenbaum, M.J. "Universal Behavior in Nonlinear Systems." Physica 7D, 1983.

Lorenz, H. "International Trade and the Possible Occurrence of Chaos." Economic Letters 23, 1987.

New York Times. "For Silicon Alley Companies, a Time to Rise and Shine." March 29, 1999.

Peters, Edgar. Chaos and Order in the Capital Markets, Second Edition. John Wiley and Sons, 1996.

Ruelle, D. Chaotic Evolution and Strange Attractors. Cambridge University Press, 1989.

Sharpe, W.F. Portfolio Theory and Capital Markets. McGraw-Hill, 1970.

Shiller, R.J. Market Volatility. MIT Press, 1989.

Trippi, Robert R. Chaos and Nonlinear Dynamics in the Financial Markets. Irwin Professional Publishing, 1995.

Walmsley, Julian. The New Financial Instruments. John Wiley and Sons, 1988.
 

Appendix 1: Pseudo Discounted Cash Flow applied to Internet stocks