What principles of governance does spectrum policy need?

What lessons can be learned for spectrum policy from the management of other natural resources? Here, an expert on resource management says good governance depends on a transparent, rules-based approach that will minimise regulatory uncertainty. This stability is key to encouraging the necessary investment in networks.


Articles from Tragedies of the Gridlock Economy Conference

The Information Economy Project is proud to present articles that have been published in the Arizona Law Review, Volume 53, from the Tragedies of the Gridlock Economy: How Mis-Configuring Property Rights Stymies Social Efficiency conference held on October 2, 2009:

Symposium — Tragedies of the Gridlock Economy: How Mis-Configuring Property Rights Stymies Social Efficiency

Volume 53, Issue 1, Arizona Law Review

Transaction Cost and the Organization of Ownership—An Introduction
Harold Demsetz | 53 Ariz. L. Rev. 1 (2011) | PDF

Exclusion and Exclusivity in Gridlock
Eric R. Claeys | 53 Ariz. L. Rev. 9 (2011) | PDF

Heller’s Gridlock Economy in Perspective: Why There Is Too Little, Not Too Much Private Property
Richard A. Epstein | 53 Ariz. L. Rev. 51 (2011) | PDF

Tragedy TV: Rights Fragmentation and the Junk Band Problem
Thomas W. Hazlett | 53 Ariz. L. Rev. 83 (2011) | PDF

Google Book Search in the Gridlock Economy
Doug Lichtman | 53 Ariz. L. Rev. 131 (2011) | PDF

Autonomy and Independence: The Normative Face of Transaction Costs
Robert P. Merges | 53 Ariz. L. Rev. 145 (2011) | PDF

The Rise and Fall of the First American Patent Thicket: The Sewing Machine War of the 1850s
Adam Mossoff | 53 Ariz. L. Rev. 165 (2011) | PDF

The Wasteland: Anticommons, White Spaces, and the Fallacy of Spectrum
Kevin Werbach | 53 Ariz. L. Rev. 213 (2011) | PDF


Conference Articles and Abstracts:

“Heller’s Gridlock Economy In Perspective” by Richard A. Epstein, 53 Ariz. L. Rev. 51 (2011), October 2, 2009 (paper presented at the IEP Conference on the Gridlock Economy). “The topic of this conference is Michael Heller’s provocative new book on The Gridlock Economy.1 The central thesis of the book is that one critical obstacle to overall social advancement is the fragmentation of property among private owners that prevents its coherent assembly for projects that are desired by all but achievable by none. There is no question that, more than anyone else, Heller has put this topic on the map in its current form, chiefly through two earlier academic articles which have had immense influence on the field.2 The ability to introduce into the mature field of law and economics even a single new generative term, the anticommons on which Gridlock is based, is a major intellectual achievement…”

“Exclusion and Exclusivity in Gridlock,” by Eric R. Claeys, 53 Ariz. L. Rev. 9 (2011). “Michael Heller earned respect among property scholars in his 1998 article The Tragedy of the Anticommons: Property in the Transition from Marx to Markets. The conception of a “tragedy of the commons” had been popularized by Garrett Hardin in a 1968 article by that name. When ranchers have open access (a commons) to grass, their cattle tend to overeat it (the tragedy). Harold Demsetz provided the canonical economic response to tragedies of the commons: private property. Exclusive rights of control, use, and disposition (“exclusive possessory rights”) encourage owners to internalize externalities associated with the over-consumption of resources held in common…”

Spectrum Policy:

“Tragedy TV: Rights Fragmentation and the Junk Band Problem” by Thomas W. Hazlett, 53 Ariz. L. Rev. 83 (2011), October 2, 2009 (paper presented at the IEP Conference on the Gridlock Economy). “Tragedy of the anti-commons occurs when property rules fail to enable efficient social coordination. In radio spectrum, rights issued to airwave users have traditionally been severely truncated, leaving gains from trade unexploited. The social losses that Ronald Coase (1959) asserted, appealing to basic theories of resource allocation, are now revealed via intense under-utilization of the TV Band…”

“The Wasteland: Anticommons, White Spaces, and the Fallacy of Spectrum” by Kevin Werbach, 53 Ariz. L. Rev. 213 (2011), October 2, 2009 (paper presented at the IEP Conference on the Gridlock Economy). “I urge you, I urge you to put the people’s airwaves to the service of the people and the cause of freedom. You must help prepare a generation for great decisions. You must help a great nation fulfill its future. Do this! I pledge you our help.”1 Federal Communications Commission (FCC) Chairman Newton Minow’s 1961 address to the National Association of Broadcasters is legendary for its caustic dismissal of television as a “vast wasteland.”2 Yet Minow intended to emphasize a different two-word phrase: “public interest.”3 Television was the most prominent use of “the people’s airwaves” — the government-defined capacity for wireless communication — and it was failing to serve national interests.4…

Google Book Search:

“Google Book Search in the Gridlock Economy” by Doug Lichtman, 53 Ariz. L. Rev. 131 (2011), October 2, 2009 (paper presented at the IEP Conference on the Gridlock Economy). “Michael Heller’s Gridlock Economy popularizes a concept that Heller has developed over nearly two decades of influential academic writing: the notion that, when it comes to property rights, too many rights-endowed cooks really can spoil the broth. I was asked in this conference to apply Heller’s insight to the Google Book Search project, and the request at first seemed natural. Heller himself has suggested that Google Book Search might be an apt poster child for the gridlock phenomenon; and Google likewise can often be heard to complain, in Heller-esque tones, that the only way to build a comprehensive search engine for books is to take the books without asking….”

“Autonomy and Independence: The Normative Face of Transaction Costs” by Robert P. Merges, 53 Ariz. L. Rev. 145 (2011), October 2, 2009 (paper presented at the IEP Conference on the Gridlock Economy). “Anticommons theory made a splash, and is today being expanded and applied, because it shifted our collective attention in a crucial way. Before the 1990s, the big policy questions in IP were all about individual IP rights: when should a copyright or patent be granted, when denied? Anticommons theory burst into this conventional conversation like an unruly drunk at a ballet recital. It demanded attention. It said, in effect, “you may mean well, but you’re missing the big point. You’re wasting your time!” The big point is not the individual grant of an IP right. It’s the aggregate impact of granting many rights to many discrete and independent right-holders…”

Luncheon Keynote:

“On Being Misled by Transaction Cost Economics: Externalities, Commons, and Gridlocks” by Harold Demsetz, 53 Ariz. L. Rev. 1 (2011), October 2, 2009 (paper presented at the IEP Conference on the Gridlock Economy). “During the last half-century transaction cost became a prominent consideration in discussions about externalities and ownership arrangements. The author of this essay contributed to this development in the earlier part of this half-century but has since come to doubt the importance of transaction cost and even the roles it is thought to play in these two areas of economic thought. A succinct statement of this doubt as it pertains to the externality problem is a primary task of this essay. The last part of the essay questions the dominant position given to transaction cost in discussions of ownership forms that now go by the names of commons, anti-commons, and gridlocks…”

Patent Reform:

“The Rise and Fall of the First American Patent Thicket: The Sewing Machine War of the 1850s” by Adam Mossoff, 53 Ariz. L. Rev. 165 (2011), March 2010. “After Professor Michael Heller proposed that excessively fragmented property rights in land can frustrate its commercial development, patent scholars have debated vigorously whether Heller’s anticommons theory applies to property rights in inventions. Do these “patent thickets” exist, and if so, what are the best solutions? This article contributes to this debate by analyzing the rise and fall of the first American patent thicket: the “Sewing Machine War” of the 1850s…”



Articles from Markets, Firms and Property Rights: A Celebration of the Research of Ronald Coase

The Information Economy Project is proud to present articles that will be published in a special joint issue of the Journal of Law & Economics and the Journal of Legal Studies, from the Markets, Firms and Property Rights: A Celebration of the Research of Ronald Coase conference held on December 4-5, 2009:

Conference Articles:

Friday Sessions

“The Effect of Allowance Allocation” by Robert W. Hahn & Robert N. Stavins.  We begin with “The Problem of Social Cost” (1960) … The Coase Theorem: Bilateral negotiation between the generator and recipient of an externality leads to the same efficient outcome regardless of the initial assignment of property rights (if no transaction costs, income effects, or third-party impacts).

“Coase, Transaction Costs, and the Spread of the Rectangular Survey for Land Demarcation within the British Empire” by Gary D. Libecap, Dean Lueck, Trevor O’Grady.  This paper examines adoption of the rectangular system (RS) of land demarcation within European settlement colonies of the British Empire in the 18th and 19th centuries. This was a time when agricultural land markets were first developing on a widespread scale. These jurisdictions had similar immigrant populations and legal structures, but their land demarcation practices were quite different.

“Coase and the New Zealand Spectrum Reforms” by Charles L. Jackson. In 1989, New Zealand’s Parliament enacted a new statute, the Radiocommunications Act 1989, that explicitly used a system of property rights to regulate the use of the radio spectrum. This statute resulted in the first ever spectrum auctions – and New Zealand’s use of auctions has been copied around the globe. New Zealand’s adoption of a property rights regime, a more fundamental change than the introduction of spectrum auctions, has not had the same wide acceptance.

“Radio Spectrum and the Disruptive Clarity of Ronald Coase” by Thomas W. Hazlett, David Porter, Vernon Smith. In “The Federal Communications Commission” (1959), Ronald Coase exposed deep foundations via normative argument buttressed by astute historical observation. The government controlled scarce frequencies, issuing sharply limited use rights. Spillovers were said to be otherwise endemic. Coase saw that Government limited conflicts by restricting uses; property owners perform an analogous function via the “price system.” The government solution was inefficient unless the net benefits of the alternative property regime were lower.

“Why the Entry Regulation of the China Mobile Phone Manufacturing Industry Collapsed” by Zhimin Liao, Xiaofang Chen. This case study aims to explore an interesting puzzle: why the license regulation in China’s mobile phone production industry, which generated large rents for a once-powerful interest group, was suddenly eliminated.

“How to Keep a Secret: The Decisive Advantage of Corporations” by Robert Cooter.  In the 1950s socialists around the world built gigantic steel plants like Nowa Huta in Poland. By the 1980s they were losing vast amounts of money and they seemed destined to die a slow death by rust. Lakshmi Mittal, who led the international operation of an Indian steel business built by his father, believed that these industrial dinosaurs could flourish in the age of mammals.

“Regulatory Institutions and Economic Performance: Wireless Communications in Middle-Income Developing Nations” by Roger Noll.  Wireless success story: over 4 billion wireless subscribers worldwide in 2009 (compared to 1.3 billion wire lines); wireless penetration in developing nations around 50% of the population, and over 100% in some middle-income nations (several higher than the US at 90%). Generally good policies in nations not noted for good economic policies: mostly privatized; multiple firms (3+ in most nations); foreign ownership permitted; narrow, targeted regulation; transferable licenses.

Saturday Sessions

“Competence as a Random Variable: One More Tribute to Ronald Coase” by Richard A. Epstein. The work of Ronald Coase is notable for how it introduces the notion of transactions costs to explain both the creation and maintenance of firms and for understanding the larger question of social costs. Nonetheless, it seems improbable that positive transaction costs are the only explanation as to why and how firms are organized.

“R.H. Coase and the Neoclassical Model of the Economic System” by Harold Demsetz. It is clear from articles I have written for the New Palgrave Dictionary of Law and Economics and other publications that I have high regard for Coase and his works. Some would say I have published parts of his works more times than has he. True or not, my role in explaining, defending, and extending Ronald’s writings has left me with little to say that is different from what I have already written, so my theme today is not a product of conscious deliberation.

“Measuring Coase’s Influence” by William M. Landes and Sonia Lahr-Pastor. Citations measure a scholar’s influence. That Ronald Coase is among the most influential and best-cited economists in the past fifty years is not debatable. Two of his articles, “The Nature of the Firm”, published in 1937, and “The Problem of Social Cost”, published in 1960, are among the most-cited articles in both economics and law and continue to be widely cited.

“Regulation and the Nature of the Firm: The Case of U.S. Regional Airlines” by Michael E. Levine. The organization of airline networks, and particularly of the interaction between the less dense parts of the network with the more dense parts, is a particularly good example of the operation of two of Professor Coase’s main points in “The Nature of the Firm” and subsequent articles: first, that the choice of institutions chosen to organize production is a function of economic circumstances, including regulation, technology and contractual arrangements inside the firm and second, that there is no general outcome that economic theory predicts, but rather that the result always depends on the particular circumstances and choices available and that it will change as circumstances change.

“Commercial Advertising and the First Amendment” by Geoffrey R. Stone. In his path-breaking 1977 article, Advertising and Free Speech, Ronald Coase challenged the conventional wisdom in an important area of First Amendment law. What especially interested Coase was the sharp divergence between the profound commitment to the free market in the realm of speech and the lack of confidence in the free market in the realm of goods and services. Invoking Justice Holmes’s claim that “the best test of truth is the power of the thought to get itself accepted in the competition of the market,” Coase noted that First Amendment doctrine is largely premised on “an extreme faith in the efficiency of competitive markets and a profound distrust of government regulation.” But in the realm of “goods and services,” the same “intellectual community” that celebrates the marketplace of ideas demands ever-more extensive government regulation. Coase suggested that this disparity “calls for an explanation,” but lamented that such an explanation “is not easy to find.”

“Keynes and Coase” by Richard A. Posner.  I am sure that Ronald will not like my bracketing him with Keynes, as I am about to do. But if he is patient, he will hear me modify criticisms of his approach to economics that I made in an essay I wrote many years ago – sixteen to be exact – for the Journal of Economic Perspectives.



Articles from The Genesis of Unlicensed Wireless Policy: How Spread Spectrum Devices Won Access to License-Exempt Bandwidth

The Information Economy Project is proud to present articles that have been published in INFO, Volume 11, Issue 5 (Special Issue, August 2009), from the Genesis of Unlicensed Wireless Policy: How Spread Spectrum Devices Won Access to License-Exempt Bandwidth Conference held on April 4, 2008:


Unlicensed Wireless Policy Conference: Guest Editorial, by Charles L. Jackson, 5 INFO (August 2009)  Unlicensed wireless has become an industry, with hundreds of millions of radios in use today. These devices range from short-range wireless computer keyboards to microwave links with ranges of several miles. Among the most well known are wireless local area networks (WLANs) often referred to as WiFi or 802.11.  This special issue of info presents a collection of papers presented at a George Mason University Law School Conference on “The evolution of unlicensed wireless policy: how spread spectrum devices won access to license-exempt bandwidth” on 4 April 2008. The conference, organized by GMU Law School’s Information Economy Project, reviewed the development of unlicensed wireless policy in the US with the goal of assisting scholars in understanding how current unlicensed policies came into being. It looked at the interplay between regulation and innovation and examined policy initiatives from industry and from inside the government. It also reviewed technological and market responses to changes in regulation.

Unlicensed to Kill: A Brief History of the FCC Part 15 Rules, by Kenneth R. Carter, 5 INFO 8-18 (August 2009)  The Information Economy Project congratulates Kenneth R. Carter, whose paper from the April 2008 IEP Conference, Unleashing Unlicensed, has been awarded the Best Paper of 2009 by the multi-disciplinary journal info. Mr. Carter’s paper, “Unlicensed to Kill: A Brief History of the Part 15 Rules,” was published in Volume 11, No. 5 of info, along with the other outstanding articles produced by the scholars and experts who contributed to our highly successful conference at George Mason University, organized by Dr. Charles Jackson.  One would think that a paper on the history of unlicensed spectrum ought to be very short. For one, except for a very minor section of the Federal Communications Commission’s Part 15 rules, there is no such thing as “unlicensed spectrum”. Rather, the FCC’s Part 15 rules permit radio operation on a sufferance basis in broad swaths of spectrum that are not allocated specifically to unlicensed use. Second, when compared to other communications policies, the history of the unlicensed rules is rather brief. In the five decades between the establishment of the rules in 1938 and their major revision in 1989, the FCC issued only a handful of proceedings on the issue. The commission’s actions on the subject began to accelerate in the early 1990s.  While the unlicensed rules may lack a glorious and romantic past, unlicensed operation holds great interest for spectrum policy wonks, as well as rich issues for the spectrum policy debate. With increasing intensity over the last decade, proponents and opponents in this debate have held forth unlicensed operation as either pariah or paradigm. Having participated in this debate at numerous conferences and events, it seems to me that the following syllogism describes the view of spectrum policy researchers toward unlicensed operation:
namely, that unlicensed operation is for economists what the bumblebee is for aeronautical engineers. As the legend goes, according to aerodynamic theory the bumblebee’s wings are too short for its body, and thus it should not be able to fly. And yet it does.

Wi-Fi and Bluetooth: The Path from Carter and Reagan-era Faith in Deregulation to Widespread Products Impacting Our World, by Michael J. Marcus, 5 INFO 19-35 (August 2009)  On May 9, 1985 the Federal Communications Commission (FCC), in a meeting that attracted little attention outside the few companies that lobby the agency, adopted a set of rules dealing with the esoteric topic of spread spectrum modulation. But like a seed planted in the ground, these rules resulted in the germination of new classes of products that ultimately had both significant economic impact and an impact on the daily lives of many people. This decision did not start as an attempt to bring specific products to market, but as part of a program to remove anachronistic technical regulations and allow a free market in innovative technology, subject only to responsible interference limits.

History of Wireless Local Area Networks (WLANs) in the Unlicensed Bands, by Kevin Negus & Al Petrick, 5 INFO 35-56 (August 2009) The wireless local area network (WLAN) is today a ubiquitous device often taken for granted as a default interface for networked devices by users and manufacturers alike. But not very long ago, it was most definitely not so. Rewind the clock ten years back to 1998 and not only are there bitter technical and business consortia differences on WLAN approaches, but there is extreme skepticism and variation in opinion as to how, or even if, WLANs can ever become a mainstream network interface. The WLAN of that day appeared to lack both the throughput of the wired local area network (such as 10/100 Ethernet LAN) and the coverage of the cellular network (which was supposed to be “imminently” upgrading to Mb/s data performance). The WLAN to that point had largely evolved as a slow and unreliable emulation of the wired LAN, only without the wire. And as such the products and standards largely envisioned the end application for WLAN as a replacement for wired LAN in enterprise or campus environments where mobile users would roam with their networked personal computers (PCs).

License-Exempt: The Emergence of Wi-Fi, by Ing Victor Hayes & Ir. Wolter Lemstra, 5 INFO 57-71 (August 2009)  The development of Wi-Fi was triggered in 1985 by the US Federal Communications Commission (FCC)[1] when it opened the 915 MHz, 2.4 GHz, and 5.8 GHz bands designated for industrial, scientific and medical (ISM) applications for use by radio systems, under the condition that spread spectrum techniques would be used (FCC, 1985). Interestingly, the 1980 MITRE report that investigated the potential benefits, costs, and risks of spread spectrum communications on behalf of the FCC did not identify a strong requirement or need from the industry to assign radio frequency (RF) spectrum for spread spectrum based applications. The report concludes that spread spectrum technology is inherently more complex and thus more costly (Mitre Corp., 1980).

Grazing on the Commons: The Emergence of Part 15, by Henry Goldberg, 5 INFO 72-75 (August 2009) What follows is a somewhat impressionistic, highly biased[1] account of how unlicensed radio services moved from being a by-product of the ISM bands to a deliberate spectrum allocation, with clearly defined goals and objectives that could be achieved only by not subjecting the spectrum to licensing or auctions. Like sin itself, the deliberate un-licensing of spectrum began with an Apple. In early 1991, Apple Computer was developing the Newton as the first PDA (Apple invented the term) and was pioneering in the laptop segment of the computer market. Apple believed that wireless connectivity was essential to the success of both products[2].  Accordingly, Apple petitioned the FCC to allocate 40 MHz of spectrum – 1,850-1,890 MHz – out of the 1,850-1,990 MHz band being earmarked for new technologies, particularly PCS. Apple called its proposed new radio service Data-PCS and proposed that it would be devoted exclusively to local area, high speed data communications to support collaborative computing and spontaneous networking among laptops and PDAs. Data-PCS would, in the words of the petition…

Unleashing Innovation: Making the FCC User-Friendly, by Stephen J. Lukasik, 5 INFO 76-85 (August 2009)  There is a large literature on the issue of regulation and technological innovation from the varied perspectives of history, politics, economics, law, finance, and engineering. To attempt to add something meaningful to this rich body of writings is challenging. My only qualification is that of a participant for a short but critical period.  When I found myself, on May 1, 1979, the Chief Scientist of the Federal Communications Commission, twenty-three years after receiving my doctorate from MIT, my training said to decide what the most important problems were that needed fixing and to proceed by whatever promising means suggested themselves to fix them. My technical background was eclectic, the result of broad interests and perhaps a bit of impatience, but quite devoid of experience with the theory or practice of regulation. To understand what happened next on the technology and communication policy side of the FCC, it may be useful to look further into my improbable presence.

Has “Unlicensed” in Part 15 Worked? A Case Study, by Tim Pozar, 5 INFO 86-91 (August 2009)  The Federal Communications Commission established the provisions for unlicensed operation of intentional radiators, or transmitters, for communications in what were called the industrial, scientific and medical (ISM) bands. This was a significant change in mindset for the FCC, and this case study is meant to show an example of how unlicensed devices have contributed to the community “good”.  The internet became a major economic entity and an essential tool for commerce in the mid to late 1990s. With that, the digital divide was identified as a significant issue by 1996[1]. Typically the digital divide has been the result of the cost of the equipment needed to use the internet, such as computers, as well as the cost or lack of access in connecting to the Internet. Many efforts by local community groups and governments have been made to attack the issue, but one problem that they all encountered was addressing the “last mile” to connect the disenfranchised.



Articles from the Merger Analysis in High Technology Markets Conference

The Information Economy Project is proud to present articles that have been published in the Journal of Competition Law & Economics from the Merger Analysis in High Technology Markets Conference held on February 2, 2008:

Technological Change and Merger Policy’s Third Era, by Howard Shelanski (Feb. 1 2008). Excerpt: Changes in Merger Policy Over the Last Century. Evolutionary Changes: Antimonopoly Era (1904-1973), Consumer Welfare Era (1973-2004), Dynamic Efficiency Era (2004-). Cyclical Changes: Merger review has varied in the scope of its objectives: from narrow anti-bigness => broader balance of efficiency and small-business protection => narrow consumer welfare focus => broader balance of static efficiency and innovation.

Market Definition in Online Markets, by Michael Baye, Journal of Competition Law & Economics, 4(3), 639–653 (Sept. 2008). Excerpt: Although the basic principles used to define a relevant market or to analyze unilateral competitive effects in traditional retail settings also apply in online retail markets, several features of the online environment add complexities to the analysis. This paper examines some of the results in the economics and marketing literatures that can influence market definition and competitive effects analysis in online retail settings. I argue that a failure to account properly for certain aspects of online markets can lead to erroneous definitions of the relevant market and, more importantly, erroneous conclusions regarding the unilateral competitive effects of horizontal mergers.

Sky Wars: The Attempted Merger of Dish/DirecTV, by Richard Gilbert (Feb. 1 2008). Excerpt: A High Tech Merger? Relatively new product: High Power Direct Broadcast Satellite TV. DirecTV launched 1994. EchoStar/Dish launched 1996. Large claimed efficiencies. Platform issues. Incompatible encryption formats. Dynamic platform competition. Installed base pricing incentives.

Defining the Relevant Product Market for the Google-DoubleClick Merger, by Hal Singer & Robert W. Hahn (Feb. 1 2008). Excerpt: Industry Background: In 2007, U.S. advertisers were expected for the first time to spend more on online advertising than on radio advertising. Source: eMarketer. U.S. online advertising revenues in 2007 were roughly $17 billion, an increase of 35 percent over 2005 revenues. Source: Interactive Advertising Bureau.

Nice Theory, But Where’s the Evidence?: The Use of Economic Evidence to Evaluate Vertical and Conglomerate Mergers in the U.S. and E.U., by Mary T. Coleman (Feb. 1 2008). Excerpt: Overview: Brief description of primary vertical theories of potential competitive concern from a merger. Input foreclosure. Customer foreclosure. Elements for a vertical theory to be plausible. Ability to foreclose. Incentive to foreclose. Foreclosure is likely to harm competition. Efficiencies do not offset. Evidence related to each element.

Horizontal Mergers Among IP Licensors and IP Licensees, by Luke Froeb (Feb. 1 2008). Excerpt: Joint Work: Mike Shor, Steven Tschantz. Disclaimer: Exploratory Analysis. Outline: Motivation: merger analysis. Question 1: Are horizontal merger effects affected by upstream/downstream vertical relationships? Question 2: What Happens when you ignore upstream and/or downstream vertical relationships?

Are ‘Online Markets’ Real and Relevant? From Monster/Hotjobs to Google/DoubleClick, by Bruce D. Abramson, Journal of Competition Law & Economics (Feb. 1 2008). Excerpt: Key Conclusions: As the novelty of the Internet wears off, on-line merger analysis looks increasingly like off-line merger analysis. Most of the things that make interesting on-line mergers interesting have little to do with competition. A Blast from (My) Past: During the summer of 2001, HotJobs retained my services to support its proposed acquisition by Monster.com. One of the first “major” mergers of Internet “pure plays.” Basic points of interest stemmed from a shift in understanding of Internet economics between 2000 (documents) and 2001 (facts). See, From Investor Fantasy to Regulatory Nightmare: Bad Network Economics and the Internet’s Inevitable Monopolists, 16 Harv. J. L. Tech. 159 (2002).

Antitrust in Orbit: Some Dynamics of Horizontal Merger Analysis in General and with Respect to XM-Sirius, by Thomas W. Hazlett, Journal of Competition Law & Economics, 4(3), 753–773 (Sept. 2008). Excerpt: Horizontal merger evaluation is heavily reliant on market definition. An SSNIP framework formats the analysis, and demand elasticity evidence used to apply the test is often sparse, as is often found in high-technology industries. This paper examines other sources of evidence that reveal the dynamics of market structure, data that are also probative in the evaluation of competitive effects. These sources include capital valuations of firms, financial event studies, and the public positions taken with respect to the merger by interested parties. Such evidence is examined in the XM–Sirius merger (2007–08) and shown—in two of the three instances—to be relatively informative in merger welfare analysis.

Evaluating Market Power with Two-Sided Demand and Preemptive Offers to Dissipate Monopoly Rent: Lessons for High-Technology Industries from the Proposed Merger of XM and Sirius Satellite Radio, by J. Greg Sidak and Hal J. Singer, Journal of Competition Law & Economics, 4(3), 697–751 (Sept. 2008). Excerpt: Can the standard merger analysis of the Department of Justice’s and Federal Trade Commission’s Horizontal Merger Guidelines accommodate mergers in high-technology industries? In its April 2007 report to Congress, the Antitrust Modernization Commission (AMC) answered that question in the affirmative. Still, some antitrust lawyers and economists advocate exceptions to the rules for particular transactions. In the proposed XM–Sirius merger, for example, proponents argue that the Merger Guidelines be relaxed to accommodate their transaction because satellite radio is a nascent, high-technology industry characterized by “dynamic demand.”


Articles from The Crisis in Public Safety Communications Conference

The Information Economy Project is proud to present articles that have been published in the Federal Communications Law Journal (March 2007), from the Crisis in Public Safety Communications conference held on December 8, 2006:


Sending Out an S.O.S.: Public Safety Communications Interoperability as a Collective Action Problem, by Jerry Brito, 59 Federal Communications Law Journal 457-92 (2007). Excerpt: On September 11, 2001, officers from the New York City police and fire departments responded to the attacks on the World Trade Center. That morning, police and firefighters entered each of the Twin Towers in an effort to help those inside. Shortly after the South Tower collapsed, an officer in a police helicopter hovering over the scene radioed to his colleagues, “About 15 floors down from the top, it looks like it’s glowing red. It’s inevitable.”1 Then another police pilot reported, “I don’t think this has too much longer to go. I would evacuate all people within the area of that second building.”2

Solving the Interoperability Problem: Are We On the Same Channel? An Essay on the Problems and Prospects for Public Safety Radio, by Gerald R. Faulhaber, 59 Federal Communications Law Journal 493-516 (2007). Excerpt: Public safety radio communication provides the essential link by which fire, police, emergency medical services (“EMS”), and other emergency personnel respond to life- and property-threatening situations. Communications enables the situational awareness, command, and operational control without which the response of multiple agencies to an emergency is less than useless. Key to this communications capability is interoperability: the capability of first responders from different agencies to communicate during emergencies.

Fundamental Reform in Public Safety Communications Policy, by Jon M. Peha, 59 Federal Communications Law Journal 517-46 (2007). Excerpt: All across the country, there have been failures in the communications systems used by first responders, such as firefighters, police, paramedics, and the National Guard. These failures can cost lives in emergencies both large and small. This problem has gained particular attention in the tragic aftermaths of the 9/11 attacks1 and Hurricane Katrina,2 when inadequacies in the current system were particularly obvious, but attention has not yet translated to significant progress.

Communicating During Emergencies: Toward Interoperability and Effective Information Management, by Philip J. Weiser, 59 Federal Communications Law Journal 547-74 (2007). Excerpt: The crisis of communications on 9/11 and in the aftermath of Hurricane Katrina underscores that emergency responders are largely ill-equipped to communicate effectively in times of disaster as well as in day-to-day emergency situations that require the coordination of several different public safety agencies. The reason for this state of affairs is that public safety agencies traditionally have made individualized decisions about information and communications technology,1 generally failing to purchase state-of-the-art technology that operates effectively and interoperates with others involved in emergency response.
