Category Archives: Research Presented at IEP Events

Articles from The Genesis of Unlicensed Wireless Policy: How Spread Spectrum Devices Won Access to License-Exempt Bandwidth


The Information Economy Project is proud to present articles published in info, Volume 11, Issue 5 (Special Issue, August 2009), from the Genesis of Unlicensed Wireless Policy: How Spread Spectrum Devices Won Access to License-Exempt Bandwidth Conference held on April 4, 2008:

 

Unlicensed Wireless Policy Conference: Guest Editorial, by Charles L. Jackson, 5 INFO (August 2009)  Unlicensed wireless has become an industry, with hundreds of millions of radios in use today. These devices range from short-range wireless computer keyboards to microwave links with ranges of several miles. Among the best known are wireless local area networks (WLANs), often referred to as WiFi or 802.11.  This special issue of info presents a collection of papers delivered at a George Mason University Law School conference, “The genesis of unlicensed wireless policy: how spread spectrum devices won access to license-exempt bandwidth,” held on 4 April 2008. The conference, organized by GMU Law School’s Information Economy Project, reviewed the development of unlicensed wireless policy in the US with the goal of helping scholars understand how current unlicensed policies came into being. It looked at the interplay between regulation and innovation and examined policy initiatives from industry and from inside the government. It also reviewed technological and market responses to changes in regulation.

Unlicensed to Kill: A Brief History of the FCC Part 15 Rules, by Kenneth R. Carter, 5 INFO 8-18 (August 2009)  The Information Economy Project congratulates Kenneth R. Carter, whose paper from the April 2008 IEP Conference, Unleashing Unlicensed, was awarded Best Paper of 2009 by the multi-disciplinary journal info.  Mr. Carter’s paper, “Unlicensed to Kill: A Brief History of the Part 15 Rules,” was published in Volume 11, No. 5 of info, along with the other outstanding articles produced by the scholars and experts who contributed to our highly successful conference at George Mason University, organized by Dr. Charles Jackson.  One would think that a paper on the history of unlicensed spectrum ought to be very short. For one, except for a very minor section of the Federal Communications Commission’s Part 15 rules, there is no such thing as “unlicensed spectrum”. Rather, the FCC’s Part 15 rules permit radio operation, on a sufferance basis, in broad swaths of spectrum that are not allocated specifically to unlicensed use. Second, when compared to other communications policies, the history of the unlicensed rules is rather brief. In the five decades between the establishment of the rules in 1938 and their major revision in 1989, the FCC issued only a handful of proceedings on the issue. The Commission’s actions on the subject began to accelerate in the early 1990s.  While the unlicensed rules may lack a glorious and romantic past, unlicensed operation holds great interest for spectrum policy wonks as well as rich issues for the spectrum policy debate. With increasing intensity over the last decade, proponents and opponents in this debate have held forth unlicensed operation as either pariah or paradigm. Having participated in this debate at numerous conferences and events, it seems to me that the following syllogism describes the view of spectrum policy researchers toward unlicensed operation: unlicensed operation is for economists what the bumblebee is for aeronautical engineers. As the legend goes, according to aerodynamic theory, the bumblebee’s wings are too short for its body and thus it should not be able to fly. And yet it does.

Wi-Fi and Bluetooth: The Path from Carter and Reagan-era Faith in Deregulation to Widespread Products Impacting Our World, by Michael J. Marcus, 5 INFO 19-35 (August 2009)  On May 9, 1985, the Federal Communications Commission (FCC), in a meeting that attracted little attention outside the few companies that lobby the agency, adopted a set of rules dealing with the esoteric topic of spread spectrum modulation. But like a seed planted in the ground, these rules resulted in the germination of new classes of products that ultimately had significant economic impact as well as an impact on the daily lives of many people. This decision did not start as an attempt to bring specific products to market, but as part of a program to remove anachronistic technical regulations and allow a free market in innovative technology, subject only to responsible interference limits.

History of Wireless Local Area Networks (WLANs) in the Unlicensed Bands, by Kevin Negus & Al Petrick, 5 INFO 35-56 (August 2009) The wireless local area network (WLAN) is today a ubiquitous device often taken for granted as a default interface for networked devices by users and manufacturers alike. But not very long ago, it was most definitely not so. Rewind the clock ten years back to 1998 and not only are there bitter technical and business consortia differences on WLAN approaches, but there is extreme skepticism and variation in opinion as to how, or even if, WLANs can ever become a mainstream network interface. The WLAN of that day appeared to lack both the throughput of the wired local area network (such as 10/100 Ethernet LAN) and the coverage of the cellular network (which was supposed to be “imminently” upgrading to Mb/s data performance). The WLAN to that point had largely evolved as a slow and unreliable emulation of the wired LAN, only without the wire. And as such the products and standards largely envisioned the end application for WLAN as a replacement for wired LAN in enterprise or campus environments where mobile users would roam with their networked personal computers (PCs).

License-Exempt: The Emergence of Wi-Fi, by Ing. Victor Hayes & Ir. Wolter Lemstra, 5 INFO 57-71 (August 2009)  In 1985, this development had been triggered by the US Federal Communications Commission (FCC)[1] when it opened the 915 MHz, 2.4 GHz, and 5.8 GHz bands, designated for industrial, scientific and medical (ISM) applications, for use by radio systems, under the condition that spread spectrum techniques would be used (FCC, 1985). Interestingly, the 1980 MITRE report that investigated the potential benefits, costs, and risks of spread spectrum communications on behalf of the FCC did not identify a strong requirement or need from industry to assign radio frequency (RF) spectrum for spread spectrum based applications. The report concludes that spread spectrum technology is inherently more complex and thus more costly (Mitre Corp., 1980).

Grazing on the Commons: The Emergence of Part 15, by Henry Goldberg, 5 INFO 72-75 (August 2009) What follows is a somewhat impressionistic, highly biased[1] account of how unlicensed radio services moved from being a by-product of the ISM bands to a deliberate spectrum allocation, with clearly defined goals and objectives that could be achieved only by not subjecting the spectrum to licensing or auctions. Like sin itself, the deliberate un-licensing of spectrum began with an Apple. In early 1991, Apple Computer was developing the Newton as the first PDA (Apple invented the term) and was pioneering in the laptop segment of the computer market. Apple believed that wireless connectivity was essential to the success of both products[2].  Accordingly, Apple petitioned the FCC to allocate 40 MHz of spectrum – 1,850-1,890 MHz – out of the 1,850-1,990 MHz band being earmarked for new technologies, particularly PCS. Apple called its proposed new radio service Data-PCS and proposed that it would be devoted exclusively to local area, high speed data communications to support collaborative computing and spontaneous networking among laptops and PDAs. Data-PCS would, in the words of the petition…

Unleashing Innovation: Making the FCC User-Friendly, by Stephen J. Lukasik, 5 INFO 76-85 (August 2009)  There is a large literature on the issue of regulation and technological innovation from the varied perspectives of history, politics, economics, law, finance, and engineering. To attempt to add something meaningful to this rich body of writings is challenging. My only qualification is that of a participant for a short but critical period.  When I found myself, on May 1, 1979, the Chief Scientist of the Federal Communications Commission, twenty-three years after receiving my doctorate from MIT, my training said to decide what the most important problems were that needed fixing and to proceed by whatever promising means suggested themselves to fix them. My technical background was eclectic, the result of broad interests and perhaps a bit of impatience, but quite devoid of experience with the theory or practice of regulation. To understand what happened next on the technology and communication policy side of the FCC, it may be useful to look further into my improbable presence.

Has “Unlicensed” in Part 15 Worked? A Case Study, by Tim Pozar, 5 INFO 86-91 (August 2009)  The Federal Communications Commission established the provisions for unlicensed operation of intentional radiators, or transmitters, for communications in what were called the industrial, scientific and medical (ISM) bands. This was a significant change in mindset for the FCC, and this case study is meant to show an example of how unlicensed devices have contributed to the community “good”.  The internet became a major economic entity and an essential tool for commerce in the mid to late 1990s. With that, the digital divide was identified as a significant issue by 1996[1]. Typically the digital divide has been the result of the cost of the equipment needed to use the internet, such as computers, as well as the cost of, or lack of access to, an internet connection. Many efforts by local community groups and governments have been made to attack the issue, but one problem that they all encountered was addressing the “last mile” to connect the disenfranchised.

 


Articles from the Merger Analysis in High Technology Markets Conference


The Information Economy Project is proud to present articles that have been published in the Journal of Competition Law & Economics from the Merger Analysis in High Technology Markets Conference held on February 2, 2008:

Technological Change and Merger Policy’s Third Era, by Howard Shelanski (Feb. 1 2008). Excerpt: Changes in Merger Policy Over the Last Century. Evolutionary Changes: Antimonopoly Era (1904-1973), Consumer Welfare Era (1973-2004), Dynamic Efficiency Era (2004-). Cyclical Changes: Merger review has varied in the scope of its objectives: from narrow anti-bigness, to a broader balance of efficiency and small-business protection, to a narrow consumer welfare focus, to a broader balance of static efficiency and innovation.

Market Definition in Online Markets, by Michael Baye, Journal of Competition Law & Economics, 4(3), 639–653 (Sept. 2008). Excerpt: Although the basic principles used to define a relevant market or to analyze unilateral competitive effects in traditional retail settings also apply in online retail markets, several features of the online environment add complexities to the analysis. This paper examines some of the results in the economics and marketing literatures that can influence market definition and competitive effects analysis in online retail settings. I argue that a failure to account properly for certain aspects of online markets can lead to erroneous definitions of the relevant market and, more importantly, erroneous conclusions regarding the unilateral competitive effects of horizontal mergers.

Sky Wars: The Attempted Merger of Dish/DirecTV, by Richard Gilbert (Feb. 1 2008). Excerpt: A High Tech Merger? Relatively new product: High Power Direct Broadcast Satellite TV. DirecTV launched 1994. EchoStar/Dish launched 1996. Large claimed efficiencies. Platform issues. Incompatible encryption formats. Dynamic platform competition. Installed base pricing incentives.

Defining the Relevant Product Market for the Google-DoubleClick Merger, by Hal Singer & Robert W. Hahn (Feb. 1 2008). Excerpt: Industry Background: In 2007, U.S. advertisers were expected for the first time to spend more on online advertising than on radio advertising. Source: eMarketer. U.S. online advertising revenues in 2007 were roughly $17 billion, an increase of 35 percent over 2005 revenues. Source: Interactive Advertising Bureau.

Nice Theory, But Where’s the Evidence?: The Use of Economic Evidence to Evaluate Vertical and Conglomerate Mergers in the U.S. and E.U., by Mary T. Coleman (Feb. 1 2008). Excerpt: Overview: Brief description of primary vertical theories of potential competitive concern from a merger. Input foreclosure. Customer foreclosure. Elements for a vertical theory to be plausible. Ability to foreclose. Incentive to foreclose. Foreclosure is likely to harm competition. Efficiencies do not offset. Evidence related to each element.

Horizontal Mergers Among IP Licensors and IP Licensees, by Luke Froeb (Feb. 1 2008). Excerpt: Joint Work: Mike Shor, Steven Tschantz. Disclaimer: Exploratory Analysis. Outline: Motivation: merger analysis. Question 1: Are horizontal merger effects affected by upstream/downstream vertical relationships? Question 2: What happens when you ignore upstream and/or downstream vertical relationships?

Are ‘Online Markets’ Real and Relevant? From Monster/Hotjobs to Google/DoubleClick, by Bruce D. Abramson, Journal of Competition Law & Economics (Feb. 1 2008). Excerpt: Key Conclusions: As the novelty of the Internet wears off, on-line merger analysis looks increasingly like off-line merger analysis. Most of the things that make interesting on-line mergers interesting have little to do with competition. A Blast from (My) Past: During the summer of 2001, HotJobs retained my services to support its proposed acquisition by Monster.com. One of the first “major” mergers of Internet “pure plays.” Basic points of interest stemmed from the shift in understanding of Internet economics between 2000 (documents) and 2001 (facts). See From Investor Fantasy to Regulatory Nightmare: Bad Network Economics and the Internet’s Inevitable Monopolists, 16 Harv. J.L. & Tech. 159 (2002).

Antitrust in Orbit: Some Dynamics of Horizontal Merger Analysis in General and with Respect to XM-Sirius, by Thomas W. Hazlett, Journal of Competition Law & Economics, 4(3), 753–773 (Sept. 2008). Excerpt: Horizontal merger evaluation is heavily reliant on market definition. An SSNIP framework formats the analysis, and demand elasticity evidence used to apply the test is often sparse, as is often found in high-technology industries. This paper examines other sources of evidence that reveal the dynamics of market structure, data that are also probative in the evaluation of competitive effects. These sources include capital valuations of firms, financial event studies, and the public positions taken with respect to the merger by interested parties. Such evidence is examined in the XM–Sirius merger (2007–08) and shown—in two of the three instances—to be relatively informative in merger welfare analysis.
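
For readers unfamiliar with the SSNIP framework referenced in the abstract above, the hypothetical-monopolist test asks whether a small but significant non-transitory increase in price (commonly 5 percent) across the candidate market would be profitable. A minimal worked illustration of the standard break-even critical-loss condition, using purely hypothetical numbers not drawn from the XM–Sirius record:

% Break-even critical loss for a SSNIP of s with pre-merger percentage margin m
% (illustrative values only; not taken from the paper)
\[
  \text{Critical Loss} = \frac{s}{s+m},
  \qquad
  s = 0.05,\ m = 0.40
  \;\Rightarrow\;
  \frac{0.05}{0.45} \approx 11\%.
\]

Here s is the price increase and m is the pre-merger margin; if the hypothetical monopolist would actually lose more than roughly 11 percent of unit sales, the SSNIP is unprofitable and the candidate market has been drawn too narrowly. This is where the sparse demand-elasticity evidence the paper discusses becomes the binding constraint.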

Evaluating Market Power with Two-Sided Demand and Preemptive Offers to Dissipate Monopoly Rent: Lessons for High-Technology Industries from the Proposed Merger of XM and Sirius Satellite Radio, by J. Greg Sidak and Hal J. Singer, Journal of Competition Law & Economics, 4(3), 697–751 (Sept. 2008). Excerpt: Can the standard merger analysis of the Department of Justice’s and Federal Trade Commission’s Horizontal Merger Guidelines accommodate mergers in high-technology industries? In its April 2007 report to Congress, the Antitrust Modernization Commission (AMC) answered that question in the affirmative. Still, some antitrust lawyers and economists advocate exceptions to the rules for particular transactions. In the proposed XM–Sirius merger, for example, proponents argue that the Merger Guidelines be relaxed to accommodate their transaction because satellite radio is a nascent, high-technology industry characterized by “dynamic demand.”


Articles from The Crisis in Public Safety Communications Conference


The Information Economy Project is proud to present articles published in the Federal Communications Law Journal, March 2007, from the Crisis in Public Safety Communications Conference held on December 8, 2006:

 

Sending Out an S.O.S.: Public Safety Communications Interoperability as a Collective Action Problem, by Jerry Brito, 59 Federal Communications Law Journal 457-92 (2007). Excerpt: On September 11, 2001, officers from the New York City police and fire departments responded to the attacks on the World Trade Center. That morning, police and firefighters entered each of the Twin Towers in an effort to help those inside. Shortly after the South Tower collapsed, an officer in a police helicopter hovering over the scene radioed to his colleagues, “About 15 floors down from the top, it looks like it’s glowing red. It’s inevitable.”[1] Then another police pilot reported, “I don’t think this has too much longer to go. I would evacuate all people within the area of that second building.”[2]

Solving the Interoperability Problem: Are We On the Same Channel? An Essay on the Problems and Prospects for Public Safety Radio, by Gerald R. Faulhaber, 59 Federal Communications Law Journal 493-516 (2007). Excerpt: Public safety radio communication provides the essential link by which fire, police, emergency medical services (“EMS”), and other emergency personnel respond to life- and property-threatening situations. Communications enables the situational awareness, command, and operational control without which the response of multiple agencies to an emergency is less than useless. Key to this communications capability is interoperability: the capability of first responders from different agencies to communicate during emergencies.

Fundamental Reform in Public Safety Communications Policy, by Jon M. Peha, 59 Federal Communications Law Journal 517-46 (2007). Excerpt: All across the country, there have been failures in the communications systems used by first responders, such as firefighters, police, paramedics, and the National Guard. These failures can cost lives in emergencies both large and small. This problem has gained particular attention in the tragic aftermaths of the 9/11 attacks[1] and Hurricane Katrina,[2] when inadequacies in the current system were particularly obvious, but attention has not yet translated to significant progress.

Communicating During Emergencies: Toward Interoperability and Effective Information Management, by Philip J. Weiser, 59 Federal Communications Law Journal 547-74 (2007). Excerpt: The crisis of communications on 9/11 and in the aftermath of Hurricane Katrina underscores that emergency responders are largely ill-equipped to communicate effectively in times of disaster as well as in day-to-day emergency situations that require the coordination of several different public safety agencies. The reason for this state of affairs is that public safety agencies traditionally have made individualized decisions about information and communications technology,[1] generally failing to purchase state-of-the-art technology that operates effectively and interoperates with others involved in emergency response.
