Who wants to be a millionaire? How the press framed the role of the public in the dispute between Time Warner Cable and Disney's ABC network in May 2000
At midnight on May 1, 2000, Time Warner Cable dropped the signal of Disney's ABC network from its systems in seven major markets. The signal was unavailable for approximately 39 hours, affecting roughly 3.5 million subscribers. While lawyers and regulators pondered the retransmission consent provisions of the Cable Act of 1992 and the accompanying FCC rules, the press was called upon to report and interpret these issues to the public. This paper examines the frames present in journalistic accounts triggered by the loss of ABC's signal in five mass newspapers and five trade publications. Specifically, the paper addresses the question: How did the press portray the role of the public in relation to the Time Warner ‐ Disney dispute? The data presented in the paper strongly suggest that the press failed to develop a cogent discussion of the underlying policy issues in a way useful to the public. The analysis of the press coverage concludes that the public was framed as passive and was largely excluded from the policy debate around the dispute.
- Book Chapter
- 10.1007/978-3-030-02363-8_18
- Nov 30, 2018
Charter Communications' acquisition of Time Warner Cable and Bright House Networks created the second largest US broadband provider and the third largest pay-TV provider. The combined company, marketed under the brand name Spectrum, had the capacity to serve over 25 million customers in approximately 41 states. Time Warner Cable brought more than 14 million customers and Bright House more than two million. The acquisition made Charter the second largest home Internet provider and the second largest cable operator in the United States, behind Comcast. With 24 million subscribers, including 17.4 million video subscribers, the new company became the third largest pay-TV provider behind AT&T and Comcast. The deal valued Time Warner Cable at $78.7 billion: Charter provided $100 in cash plus shares of the new public parent company (“New Charter”) equivalent to 0.5409 shares of Charter for each Time Warner Cable share outstanding (Khatchatourian, 2016). Bright House was acquired for $10.4 billion. The deal increased the combined firms’ broadband network capacity, which resulted in faster broadband speeds, better video products, and affordable phone service, and created a leading broadband services and technology company serving about 23.9 million customers in 41 states. Revenues increased by approximately 43% in the post-merger year 2017, and the cumulative returns for Charter Communications during the 251-day period surrounding the merger announcement (day −5 to day +245) were 22.4%.
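The per-share consideration above can be made concrete with a quick worked calculation. Note that the New Charter share price used below is an assumed figure for illustration only, not a quoted market price:

```python
# Per-share merger consideration for Time Warner Cable holders:
# $100 in cash plus 0.5409 shares of "New Charter" per TWC share
# (figures from the abstract above; Khatchatourian, 2016).
CASH_PER_SHARE = 100.0
EXCHANGE_RATIO = 0.5409

def consideration_per_twc_share(new_charter_price: float) -> float:
    """Total per-share deal value at an assumed New Charter share price."""
    return CASH_PER_SHARE + EXCHANGE_RATIO * new_charter_price

# At an assumed price of $200 per New Charter share:
# 100 + 0.5409 * 200 = $208.18 per TWC share
```

The cash component fixes a floor on the per-share value; the equity component ties the rest of the consideration to New Charter's post-announcement share price.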
- Book Chapter
- 10.1016/b978-0-12-805475-8.00020-3
- Jan 1, 2017
- Strategic Financial Management Casebook
17 - Analysis of wealth—Time Warner Inc.
- Research Article
- 10.2139/ssrn.1596255
- Apr 22, 2010
- SSRN Electronic Journal
In this paper, we disprove claims by proponents of increased Internet regulation that Broadband Service Providers (“BSPs”) - such as AT&T, Verizon, Sprint-Nextel, Qwest, Comcast, and Time Warner Cable - make “record profits,” “substantial profits,” and “soaring profits.” By reference to the most widely used measures of profitability, we find that the data show the profitability of the larger BSPs is generally equal to or below the average for firms in the S&P 500. Thus, “‘typical’ is more accurate than ‘substantial’ as a description of these profits.” As for “record” or “soaring” profits, the data show that 2009 profitability ratios are largely typical of recent profitability for the larger BSPs, and that the profitability of the larger BSPs has been relatively stable over the past five years. We additionally demonstrate that content firms like Google and eBay are substantially more profitable than BSPs, implying that BSPs are not benefiting as much as others in the Internet ecosystem from the surge in broadband adoption and use. Across all measures of profitability, Google and eBay are two to four times more profitable than the better-performing broadband providers. To put these numbers into context, we also looked at the profitability of several large firms outside the broadband space. We found, for example, that both Wal-Mart and Colgate-Palmolive have much higher profits than BSPs (with the exception of net profit margin for Wal-Mart), leading us to conclude that “[s]elling consumers staples and toothpaste appears to be more profitable than selling them broadband connections.”
- Research Article
- 10.2139/ssrn.2756888
- Sep 5, 2016
- SSRN Electronic Journal
Recent Internet interconnection disputes have sparked an increased interest in developing methods for gathering and collecting data about utilization at interconnection points. One mechanism, developed by DeepField Networks, allows Internet service providers (ISPs) to gather and aggregate utilization information using network flow statistics, standardized in the Internet Engineering Task Force as IPFIX. This report (1) provides an overview of the method that DeepField Networks is using to measure the utilization of various interconnection links between content providers and ISPs, or links over which traffic between content providers and ISPs flows; and (2) surveys the findings from five months of Internet utilization data provided by seven participating ISPs — Bright House Networks, Comcast, Cox, Mediacom, Midco, Suddenlink, and Time Warner Cable — whose access networks represent about 50% of all U.S. broadband subscribers. We first discuss the problem of interconnection and utilization at interconnection points. We then discuss the basic operation of the measurement capabilities, including the collection and aggregation of traffic flow statistics (i.e., IPFIX records), providing an assessment of the scenarios where these aggregate measurements can yield accurate conclusions, as well as caveats associated with their collection. We assess the capabilities of flow statistics for measuring utilization, and we discuss the capabilities and limitations of the approach, as well as the aggregation techniques that the ISPs use in providing data to us and that we apply before making the data public. The dataset includes about 97% of the paid peering, settlement-free peering, and ISP-paid transit links of each of the participating ISPs.
Initial analysis of the data — which comprises more than 1,000 link groups, representing the diverse and substitutable available routes — suggests that many interconnects have significant spare capacity, that this spare capacity exists both across ISPs in each region and in aggregate for any individual ISP, and that the aggregate utilization across interconnects is roughly 50% during peak periods.
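The aggregation step the report describes can be sketched in a few lines. The flow-record tuple format below is a hypothetical simplification (real IPFIX records carry many more fields); it illustrates only the core idea of binning byte counts per link group and deriving peak utilization:

```python
from collections import defaultdict

FIVE_MIN = 300  # aggregation interval in seconds (an assumed bin size)

def peak_utilization(flows, capacity_bps):
    """Aggregate (link_group, timestamp_s, byte_count) flow records into
    5-minute bins and return each link group's peak utilization as a
    fraction of its capacity in bits per second."""
    bins = defaultdict(int)  # (link_group, bin_start) -> total bytes
    for link, ts, nbytes in flows:
        bins[(link, ts - ts % FIVE_MIN)] += nbytes
    peaks = {}
    for (link, _), total in bins.items():
        avg_bps = total * 8 / FIVE_MIN  # average bits/s over the bin
        peaks[link] = max(peaks.get(link, 0.0), avg_bps / capacity_bps[link])
    return peaks
```

Averaging within fixed bins is what makes the published figures (e.g., "roughly 50% during peak periods") comparable across ISPs, at the cost of smoothing out sub-interval bursts.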
- Single Report
- 10.1571/cx9-21-06cc
- Sep 1, 2006
Our customer self-service test drives are based on customer scenarios. In this report, we delve into an HDTV problem resolution scenario, comparing the search experiences of Charter Communications, Comcast, Cox Communications, Time Warner Cable, and Verizon. Overall, the search experience was poor. None of the vendors offers any capability to refine the results or navigate them by type of information or other attribute. None attempted to guide me through a resolution process. None made use of the context of my search or navigation actions. And although none of the sites had the content we sought, two of the companies, Time Warner Cable and Verizon, made it difficult to discover that further searching would be fruitless.
- Research Article
- 10.1177/019372358000400102
- Mar 1, 1980
- Journal of Sport and Social Issues
Millions of dollars and thousands of hours of time were devoted to television coverage of the 1976 Olympic Games in Montreal. The objective of this coverage was not the reporting of the details of each sporting event. Rather, the television outlets, particularly America's ABC network, actually devoted more time to promoting their own programming and to covering events (such as restaurant tours) peripheral to the sporting activity.
- Research Article
- 10.6017/eurj.v11i1.8818
- Apr 1, 2015
- Elements
In 2014, a merger proposal was submitted to the U.S. Department of Justice by Comcast Corporation and Time Warner Cable. The proposed merger has incited popular opposition due to concerns that it would lead to the applicants’ monopolistic control over the internet distribution market. Consequently, appeals have been made in favor of enforcement of antitrust laws in this case. While it may be symptomatic of a need for the laws to change, there is currently no legal foundation for the opposition, and it would be fully legal for the Department of Justice to support the merger. In any case, either verdict will lead to a profound legal precedent.
- Research Article
- 10.2139/ssrn.2600715
- May 1, 2015
- SSRN Electronic Journal
This paper is one of the major economic studies I submitted to the FCC in opposition to the proposed merger of Comcast and Time Warner Cable. After reviewing and rebutting some of the economic theories and evidence put forth by Comcast in support of the merger, the paper presents an antitrust analysis showing that the merger would have significant horizontal and vertical effects that would harm the public.
- Research Article
- 10.1353/cj.2016.0036
- Jan 1, 2016
- Cinema Journal
Editors’ Introduction Ross P. Garner (bio) and Karra Shimabukuro (bio) Twin Peaks (1990–1991) debuted on the ABC network on April 8, 1990. The pilot episode, which was directed by David Lynch and cowritten by Lynch and Mark Frost, garnered the highest viewing figures for a TV movie for the 1989–1990 season, and the series quickly became a cultural phenomenon of the early 1990s. The show was famously received by critics as “the series that will change TV” and actively promoted under similar terms by ABC, while also sparking a national demand for cherry pie and coffee and raising many of its offbeat characters (such as the Log Lady, played by Catherine E. Coulson), and the stars who played them, to widespread recognition.1 Outside of the United States, Twin Peaks also attracted a small but dedicated following in many of the countries where it was distributed, including the United Kingdom, Denmark, Finland, and Australia.2 Yet Twin Peaks ultimately ran for just thirty episodes, succumbing to cancelation in June 1991 as a result of poor domestic ratings. This was despite seemingly having set itself up for a third series when it ended on an unresolved (and heartbreaking) cliff-hanger in which lead character, and continual beacon of purity, Special Agent Dale Cooper (Kyle MacLachlan) had become possessed by the murderous evil spirit BOB (Frank Silva) as a result of his journey into the otherworldly Black Lodge.3 In 1992, a prequel movie, Twin Peaks: Fire Walk with Me (David Lynch), was released to a negative reception (unfair, in the view of these editors) and seemed to signal the end.
Lynch carried on directing unnerving cinematic masterpieces, Frost continued to work as a screenwriter for both film and television, and members of the cast had varying degrees of visibility and success in the screen industries.4 Then, unexpectedly, October 6, 2014, brought the announcement that Twin Peaks would return as a limited episode series for the premium-rate subscription cable network Showtime. Although the show has gone through a tumultuous preproduction phase during which Lynch departed from, and then returned to, the revival, this period has demonstrated two noteworthy points. First, the nostalgia for the show was signified by the cast-produced video “Twin Peaks without David Lynch is like …,” which generated many shares and reactions from fans across different digital platforms (a point Dana Och also alludes to at the start of her essay). Second, elements such as the additional “No Lynch, No Peaks” campaign and the Official Twin Peaks cast-run site on Facebook indicate the continued centrality of Lynch-as-auteur to the show in both production and fan interpretive communities. However, with these behind-the-scenes issues resolved, it seems certain that audiences will soon be revisiting the Twin Peaks inhabitants among the branches that blow in the breeze. The years between the cancelation of Twin Peaks and its revival have seen it build and maintain a dedicated fan community through a variety of practices.5 These have included early fanzines (Wrapped in Plastic), long-running conventions on both sides of the Atlantic (e.g., Twin Peaks Festival and the Twin Peaks UK Festival), and, as Rebecca Williams discusses here, social media forms.6 Twin Peaks has also remained a highly visible program within the academic study of television. 
Although this has partly occurred because of an ongoing interest in Lynch’s oeuvre and perspectives indebted to differing inflections of auteur criticism within film studies, the show has also accrued a pivotal position in TV studies debates.7 Although postmodernist readings of the series have waned, the program’s status as a point of reference in analyses of both quality and cult forms continues, and its reputation has been enshrined among television scholars and beyond.8 Recognizing these trajectories, this In Focus section uses Twin Peaks to examine wider issues regarding how the legacy of an iconic TV program becomes constructed. In considering this, two areas of focus arise. First, an ongoing interest in the text of Twin Peaks is demonstrated as scholars return to the series from emergent or hitherto-overlooked perspectives to provide new insights. Karra Shimabukuro begins this...
- Conference Article
- 10.1109/cn.1995.509585
- Jun 20, 1995
Time Warner Cable's Full Service Network: program management of the FSN virtual organization
- Single Report
- 10.17487/rfc7021
- Sep 1, 2013
NAT444 is an IPv4 extension technology being considered by Service Providers as a means to continue offering IPv4 service to customers while transitioning to IPv6. This technology adds an extra Carrier-Grade NAT (CGN) in the Service Provider network, often resulting in two layers of NAT. CableLabs, Time Warner Cable, and Rogers Communications independently tested the impacts of NAT444 on many popular Internet services using a variety of test scenarios, network topologies, and vendor equipment. This document identifies areas where adding a second layer of NAT disrupts the communication channel for common Internet applications; it was also updated to include Dual-Stack Lite (DS-Lite) impacts.
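A toy sketch (not taken from RFC 7021 itself; all addresses and port ranges below are illustrative) of why NAT444 means two translations in the path: a customer packet is rewritten first by the home router's NAT into carrier-shared address space, then again by the CGN onto a public address shared by many subscribers:

```python
import itertools

class Nat:
    """Minimal source-NAT model: maps (src_ip, src_port) onto a port on
    one external address, reusing the mapping for repeat flows."""
    def __init__(self, external_ip, first_port=40000):
        self.external_ip = external_ip
        self._ports = itertools.count(first_port)
        self.table = {}  # (src_ip, src_port) -> allocated external port

    def translate(self, src_ip, src_port):
        key = (src_ip, src_port)
        if key not in self.table:
            self.table[key] = next(self._ports)
        return self.external_ip, self.table[key]

home_nat = Nat("100.64.1.23")   # RFC 6598 shared address space
cgn = Nat("203.0.113.7")        # public address shared by many customers

# Two rewrites before the packet reaches the Internet:
hop1 = home_nat.translate("192.168.0.10", 51515)  # private -> shared space
hop2 = cgn.translate(*hop1)                        # shared space -> public
```

Because a remote server sees only the CGN's address and port, per-subscriber features that assume one public IP per customer (port forwarding, some gaming and P2P applications) are exactly the kinds of services the tests in this document found disrupted.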
- Research Article
- 10.2139/ssrn.2137231
- Aug 28, 2012
- SSRN Electronic Journal
In December 2011, Verizon Wireless, the country’s largest wireless provider, announced agreements with some of the country’s largest cable operators — Comcast Corp., Time Warner Cable Inc., Cox Communications, and Bright House Networks — to acquire the AWS-band wireless spectrum licenses that the cable companies had acquired at auction in 2006. The transactions would result in Verizon Wireless acquiring either 20 or 30 megahertz of spectrum in local markets covering approximately 94% of the U.S. population. According to the smaller national wireless carriers, the “transactions will . . . eliminate from the market one of the two remaining large available bands of quality spectrum, which other carriers could rapidly deploy to broaden coverage and enhance competition,” and “the only sizable allocated but unused block of spectrum that would be suitable for 4G deployment.”
- Research Article
- 10.30707/jste48.2sticker
- Jan 1, 2011
- Journal of STEM Teacher Education
This study was conducted to describe a high school engineering curriculum, identify teaching strategies used to increase math and science literacy, and discover challenges and constraints that occur during its development and delivery, as well as the strategies used to overcome these obstacles. Semi-structured interviews were conducted with the engineering instructor. In addition, students were observed, and curriculum documents, teacher lesson plans, and teacher resources were examined. Concepts created the platform for delivery, curricular trial and error was at work, science and engineering competitions were leveraged as a basis for learning activities, and project-based learning and teaching was critical. There was a clear emphasis on creative thought and work. Assessment of student learning was dubious and elusive, and stakeholders tended to be uneasy with this new pedagogy. Financial and instructional support through business partnership and administrative support were found to be critical strategies used to overcome the obstacles identified. David Stricker is an Assistant Professor at University of Wisconsin-Stout. He can be reached at strickerd@uwstout.edu
A Case Study: Teaching Engineering Concepts in Science
The focus on improving science, technology, engineering, and mathematics (STEM) education for America's children can be traced back to the days of Sputnik and beyond. However, compared with advancements then, it has been argued that today technological development and industrial growth are increasing at an exponential rate with expanding global application (Brophy, Klein, Portsmore, & Rogers, 2008). Consequently, amid concerns that the United States may not be able to compete with other nations in the future due to insufficient investment today in science and technology research and STEM education, funding initiatives such as the American Recovery and Reinvestment Act (U.S. Department of Education, The American Recovery and Reinvestment Act of 2009: Saving and Creating Jobs and Reforming Education) and “Race to the Top” competitive grants were enacted in 2009 in an effort to offer substantial federal support for such initiatives (U.S. Department of Education, President Obama, U.S. Secretary of Education Duncan Announce National Competition to Advance School Reform). The support structure for STEM education does not end with tax dollars. Large private companies such as Time Warner Cable have committed $100 million in media time, and the MacArthur Foundation is supporting “National Lab Day,” which will include, among other initiatives, a year-long effort to expand hands-on learning methods throughout the country. Specifically, within the STEM focus, engineering education supports the attainment of a wide range of knowledge and skills associated with comprehending and using STEM knowledge to achieve real-world problem solving through design, troubleshooting, and analysis activities (Brophy et al., 2008). The arguments for including engineering education in the general education curriculum are well established. Some are motivated by concerns regarding the quantity, quality, and diversity of future engineering talent (American Society for Engineering Education, 1987; National Academy of Engineering, 2005; National Research Council, 1996; International Technology Education Association, 2002) and others by the basic need for all students, in their pursuit of preparing for life, work, and citizenship in a society inundated with technology, to possess a fundamental understanding of the nature of engineering (Welty, 2008). In an attempt to address this issue, a number of curricula have been designed to infuse engineering content into technology education courses (Dearing & Daugherty, 2004). Each of these programs proposes teaching engineering concepts or engineering design in technology education as a vehicle to address the standards for technological literacy (International Technology Education Association, 2000/2002). Similarly, the National Academy of Engineering (NAE) publication Technically Speaking (Pearson & Young, 2002) emphasizes the need for all people to become technologically literate to function in the modern world. However, despite this clear need, within the technology education profession itself, the appropriate engineering curriculum required for implementation, particularly at the high school level, remains unclear. Indeed, engineering curricula exist that have been designed for implementation not in technology education but rather in math and science classrooms. As a result of the choices available to teachers and school administrators, the most effective way of delivering engineering content to high school students remains unclear.
- Research Article
- 10.1109/2.384111
- May 1, 1995
- Computer
Developers at Time Warner Cable's Full Service Network in Orlando are quickly learning about the challenges of sending megabytes of video information across miles of fibre-optic cable. Although the pilot project's video- and shopping-on-demand services are up and running, the software developers must now determine how to send different news-on-demand requests to 4,000 homes by the end of the year.
- News Article
- 10.1016/s1464-2859(10)70346-1
- Dec 1, 2010
- Fuel Cells Bulletin
EnerSys fuel cell backup power for Time Warner in California