Friday 7 January 2022

QUANTITATIVE METHODS IIBMS EXAM ANSWER SHEETS PROVIDED WHATSAPP 91 9924764558


CONTACT

DR. PRASANTH BE MBA PH.D. MOBILE / WHATSAPP: +91 9924764558 OR +91 9447965521 EMAIL: prasanththampi1975@gmail.com WEBSITE: www.casestudyandprojectreports.com


 The Indian Institute of Business Management & Studies

Subject: Quantitative Methods Marks: 100


Attempt Only 4 Case Study

Case I - Morgan Stanley's Return on System Non-investment

Morgan Stanley is a global financial services firm with more than 600 offices in 30 countries and over 53,000

employees. It was founded in 1935 and is headquartered in New York City. The firm operates in four segments:

Institutional Securities, Asset Management, Retail Brokerage, and Discover (which provides Discover Card services). The

firm acquired the Discover Card business as a result of its merger with the retail brokerage Dean Witter, Discover & Co. in

1997. The unification of Morgan Stanley and Dean Witter created a digital, cultural, and philosophical divide, which was

extremely difficult to overcome. One of the business sectors to suffer the most under this arrangement has been Retail

Brokerage, which manages $616 billion in client assets. Retail Brokerage provides comprehensive brokerage, investment,

and financial services to individual investors globally, with 9,526 worldwide representatives in more than 500 retail

locations, including 485 in the United States.

Despite the merger, the Retail Brokerage group was never accepted as an equal partner by the rest of Morgan

Stanley. Former Dean Witter employees have claimed they felt like disrespected outsiders after the merger. The feeling

persisted, and many retail brokers viewed their job security as tenuous at best. Moreover, Retail Brokerage was not well integrated

with the rest of the company. It ran on a different systems platform than the institutional brokerage side, and its

employee systems were not integrated.

Retail Brokerage systems were also much more antiquated than those at other parts of the company. Brokers had

to visit their offices on weekends to print portfolio summaries in advance of client meetings, because the outdated

computer systems could not handle the task during normal business hours. Even on those off-hours, desktop PCs, which

hadn't been upgraded in years, would often crash and printers clogged if they were being used by more than two people.

Brokers did their work without benefit of an application that provided both real-time stock quotes and transaction

histories. Some of the firm's technology problems couldn't be hidden from clients, who routinely complained about the

customer Web site and sparsely detailed year-end tax reports they received.

Top brokers started to leave, taking with them the portfolios of numerous important clients. Profits specifically

from Retail Brokerage dropped precipitously and margins lagged behind those of comparable brokerage firms. During this

time, nearly 1,500 brokers left the company. Bill Doyle, an analyst with Forrester Research Inc., pointed out that the

business was ailing partially as a result of lack of investment in technology. When the stock market crashed in 2001, CEO

Philip Purcell believed that the market's comeback would happen slowly. He therefore focused his business strategy on

maximizing profits instead of generating revenue. The implementation of this strategy involved cutting costs. Each of

Morgan Stanley's divisions received less funding for their operations, jobs were eliminated, and investing in technology

was obviously a low priority. Purcell, of course, had miscalculated. The market rebounded within a few years, and Morgan

Stanley was not positioned to compete in retail. While his firm was watching its margins, Merrill Lynch was spending $1

billion on new systems for its brokers. The turmoil in the inner sanctum of Morgan Stanley's leadership also contributed to

the company's woes.

Purcell locked horns with investors, executives, and former executives over a number of issues, one of which was

selling the underperforming Discover credit card division. Some investors even wanted Purcell to spin off the entire Dean

Witter part of the company. In March 2005, eight former executives appealed to Morgan Stanley's board of directors to


remove Purcell as CEO for his mismanagement of Discover and Retail Brokerage. The board determined that the best

interest of the firm was served by keeping Purcell and divesting itself of its struggling divisions. The board also approved

Purcell's appointments of two executives who were considered loyal to him and to the board.

Protesting Purcell's leadership, two leading executives in the investment banking division resigned. More

departures followed. Purcell's critics now had even more ammunition with which to bring him down: in addition to

mismanaging the struggling areas of the business, his actions had threatened the performance of the firm's strength,

investment banking. Purcell finally resigned in June 2005, unable to shake the claims that his solutions to problems were

lightweight rather than dramatic and far-reaching, and that his decisions were based on protecting his job rather than

improving the firm. He was succeeded by John Mack, a former Morgan Stanley president who had left the company in

2001 as a result of a power struggle with Purcell.

With new leadership in place, Morgan Stanley has finally begun to address the issue of technology in its Retail

Brokerage division, which has been renamed the Global Wealth Management group. In October 2005, the firm hired Eileen

Murray as its head of Global Operations and Technology. She works directly under Chief Executive John Mack, with whom

she has a strong professional history. Murray has committed to boosting Morgan Stanley's investment in technology for

retail, saying, "We expect to make substantial improvements" that "will ultimately help our financial advisors better serve

our clients while also helping our clients better manage their relationship with us." As proof, the technology and

operations budget for the Global Wealth Management Group for 2006 exceeded $500 million. Mack also brought in a new

boss for the group. It is now under the leadership of James Gorman, who performed a successful parallel makeover at

Merrill Lynch's brokerage division.

Mack has been under some pressure to sell the retail division, a choice he has been reluctant to make. He

subscribes to the view that ownership of a retail brokerage business is an investment in the firm because, in addition to

providing revenue from individual investors, it gives Morgan Stanley a direct channel for selling its own investment

banking products. Mack's goal is to increase the profit margin of the Global Wealth Management Group retail brokerage

from 11 percent to 20 percent, which would make it as profitable as rivals' businesses. Mack has stated both

publicly and privately that some of Morgan Stanley's businesses had not received the technology they needed, and he

intends to make the necessary investments. In the firm's 2005 annual report, Mack said, "We are committed to addressing

underinvestment," and "We're going to upgrade our technology platforms and provide our financial advisors and

investment representatives with a tool kit that is as competitive as that of our leading peers."

Some of the overwhelmed broker desktop workstations have been replaced. The new systems are better

integrated with backend systems so that brokers have a better view of client portfolios. The company plans further improvements in this area so

that brokers will have access to all relevant client data at once including transaction history, contact history, and portfolio

performance. Consolidating all of these features will require several years of work. The company also rolled out a new tax-reporting

application that automatically reconciles gains and losses and allows users to download information from its

client Web site into popular tax programs. Before that time, customers had to wade through a confusing maze of figures to

add up gains and losses on their year-end tax reports.


In response to customer demands, Morgan Stanley scheduled an upgrade of its Web site for May 2006, which

analyst Doyle described as a particularly weak area for the firm. The services available online to Morgan Stanley customers

dated back to pre-2000 technology. Doyle sees the Web presence as a major challenge because Morgan Stanley has been

focusing more on its wealthiest clients than on the rank-and-file small investors. The firm had previously assumed that

these top clients weren't interested in online services because they get the direct attention of brokers (whereas investors

with portfolios under $100,000 must deal with call centers). Research by Forrester has shown the opposite to be true:

wealthy customers actually want more hands-on control of their portfolios, and therefore want more tools and services

available online. These customers prefer to approach their brokers with their own ideas. Gorman, as head of the

group, understands the significance that online technology holds for his division.

Mack and Gorman must also take measures to repair the schism that developed after the merger with Dean

Witter. Mack has been addressing the issue of a "one-firm culture." The firm is trying to stem the loss of productive

brokers. Increasing salaries and expense accounts are not enough. The top brokers still feel they can fulfill their earning

potential better and hold jobs longer at other firms. It's not just that their print queue gets jammed; it's that they question

how much the company values them if it's not willing to support them in such a way that they can best perform their jobs.

By the spring of 2006, signs of progress were evident. In June 2006, Morgan Stanley generated second-quarter net

income of $1.96 billion. The retail brokerage division posted $157 million in pretax profit, the largest profit since the first

quarter of 2005.

CASE I QUESTIONS:

1. Why did Morgan Stanley underinvest in information technology?

2. Why was the merger with Dean Witter disruptive for the company?

3. If you were James Gorman, the new head of the Global Wealth Management Group, what information systems would

you invest in? Why? Do you think Morgan Stanley's plans for an integrated client information system are

worthwhile? [Hint: Think of the services you would like to receive from your banker or stock broker.]

4. Aside from new systems, what changes in management and organization are required to restore revenue and

profit growth at the Global Wealth Management Group?


CASE II:

If you turn on the television, read a newspaper, or surf the Web, you're bound to find many dire predictions about

large-scale loss of life from biological or chemical attacks or an avian influenza pandemic. Computer models estimate that

between 2 and 100 million people could die in the event of a flu pandemic, depending on the characteristics of the virus.

Fears of a major public health crisis are greater now than ever before, and governments throughout the world are trying to

improve their capabilities for identifying biochemical attacks or pandemic outbreaks more rapidly.

On May 3, 2006, the United States government issued an Implementation Plan for its National Strategy for

Pandemic Influenza to improve coordination among federal, state, and local authorities and the private sector for

pandemics and other public health emergencies. The implementation plan calls for improving mechanisms for real-time

clinical surveillance in acute care settings such as hospital emergency rooms, intensive care units, and laboratories to

provide local, state, and federal public health officials with continuous awareness of the profile of illness in communities.

One such initiative is the BioSense Real-Time Clinical Connections Program developed by the U.S. Federal Centers

for Disease Control and Prevention (CDC). BioSense sits atop a hospital's existing information systems, continually

gathering and analyzing their data in real time. Custom software developed by CDC monitors the facility's network traffic

and captures relevant patient records, diagnoses, and prescription information. The data include patient age, sex, ZIP code

of residence, ZIP code of the medical facility handling the patient, the principal medical complaint, symptoms, onset of

illness, diagnoses, medical procedures, medications prescribed, and laboratory results. The software converts these data to

the HL7 data messaging format, which is the standard for the health-care industry, encrypts the data, and transmits them

every 15 minutes over the Web to the CDC where they are maintained in a large data repository.
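The record-packing step described above can be sketched as follows. This is a minimal, illustrative assumption of what an HL7 v2-style pipe-delimited message might look like; the segment layout, field order, and diagnosis code here are hypothetical, not the CDC's actual BioSense schema.

```python
# Illustrative sketch: pack a simplified, de-identified patient record
# into an HL7 v2-style pipe-delimited message. Segment layout and field
# choices are assumptions for illustration, not the real BioSense format.

def build_hl7_like_message(patient: dict) -> str:
    # MSH-like header segment: facility ZIP code and observation timestamp
    msh = "|".join(["MSH", patient["facility_zip"], patient["timestamp"]])
    # PID-like patient segment: age, sex, residence ZIP (no identifying data)
    pid = "|".join(["PID", str(patient["age"]), patient["sex"], patient["home_zip"]])
    # OBX-like observation segment: chief complaint and diagnosis code
    obx = "|".join(["OBX", patient["complaint"], patient["diagnosis"]])
    # HL7 v2 separates segments with carriage returns
    return "\r".join([msh, pid, obx])

record = {
    "facility_zip": "30333", "timestamp": "200605031200",
    "age": 34, "sex": "F", "home_zip": "30309",
    "complaint": "fever and cough", "diagnosis": "J11.1",
}
msg = build_hl7_like_message(record)
```

In the real system, a message like this would then be encrypted and batched for the 15-minute uploads to the CDC repository described above.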

The system summarizes and presents analytical results by source, day, and syndrome for each ZIP code, state, and

metropolitan area using maps, graphs, and tables. Registered state and local public health agencies as well as hospitals and

health care providers are allowed to access data that pertain to their jurisdiction. They access BioSense via a Web-based

application over a secure data network. Information from BioSense could show early signs of a pandemic or biological

attack and alert local hospitals, health workers, and federal and state agencies to take preventive measures. The

traditional process for public health surveillance is manual and much slower. Hospitals, physicians, and laboratories

would mail or fax paper reports to public health agencies, who would then call health care providers for more detailed

information. This slow chain of person-to-person communication is not well-suited to a major public health emergency.

By monitoring streaming data about health events as they occur, the system helps CDC epidemiologists quickly

detect early signs of a flu pandemic or bioterrorist attack and provide public health and government decision makers with

the information needed to manage preparedness and response. Simultaneous access to the data by all levels of public

health decreases the time needed to classify health events as serious public health problems; decreases the time to identify

causes, risk factors, and appropriate interventions; and decreases the time needed to implement countermeasures and health guidance.

BioSense first became operational in 2004, when it began gathering daily data from U.S. Defense Department and

Veterans Affairs (VA) hospitals and Laboratory Corporation of America (LabCorp) orders for medical tests. (LabCorp

operates a large nationwide network of testing locations and service centers and is one of the largest clinical lab service


providers in the United States.) Approximately 700 Defense Department and 1,110 VA facilities report data to BioSense. In

late 2005, CDC started to expand the BioSense network to civilian hospitals in major metropolitan areas and anticipates

sharing its analysis of local and regional influenza-like illness trends with health care and other public agencies in

affected areas. The CDC expects to connect 300 hospitals to BioSense by the end of 2006. To help civilian hospitals link to

BioSense, the CDC enlisted the Consella Group, health care information technology consultants. Consella explains the

benefits of participating in a project that will serve their specific interests as well as those of the public at large and will put

their data in standardized form.

However, many hospitals have not been anxious to jump on the bandwagon because the transition would be burdensome

and time-consuming. To transmit data to BioSense, each hospital must standardize its patient and other medical data.

Most hospitals use their own coding systems for symptoms, diseases, and medications. CDC’s contractors would have to

work with the hospital to translate its data codes into the standards used by CDC’s software. According to Barry Rhodes,

CDC’s associate director for technology and informatics, “To standardize the data and do all the data validation steps is a

huge technological challenge."

Some in the medical community question whether the BioSense network is worth the effort. "If there is a pandemic

flu, we are not going to know about it from a system like this," says Dr.

Susan Fernyak, director of communicable disease control and prevention at the San Francisco Department of Public

Health. According to Dr. John Rosenberg, director of the Infectious Disease Laboratory at the State of California's

Department of Health Services in Richmond, California, if an epidemic broke out, "You'd know it before the data rolled in.

When your emergency rooms fill up you make a phone call; this is probably a better measure."

David Groves, CDC project head at SAIC, a BioSense contractor, points out that a hospital's medical staff might not

know right away that there's a serious problem when patients start showing up with symptoms. CDC scientists using the

system will be in a better position to spot a major pandemic or biological or chemical attack over a wider geographic area.

Having a bigger picture of what's happening will help CDC help hospitals, police, and emergency units mobilize a better

response.

Although participation in BioSense is voluntary, physicians and health officials might resent the system because it

enables the federal government to encroach on what has traditionally been the domain of local health care providers and

organizations. They note that they, and not the CDC, have the responsibility for responding to and managing a pandemic.

Additionally, hospitals are reluctant to sign up because of concerns about maintaining privacy and security of patient

information. BioSense would let the CDC "listen in" on their treatment of patients on a real-time basis. The CDC does not

use any data that would identify individual patients.

CASE II QUESTIONS

1. Describe and diagram the existing process for reporting and identifying major public health problems, such as a

flu pandemic.

2. How does BioSense improve this process? Diagram the process for reporting and identifying public health

problems using BioSense.


3. Discuss the pros and cons of adopting BioSense for public health surveillance. Should all hospitals and public

health agencies switch to BioSense? Why or why not?

4. Put yourself in the role of hospital director at a large urban hospital. Would you support joining up with the

BioSense system? Why or why not? What factors would you want to take into account before joining?


CASE III BLOCKBUSTER vs. NETFLIX: WHICH WILL WIN OUT?

When Blockbuster entered the video rental business in 1985, the industry consisted mostly of independent, mom-and-

pop-style stores whose entire reach may have been two towns or a few city blocks. In its first 20 years of business, the

rental giant opened 9,100 stores in 25 countries, gaining a market share that has been enjoyed by few companies in any

industry. Blockbuster equipped each of its video rental stores with custom software it had designed to simplify rental and

sale transactions. An automated point-of-sale system uses a laser bar code scanner to read data from items being rented or

sold and from a Blockbuster customer's identification card. These data are transmitted to Blockbuster's corporate

computer center. Management uses these data to monitor sales and to analyze the demographics, and rental and sales

patterns for each store to improve its marketing decisions.
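The scan-and-transmit flow described above can be sketched roughly as follows. The record fields, store identifier, and in-memory outbox are hypothetical stand-ins for illustration; Blockbuster's proprietary point-of-sale software is not publicly documented.

```python
# Hypothetical sketch of a point-of-sale rental transaction: scan two bar
# codes (the item and the customer's ID card), build a transaction record,
# and queue it for transmission to the corporate data center. All field
# names and identifiers here are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class RentalTransaction:
    item_barcode: str   # read by the laser bar code scanner
    customer_id: str    # read from the customer's identification card
    store_id: str
    timestamp: str

@dataclass
class PointOfSale:
    store_id: str
    outbox: list = field(default_factory=list)  # records awaiting upload

    def scan_rental(self, item_barcode: str, customer_id: str) -> RentalTransaction:
        txn = RentalTransaction(item_barcode, customer_id, self.store_id,
                                datetime.now().isoformat())
        self.outbox.append(txn)  # corporate center ingests these for analysis
        return txn

pos = PointOfSale(store_id="TX-0042")
txn = pos.scan_rental("012345678905", "CUST-98765")
```

Aggregating such records centrally is what lets management analyze demographics and rental patterns per store, as the case describes.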

Blockbuster's success was based on video tape rentals and sales and rentals of DVDs. By 2004, Blockbuster

possessed a 40-percent share of the U.S. video rental market, estimated to range from $7 billion of business per year to $9

billion; Blockbuster also had video sales of around $16 billion. The greatest threat to Blockbuster's viability came from the

emergence of a new business model in the video rental market. Launched in 1998, Netflix Inc. intended to cater to those

video rental customers who valued convenience above all else. First, the upstart eliminated the need for a physical store.

All interactions between Netflix and its customers took place on the Internet and through the postal service. Users could go

online and create a wish list of movies they wanted to rent. For a monthly service fee, Netflix mailed up to three movies at

a time, which the customer could keep for as long as he or she wanted without incurring late charges. When finished with a

movie, the customer mailed it back to Netflix in prepaid packaging provided by the company. Returning a movie

prompted Netflix to send the next title on the customer's wish list. For $19.95 a month, Netflix customers had access to

thousands of movie titles without leaving their homes.

According to Kagan Research LLC, revenues from online movie rentals, which were basically nonexistent in 1998,

rose to $552 million in 2004. Kagan projected that the total revenue would approach $1 billion in 2005 and $3 billion by

2009. As Netflix caught on and its subscription model became popular, Netflix's gains in market share, from 2 to 7 percent

between 2003 and 2004, gave Blockbuster true cause for concern.

To compete in the changing marketplace, Blockbuster made some dramatic changes in its business beginning in

2003. It added an online rental service; Movie Pass, a monthly subscription service for in-store customers; Game Pass, a

subscription service for video games; a trading service for movies and games; and the infamous "No More Late Fees"

program. The entire question of how to address a new source of competition was a complicated matter. Blockbuster could

have chosen to launch an online rental store similar to Netflix and leave it at that. Or, the company could have focused only

on its traditional business in an attempt to lure customers back from the rising online tide. Instead, with the initiatives

previously mentioned, Blockbuster tried to do both.

Blockbuster's $100 million increase in capital expenditures from 2003 to 2004 hints at the scale of the

restructuring of the business. Many of those millions found their way to the information technology department, which

took Netflix on directly by establishing the information systems supporting Blockbuster's own online subscription service.

This venture required Blockbuster to construct a new business model within its existing operations.


Rather than meld the two channels, Blockbuster created a new online division with its own offices near corporate

headquarters in Dallas. Part of Blockbuster's initial strategy for defeating the competition was to undercut Netflix in both

pricing and distribution. Blockbuster set the price for its three-movies-at-a-time monthly subscription at $19.99, which

was, at the time, two dollars less than Netflix's competing plan. Blockbuster had a strategic advantage in distribution as

well. Netflix was serving its customers from 35 distribution centers around the country. Blockbuster had 30 such facilities

but also had 4,500 stores in the United States to deliver DVDs to most of its customers in only a day or two at lower

shipping costs. Blockbuster also enticed online customers to maintain a relationship with the physical stores by offering

coupons for free in-store rentals. Blockbuster's original intent was to integrate the online and in-store services so that

customers could float back and forth between the two channels without restrictions. However, the disparate requirements for

revenue recognition and inventory management have so far been too complex to make the plan a reality.

After a year in existence, the report card on Blockbuster's online store was mixed. The service had acquired one

million subscribers and the company hoped to double that number within seven months or so. At the same time, Netflix

had surpassed three million subscribers and was on its way to four million by the end of the year. Blockbuster continued

to pursue gains through pricing, at one point lowering its three-movie plan to $14.99 per month versus $17.99 at Netflix.

Both companies offer plan variations such as unlimited rentals of one DVD at a time for $5.99 per month and two at a time

with a limit of 4 per month for $11.99.

In September 2005, research firm SG Cowen declared that Blockbuster's online DVD rental service "remains

inferior" to Netflix. The researcher stated that Blockbuster had improved on movie availability but actually fell further

behind in ratings of its user interface. The evaluation by SG Cowen came on the heels of rocky financial reports for

Blockbuster. Blockbuster's most costly change was likely the "No More Late Fees" campaign it launched in January 2005.

The goal of the program was to lure more customers and position Blockbuster better in the market alongside Netflix,

which never charged late fees. However, the program may have created more problems than it solved. Blockbuster did

measure an increase in in-store rentals after eliminating late fees, but early returns did not suggest that the increase offset

the $250 million to $300 million in annual late fee revenue that was no longer being collected. Well-known corporate

raider Carl Icahn took advantage of Blockbuster's low share price and acquired 9 percent of the company, entitling him to a

position on the board of directors. Icahn harshly criticized CEO John Antioco's business strategy. Icahn believed that

Blockbuster's new initiatives, such as online rentals, were too expensive and too risky. He believed that the company

should take advantage of its prevailing position in the bricks and mortar rental industry, even if that industry were slowly

dying. Despite the presence of Icahn, Antioco maintained that online rentals were the only segment of the industry open to

growth.

Both Blockbuster and Netflix now face a new set of challenges. Fifteen million cable subscribers use video-on-demand

(VOD) technology to watch movies and programs that are not yet available on DVD. TiVo and similar digital video

recorders combined with VOD could make the rental of movies obsolete. Some analysts still insist that the economics do

not make sense for movie studios to abandon DVD sales, which account for 50 percent of their profits, in favor of VOD. And

technology does not currently permit the bandwidth for VOD suppliers to provide nearly the number of titles that

Blockbuster can. Down the road, however, Blockbuster likely will have to address VOD, especially if the studios can

eliminate companies like Blockbuster as an intermediary.


In April 2006, the Internet as a channel for movie distribution finally came into focus. Six movie studios, including

Warner Brothers, Sony Pictures, Universal, MGM, and Paramount, reached an agreement with Web site Movielink to sell

movies online via download. Until that time, Movielink had offered movie downloads as rentals, which, like the VOD model,

the customer could watch for only 24 hours. Sony, MGM, and Lions Gate also reached agreements with a Movielink

competitor, CinemaNow, which is partially owned by Lions Gate. Warner Brothers also expanded its presence by entering

into relationships with Guba.com and BitTorrent. The studios moved to build on the momentum created by the success of

the iTunes music store, which demonstrated that consumers were very willing to pay for legal digital downloads of

copyrighted material. At the same time, they hoped that entering the download sales market would enable them to

confront the piracy issue in their industry earlier in its development than the music industry was able to do. While the

studios' commitment to these ventures appeared clear, what remained a question was whether they could replicate the

success of iTunes. The initial pricing schemes certainly did not offer the same appeal as Apple's $0.99 per song or $9.99

per CD. Movielink set the price for new movies at $20 to $30. Older movies were discounted to $10. Movielink was

counting on the fact that customers would pay more for the immediacy of downloading a movie in their homes, as opposed

to visiting a bricks-and-mortar store like Circuit City or an online store such as Amazon.com, both of which sell new DVDs

for less than $15.00. However, even if customers were willing to pay a little extra, they were getting less for their money.

Most movie downloads did not come with the extra features that are common with DVD releases.

Moreover, the downloaded movies were programmed for convenient viewing on computer screens, but

transporting them from the computer to the TV screen involved a more complicated process than most consumers were

willing to tackle. Neither Movielink nor CinemaNow offered a movie format that could be burned to a DVD and played on a

regular DVD player. In fact, CinemaNow downloads were limited to use on a single computer. To watch these movies on a

television screen, users would need to have Windows Media Center, which is designed to connect to a TV, or special jacks

and cables. An additional obstacle for both the technology and the consumer to overcome was bandwidth. Even using a

broadband Internet connection, high-quality movie files, which generally surpassed 1 gigabyte in file size, required in the

neighborhood of 90 minutes to download completely.
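The 90-minute figure quoted above is consistent with the broadband speeds of the era, which a quick back-of-the-envelope calculation confirms (the exact file size and duration are the case's approximate figures, not precise measurements):

```python
# Back-of-the-envelope check of the download time quoted above:
# a ~1-gigabyte movie file over a typical mid-2000s broadband link.
file_size_bits = 1 * 8 * 10**9     # 1 GB expressed in bits
download_seconds = 90 * 60         # the ~90 minutes cited in the case
implied_mbps = file_size_bits / download_seconds / 10**6
# works out to roughly 1.5 Mbps, typical of early DSL/cable broadband
```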

Considering these issues, the near-term outlook for the legal digital distribution of movies remains cloudy.

Movielink, with only 75,000 downloads per month, was struggling to sustain itself. Neither Blockbuster nor Netflix seemed

in a panic to adjust to this new source of competition. While locked in legal battles over patents and antitrust concerns, the

two companies had few specific plans related to downloading, though Netflix was widely believed to be considering a set-top

box. Netflix said only that downloading was part of its future plans, but expressed dissatisfaction with the terms the

movie studios were offering in early discussions.

The one development that has the potential to force the hands of Blockbuster and Netflix is the entrance of Apple

into the movie download market. Apple's iTunes store, like Netflix, already had a satisfied and loyal customer base, not to

mention a pervasive "cool" factor. And, it was iTunes's successful transition from music-only to music and television

downloads that paved the way for Movielink and CinemaNow to sell movie downloads in the first place. Apple is said to be

on the verge of adding movies to its store and would stick to its flat-rate pricing model. Industry rumors indicated that

Apple CEO Steve Jobs intended to sell downloads of all movies for $9.99. Industry experts characterized Apple's

involvement as a possible "tipping point" for online movie distribution.


In the meantime, Antico wants Blockbuster to stay very close to the cutting edge of technology in his industry.

Doing so, he believes, will enable the company to directly replace any rental revenues lost to new technology. Meanwhile,

add Amazon to the list of competitive threats on which Blockbuster must also keep a careful eye. Amazon.com already

operates an online movie rental service in the United Kingdom. Could there be another player to compete with Blockbuster

and Netflix? Or could a new partnership shake up the industry again?

CASE III QUESTIONS

1. What is Blockbuster's business model? How successful has it been?

2. What industry and technology forces have challenged that business model? What problems have they created?

3. Is Blockbuster developing successful solutions to its problems? Are there other solutions it should have

considered?

4. How successful is Netflix and its business model?

5. Do you think Blockbuster or Netflix will succeed in the future? Explain your answer.

CASE IV IS THE TELEPHONE COMPANY VIOLATING YOUR PRIVACY?

In May 2006, USA Today reported that three of the four major United States landline telecommunications

companies had cooperated with the National Security Agency (NSA) in its fight against terrorism by turning over records of

billions of phone calls made by Americans. AT&T, Verizon Communications, and BellSouth all contributed to the NSA's

anti-terrorism program. Qwest Communications International was the only one of the big four to withhold its records. The

revelation by USA Today caused a firestorm of controversy. Media outlets, privacy advocates, and critics of the Bush

administration expressed outrage over the program and questioned its legality. The Washington Post referred to the

program as a "massive intrusion on personal privacy."

The issue received particularly strong scrutiny because it came to light only five months after President Bush said

that he had authorized the NSA to listen in on international phone calls of Americans suspected of having ties to terrorism

without obtaining a warrant. When combined, the two stories caused intense worry among privacy activists who feared

that a widespread data mining effort was being carried out against American citizens by the administration.

President Bush would not acknowledge the existence of such an initiative. He said only that, "the intelligence

activities I authorized are lawful and have been briefed to appropriate members of Congress." He added, "We are not


mining or trolling through the personal lives of innocent Americans" and the privacy of citizens was being "fiercely

protected."

What exactly did the phone companies do for the government? After September 11, 2001, they began turning over

tens of millions of phone call records to the NSA, whose goal was to build a database of every call made inside the United

States. The records that were turned over contained only phone numbers and calling information such as time, date, and

the duration of the calls; they omitted names, addresses, and other personal data. Qwest was approached by the NSA at the

same time as the others, but Joseph Nacchio, the company's CEO at the time (later involved in an insider trading scandal),

refused to cooperate. Nacchio based his decision on the fact that the NSA had not secured a warrant or submitted to other

legal processes in requesting the data.

The ethical questions raised by this case prompted no shortage of opinions from executives, politicians, pundits,

activists and legal experts. The phone companies cited a strong belief in protecting the privacy of their customers but

stated that the belief must co-exist with an obligation to cooperate with law enforcement and the government in matters

of national security. A release from AT&T summed up the company's position as follows: "If and when AT&T is asked to

help, we do so strictly within the law and under the most stringent conditions." Verizon made a similar statement but also

declined to comment on having a connection to a "highly classified" national security plan. The company also indicated

that press coverage of its data dealings contained factual errors.

After examining the issue, legal experts on both sides of it weighed in with their opinions on the actions taken by

the phone companies. Lawmakers began to seek hearings on the matter almost immediately. Customers directed their

anger and concern directly to customer support lines. Two lawyers in New Jersey filed a $5 billion suit against Verizon on

behalf of the public, accusing the company of violating privacy laws. Some legal scholars and privacy advocates agree that

the telecoms may have crossed the line. These experts cite the Electronic Communications Privacy Act of 1986, which permits businesses to

turn over calling data to the government only in extreme cases (for example, to protect individuals who are in immediate

danger of being harmed). Creating a database from the records does not meet the criteria. James X. Dempsey of the Center

for Democracy and Technology noted that the law allows for a minimum penalty of $1,000 per customer whose calling

data were submitted to the government. Based on the number of records contributed to the NSA database, the phone

companies faced civil penalties reaching hundreds of millions or possibly billions of dollars.

Dempsey shot down the idea that the phone companies did not break the law because the records they turned

over included only phone numbers and not identifying information. According to Dempsey, the law does not specify that

such personal information needs to be exchanged for the law to be broken. This was a popular position among critics of

the NSA program. They asserted that phone numbers could easily be cross-referenced to personal information, such as

names and addresses, using databases that are readily available to the public on the Internet.

A senior government official who spoke on condition of anonymity admitted that the NSA had access to most

domestic telephone calls even though, according to Kate Martin of the Center for National Security Studies, the NSA would

be prohibited by federal statutes from obtaining such data without judicial consent. The government official said that the

scope of the program was small in the sense that the database was used only to track the communications of individuals

who were known to have ties to terrorism.


The non-profit Electronic Frontier Foundation (EFF), a privacy watchdog, concurs with Martin's assessment. EFF

supports its argument by referencing the Pen Register Statute, which prohibits the government from gathering calling data

without a court order, and the Fourth Amendment, which covers privacy rights and unreasonable search and seizure.

However, the impact of such a defense in court was unclear. In response to the wiretapping controversy of five months

earlier, the Bush administration cited Article II of the Constitution as the derivation of its authority to employ wiretapping

as a terror-fighting tool. Furthermore, Congress virtually wrote the President a blank check by empowering him to "use all

necessary and appropriate force" in the war on terror. It was not surprising that Congress had as much to say about the

issue as anyone. Various senators weighed in both with opinions and calls for investigation. Opinions did not always fall

along party lines.

Senator Dick Durbin, a Democrat from Illinois, believed that actions of the telephone companies put the privacy of

American citizens at stake and that the companies should be compelled to appear before the Senate Judiciary Committee.

Durbin was backed up by the chairman of that committee, Senator Arlen Specter, a Republican from Pennsylvania. Senator

Specter intended to call upon executives from the participating companies to give their testimony about the NSA database

program. House Majority Leader John Boehner of Ohio and Senator Lindsey Graham of South Carolina also crossed party

lines in questioning the necessity of such a program. Senator Graham asked, "The idea of collecting millions or thousands

of phone numbers, how does that fit into following the enemy?"

Proponents of the program answer that question by saying that the purpose of the program is to discover patterns

in the calling records that indicate the presence of terrorist activity. Intelligence analysts and commercial data miners

refer to this as "link analysis," which is a technique for pulling meaningful patterns out of massive quantities of data.

Defenders of the program were harshly critical of media outlets who exposed it. Representative Peter Hoekstra, a

Republican from Michigan and chairman of the House Intelligence Committee, insisted that reporting on the NSA's

programs undermined national security. He stated, "Rather than allow our intelligence professionals to maintain a laser

focus on the terrorists, we are once again mired in a debate about what our intelligence community may or may not be

doing." President Bush echoed this sentiment by declaring that leaks of sensitive intelligence always hurt the government's

ability to counter terrorism.

Republican Senator Jeff Sessions of Alabama also disputed the need to investigate the program. Senator Sessions

answered the critics by emphasizing that the program did not involve actual surveillance of phone conversations and

therefore did not merit the scrutiny it was receiving. In his statements, the president also went out of his way to

distinguish between eavesdropping on telephone conversations and gathering call data.

In May 2006, senior intelligence officials revealed that the scope of the NSA's eavesdropping operations was

strongly influenced by Vice President Dick Cheney and his office. The Vice President and his key legal adviser, David S.

Addington, began pushing for surveillance of domestic phone calls and e-mails without warrants soon after September

11th. They believed that the Constitution gave the executive branch expansive powers that covered this type of domestic

spying, as well as certain interrogation tactics for dealing with suspected terrorists.

However, the NSA pushed back on advice from its own legal team. As a result, the NSA limited the eavesdropping to calls in

which at least one participant was outside the United States.


Still, conducting such operations appeared to conflict with the 1978 Foreign Intelligence Surveillance Act (FISA),

which required court authorization for any wiretapping done within the United States. Nancy Libin of the Center for

Democracy and Technology posits that listening in on any phone call without a warrant, regardless of whether it is

domestic or international, is illegal according to FISA. However, while FISA covers wiretapping, it does not clearly prohibit

the type of data mining that was done in the NSA database program.

In June 2006, a federal court in California released a document related to EFF's suit against AT&T that sheds light

on how the phone company may have provided its data to the NSA. In the document, J. Scott Marcus, who had worked as a

senior advisor for Internet technology to the Federal Communications Commission, evaluates evidence presented to EFF

from a former AT&T technician named Mark Klein. Klein claimed that AT&T reconfigured its network in San Francisco

and installed special computer systems in a secret room in order to divert and collect Internet traffic for use by the NSA.

Marcus concluded that Klein's description of a private backbone network partitioned from AT&T's main Internet

backbone was "not consistent with normal AT&T practice." Marcus further observed that at the time of the

reconfiguration, AT & T was in poor shape financially and would have been very unlikely to have made such expensive

infrastructure changes on its own.

In July 2006, Senator Specter announced that an agreement had been reached with the White House to give the

Foreign Intelligence Surveillance Court the authority to review the constitutionality of the NSA's surveillance programs.

The court would be empowered to determine whether wiretapping fell within the president's powers to fight the war on

terrorism. The agreement allowed for the court's proceedings and rulings to be conducted in secret. Even though judicial

oversight of the NSA's activities had been established, debate continued over the efficacy of the compromise. The American

Civil Liberties Union and the ranking democrat on the House Intelligence Committee, Representative Jane Harman of

California, accused Senator Specter of giving away too much, including a key Fourth Amendment protection.

The White House won several important points in the agreement, including the ability to appeal the court's

decisions; changing the language so that submitting a program to the court was actually optional for the administration;

and a guarantee that the agreement does not retract any of the president's existing constitutional authority. On the other

hand, the lead judge on the court was known to have significant misgivings about the NSA's actions even before the

program came to light. The bill to enact FISA's power over NSA wiretapping awaits Congressional approval.

CASE IV QUESTIONS

1. Do the increased surveillance power and capability of the U.S. government present an ethical dilemma? Explain

your answer.

2. Apply an ethical analysis to the issue of the U.S. government's use of telecommunications data to fight terrorism.

3. What are the ethical, social, and political issues raised by the U.S. government creating massive databases to

collect the calling data of millions of Americans?

4. What is the responsibility of a business such as AT&T or Verizon in this matter? What are the ethical, social, and

political issues raised by a business, such as a phone company, working with the government in this fashion?

5. State your opinion of the agreement reached by the White House and the Senate Judiciary Committee with regard

to the NSA wiretapping program. Is this an effective solution?


CASE V - Merrill Lynch Connects Past and Future Technology

Merrill Lynch is a worldwide leader in financial management and advisory services, employing 50,600 workers in

36 countries and territories. The company and its subsidiaries provide brokerage, investment banking, financing, wealth

management, advisory, asset management, insurance, lending, and other related products and services to private,

institutional and government clients with assets of $1.6 trillion. In 2005, Merrill Lynch posted a record $5.1 billion in net

earnings, a 15 percent increase over the previous year, on net revenues of $26 billion.

One of the most critical components of Merrill Lynch's operations is its information technology infrastructure.

Over the last five years, that IT infrastructure has played a major role in the company's gains. Like many financial

institutions, Merrill Lynch has had to modernize its technology infrastructure in order to remain competitive.

Merrill Lynch considered its IBM mainframe installation, which was one of the largest in the world, to be a

strategic asset. The mainframe ran in the neighborhood of 23,000 programs to process the firm's 80 million daily online

transactions for accessing customer accounts online or making stock trades.

In modernizing its technology, Merrill Lynch had to make choices regarding its legacy computers and applications.

Internet-based applications that gave customers access to their portfolios and tools to work with them were a key to

remaining competitive. But these applications did not use mainframe-based software. How could Merrill Lynch develop

such applications while leveraging the processing power and wealth of data in its mainframe?

The answer appeared to be Web services and a service-oriented architecture (SOA). Most corporations developing

a SOA typically use commercially available platforms such as those from BEA Systems and webMethods instead of creating

their own development platforms. They rely on the vendor's expertise and access to consultants familiar with integrating

mainframe and Web applications.

Project leader Jim Crew, then head of database infrastructure for Merrill Lynch, determined that on the surface,

purchasing an SOA platform was much easier than building one, and would have enabled the firm to deploy its Web

services relatively quickly. However, none of the SOA vendors that Crew researched offered products that met his requirements

for the project. They were offering SOA platforms that were geared toward distributed programming and recent

development tools such as Java and .NET.

Merrill Lynch's 1200 mainframe programmers did not have experience with these tools. Retraining this huge staff

did not make sense economically, nor did purchasing new workstations required for running the development software.

According to research from Gartner Group consultants, retraining Merrill Lynch's mainframe programmers could have

taken as much as a year and cost more than $80 million. To Crew, it was obvious that the firm should pursue a more

unconventional approach: construct a proprietary Web development platform from the ground up to extend the

capabilities of its legacy mainframe systems.

Merrill Lynch had initially tried to avoid these costs by copying the data stored in its mainframe installation into

Oracle, Sybase, or Microsoft SQL Server databases. In those formats, the data were compatible with server-based

applications. However, that technique was not entirely satisfactory. Copying large quantities of data often introduces

errors based on disk failures and space issues. Furthermore, some data can become obsolete as soon as they are copied.

For instance, a client who made several stock trades would have to wait until the next day to see an accurate balance in his


or her account. Crew noted that the firm was spending money on copying data that could quickly be out-of-date while the

accurate data were always residing on the mainframe.

Instead, Merrill Lynch created its own set of in-house proprietary tools that enable its mainframe legacy programs

and the functions they perform to be exposed as Web services. XML tags are used to describe the data for other

applications that are equipped to interpret XML. SOAP makes it possible for programs running under different operating

systems to communicate with each other. Together, the two standards made it possible for online applications to

communicate effectively with the mainframe without an additional layer of middleware.

Merrill Lynch's Web services toolset was called X4ML, which stood for XML for Modernizing Legacy. Crew

challenged his team to increase the firm's savings from Web services ten-fold to $20 million. Crew's team established five

criteria for the Web services project:

1. No new programming languages for the mainframe programmers to learn.

2. No new software tools for development that would require expensive workstations; tools would be accessible

from a Web browser.

3. A central storage directory for the Web services that would be developed so that programmers could easily reuse

and repackage them with each other.

4. Web services developed as a result of the project had to conform to the existing mainframe security standards as

well as Web security standards for encryption, authentication, and authorization.

5. Inclusion of budding Web services standards in the Web services architecture to ensure future availability.

The project team prohibited the new platform from requiring changes to program code on the mainframe or

hindering its operation in any respect. The team did not want to alter the mainframe in any way because of its track

record, its complexity, and the fact that there was likely no one on staff who knew the inner workings of its deep-rooted

code.

To maximize simplicity and speed, the team did not install a middleware server to translate requests made to it in

other languages, such as Java, into instructions that could be understood by the mainframe applications. Instead, the

translation software was written in Assembly Language (a programming language dating to the 1950s that is rarely used

today for business applications) and installed directly on the mainframe. This strategy reduced the number of things that

could go wrong during translations and promised better performance.

The lack of middleware meant that the system's users, such as Merrill Lynch financial advisers, could request

information directly from the mainframe from their desktops. For example, an adviser could use a Web browser to request a

list of all clients who owned shares of a certain stock, such as General Electric (GE). The request to perform that particular

operation, translated into XML, arrives at the mainframe, which runs the search.
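A request like the stock-holdings lookup can be pictured as a small XML document wrapped in a SOAP envelope. The sketch below is illustrative only: the operation name "GetClientsByHolding" and its fields are invented for this example, not Merrill Lynch's actual X4ML interface. It shows how XML tags describe the data and why any platform with an XML parser, mainframe translation layer included, can act on the request.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_holdings_request(symbol: str) -> bytes:
    # Build a SOAP envelope asking for all clients holding a given
    # stock. "GetClientsByHolding" and "symbol" are hypothetical
    # names chosen for this illustration.
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    request = ET.SubElement(body, "GetClientsByHolding")
    ET.SubElement(request, "symbol").text = symbol
    return ET.tostring(envelope, xml_declaration=True, encoding="utf-8")

def parse_symbol(request: bytes) -> str:
    # The receiving side needs only an XML parser to recover the
    # requested operation and its parameters -- no shared language
    # or operating system with the caller.
    root = ET.fromstring(request)
    return root.find(".//GetClientsByHolding/symbol").text

message = build_holdings_request("GE")
print(parse_symbol(message))  # prints "GE"
```

Because both sides agree only on the XML/SOAP text format, a browser-based application and a mainframe program can exchange such messages without any middleware layer in between, which is the point the case makes about X4ML.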

A Merrill Lynch mainframe programmer can access the X4ML development tool from a desktop Web browser.

Using X4ML, the programmer can create and name a new Web service, import the necessary application from the

mainframe, and then pick and choose which parts of the operation in the legacy application to include in the Web service.

Thus, a programmer is able to produce a Web service that pulls out all of the personal data for a client, or copy the less


sensitive data, such as name and address. Once a programmer creates a Web service, it is listed in a Universal Description,

Discovery, and Integration (UDDI) directory, where it can be accessed by other programmers. The X4ML development tool

also includes a testing capability, which enables programmers to correct errors before deploying a service, as well as

utilize trial-and-error to perfect combinations of applications for new services.

Merrill Lynch earmarked $1 billion over a three-year period to use X4ML to provide its 14,000 financial advisers

with a new suite of wealth management applications. For this initiative, the firm teamed with Thomson Financial and

Siebel Systems (now owned by Oracle), which offered financial data and research services and client management

expertise, respectively.

Merrill Lynch's investment in Web services saved the company $41 million in application development costs. The

company wrung even more value out of X4ML by selling it in December 2005 to Web services vendor SOA Software Inc. of

Los Angeles. As part of the deal, Crew and three other key members of the X4ML team shifted their employment to SOA

Software to continue enhancing the tool, which was renamed Service Oriented Legacy Architecture (SOLA). Merrill Lynch

had a long history of selling internally developed technology, and it viewed the sale of X4ML as a way of optimizing its

investment.

Chief Technology Architect Andrew Brown did not think that turning the technology over to another company

would hurt his firm's competitive advantage. He needed six months to convince management that selling to a software

vendor was the right move. After the fact, management appreciated the value of the sale and the space that it created in the

IT budget. At the time of the sale, X4ML was utilizing 600 Web services for 40 different core applications at Merrill Lynch

and processing 1.5 million transactions daily. The price of the X4ML sale to SOA was not disclosed, but SOA Software began

selling SOLA to customers in 2006 for $125,000. Purchasers of the tool were poised to gain unmatched scalability.

Meanwhile, the success of X4ML gave a second life to Merrill Lynch's mainframe programmers and their work.

CASE V QUESTIONS

1. Why did Merrill Lynch need to update its infrastructure?

2. What is the relationship of information technology to Merrill Lynch's business strategy? How was its Web services

initiative related to that strategy?

3. Evaluate Merrill Lynch's approach to Web services development. What are the advantages and disadvantages? Is it

a good solution? Explain your answer.

4. Do you think that Merrill Lynch's decision to sell off its successful technology initiatives was a good idea? Why or

why not?


Case VI - PANASONIC CREATES A SINGLE VERSION OF THE TRUTH FROM ITS DATA

Panasonic is one of the world's leading electronics manufacturers. It operates under the auspices of parent

company Matsushita Electric Industrial Co. Ltd., a conglomeration of over 600 firms that is based in Kadoma, Japan.

Collectively, the businesses of Matsushita manufacture 15,000 products for a global market and employ 330,000 people

internationally. In Europe alone, Panasonic has 15 sales subsidiaries, 14 manufacturing facilities, five research and

development centers, and seven administrative stations. Add in major presences around the world, including Asia and

North America, and it is clear that Panasonic's operations cover the globe.

With so many different sources of data, the company found itself with product and customer data that were often

inconsistent, duplicate, or incomplete. Different segments of the company used their own pools of data, which were

completely isolated from the data that the rest of the company was using. These conditions combined to be a drag on

operational efficiency and drained significant amounts of money from the corporation as a whole.

The types of data required to launch a new Panasonic product included photos, product specifications and

descriptions, manuals, pricing data, and point-of-sale marketing information. Employees adapted product information to

suit the needs of their country or region. It took considerable time and effort to sift through all the data and create a common

set of data for launching products globally, which allowed competitors to infiltrate markets that Panasonic did not

reach in its first phase of a launch.

To solve this problem, Panasonic decided to pursue a "single version of the truth." Daily activities required the

data to pass through legacy systems, fax machines, e-mail, phone calls, and regular mail. With so many people handling the

data in such a variety of formats, inefficiencies and inaccuracies were always a risk. Erasing these problems promised to

increase Panasonic's speed of bringing products to market.

Panasonic was enjoying a number of successes: a market leadership in plasma TVs, a successful transition of

company presidents, and a well-received marketing identity, "Panasonic: Ideas for Life." However, these positives were

overshadowed by the administrative costs incurred by such an immense organization. Thus, when Fumio Otsubo took over

as president in June 2006, he took over a company with an operating profit margin of only 5 percent. The board of

directors saddled him with the goal of increasing the margin to 10 percent by 2010.

In Panasonic's industry, consumers expect the price of new technology to decrease over time, which it had for

items that were strengths at Panasonic, such as plasma TVs and DVD players. Therefore, Otsubo could not expect to

increase the company's profit margin by increasing prices. Instead, he had to set his sights on reducing costs and

increasing sales.

Starting in Europe, Panasonic sought to replace its "pull" model of data dissemination with a "push" model.

Previously, employees in marketing and sales had to request data from numerous repositories. Under the push model, a

centralized data bank sends the information to all employees who need it at the same time, ensuring uniformity. The

recipients of the data include retail partners and e-commerce vendors, who receive complete product information at all

stages of a product rollout. Panasonic employees receive data on a more targeted basis. The benefits to Panasonic Europe

are more consistent product rollouts and product information. The latter ensures that customers do not become confused

while researching their purchases, which could motivate them to abandon Panasonic for a competitor. The technical force

behind Panasonic Europe's data management overhaul was master-data-management (MDM) software from IBM's


WebSphere line. The software enabled Panasonic Europe to consolidate its data, as well as systematize the business

processes related to the data. Overall, the company gained better control over its internal data.

Generally speaking, MDM software aims to merge disparate records into one authenticated master file. Many

companies have adopted MDM to fix discrepancies among the databases used by their various departments (e.g., the

accounting department having records of fewer customers than the number of customer IDs in the CRM database). MDM is

particularly useful for companies that have data integration issues as a result of mergers or acquisitions. Small and midsize

firms generally do not have the kinds of challenges that would require an MDM solution.
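The core merge idea behind MDM can be sketched in a few lines of Python. The example below is a toy illustration: the field names, the match-on-customer-ID rule, and the "most complete value wins" policy are simplifying assumptions made for this sketch, not the actual logic of any MDM product such as IBM's WebSphere line.

```python
# Toy master-data-management merge: reconcile duplicate customer
# records from separate departmental systems into one master record.
from collections import defaultdict

# Two departmental views of the same (hypothetical) customer, each
# incomplete in its own way -- the discrepancy MDM is meant to fix.
crm_records = [
    {"id": "C100", "name": "A. Sharma", "zip": "07030", "phone": None},
]
billing_records = [
    {"id": "C100", "name": "A. Sharma", "zip": None, "phone": "555-0100"},
]

def merge_master(*sources):
    # Group records by a shared customer ID (an assumed match key).
    by_id = defaultdict(list)
    for source in sources:
        for record in source:
            by_id[record["id"]].append(record)
    master = {}
    for customer_id, records in by_id.items():
        merged = {}
        for record in records:
            for field, value in record.items():
                if value is not None:  # "most complete value wins"
                    merged[field] = value
        # Enforce a simple formatting standard, e.g. 5-digit ZIP codes,
        # echoing the kind of rule an MDM service layer would apply.
        assert merged.get("zip") is None or len(merged["zip"]) == 5
        master[customer_id] = merged
    return master

master = merge_master(crm_records, billing_records)
print(master["C100"])
# {'id': 'C100', 'name': 'A. Sharma', 'zip': '07030', 'phone': '555-0100'}
```

A real deployment adds the surrounding steps the case describes, such as data cleansing, migration, and a service layer that keeps applications synchronized with the master file, but the reconciliation of duplicates into one authoritative record is the heart of it.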

Implementing MDM is a multi-step process that includes business process analysis, data assessment, data

cleansing, data consolidation and reconciliation, data migration, and development of a master data service layer. These

steps produce a system of records that stores the master file for all of the company's data. It is critical for the organization

to institute strict policies against computing activities that could compromise the authenticity of the data. Once the MDM

is in place, employees and applications access a consolidated view of the company's data. The implementation should

enforce standards for the formatting and storage of data, such as the number of fields in an address record or the number

of digits in a ZIP code. The service layer of the MDM preserves the view of the master data for applications and

synchronizes updates to the master file. In the case of Panasonic, the deployment of the IBM MDM software paid quick

dividends. Within a year and a half, Panasonic Europe was getting products to market faster and spending 50 percent less

time creating and maintaining product information. Time-to-market for a product was reduced from five to six months to

one to two months. According to internal calculations, Panasonic Europe improved its efficiency by a factor of 5 and

anticipates saving a million euros a year while increasing sales by 3.5 percent.

However, analyst Paul Jackson of Forrester Inc. cautioned against high expectations of boosted sales based on

data management improvements. He pointed to pricing, innovation, and strategic partnerships as better strategies for

long-term market share increases. When Panasonic North America had to reconcile its data, it did not have to confront the

challenge of multiple countries with multiple languages and currencies complicating product launches, as had its European

counterpart. However, the challenges of reorganizing workflow and consolidating product information were just as

daunting.

Panasonic faced this issue when it needed to provide a consolidated view of product information for retail giant

Wal-Mart. Panasonic started by identifying the information that Wal-Mart would need, which was data that adhered

closely to industry standards. Then, the electronics maker searched its legacy systems for the sources of the required data.

Finally, Panasonic worked with IBM to create an interface apparatus to collect the required data for a repository. Some

information, such as that produced by newer business processes, was not available in the legacy systems. Panasonic had to

add new interfaces in order to include this information and then build an application-integration layer to send the whole

package to Wal-Mart.
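The three steps above, identifying the required fields, pulling them from legacy systems (plus new interfaces for data the legacy systems lack), and staging everything in a repository for an integration layer to send onward, can be sketched as follows. The source names, SKUs, and field set are invented for illustration.

```python
# Standards-aligned fields the retail partner requires (hypothetical).
REQUIRED_FIELDS = {"sku", "description", "price"}

def legacy_erp_source():
    """Stand-in for a legacy system holding part of the product data."""
    yield {"sku": "TV-42", "description": "42-inch plasma TV"}

def new_process_source():
    """Stand-in for a new interface covering data the legacy systems lack."""
    yield {"sku": "TV-42", "price": 999.00}

def build_repository(*sources):
    """Merge partial records from every source into one repository, keyed by SKU."""
    repo = {}
    for source in sources:
        for partial in source():
            repo.setdefault(partial["sku"], {}).update(partial)
    return repo

def export_complete(repo):
    """Application-integration layer: forward only records that
    carry every required field."""
    return {sku: rec for sku, rec in repo.items()
            if REQUIRED_FIELDS.issubset(rec)}
```

A record missing any required field is held back rather than exported, which mirrors the case's emphasis on adhering closely to industry standards before sending data to a partner.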

Each of the company's multiple facilities made its own contributions to new products. More importantly, the

facilities had their own cultures and information infrastructures. They also valued their autonomy and the flexibility it

furnished. Different Panasonic entities might be unwilling to give up control over information due to the perceived loss of

power. The company required clear data management rules to prevent too many hands from manipulating the data so

that the master would remain pristine. Panasonic North America Information Technology vice president Bob Schwartz


hoped that the fierce competition threatening the standing of his company would help convince the traditionalists to

support data-sharing. However, he expected that convincing the enterprise of this would be an uphill battle.

Besides all the units of Panasonic North America, there were manufacturing partners to bring aboard. Without

them, the system could not fulfill its complete potential. This had been a serious challenge for Panasonic Europe, where

most partners were based in Asia and were content with their manual processes for managing product data. Paul Bolton,

senior manager for e-commerce and customer relationship management solutions, deployed the product information

database at Panasonic first. Once it proved effective, he then presented its capabilities to the other manufacturers and won

them over. Schwartz therefore had a strategy and a roadmap to clear that hurdle. What remained was perhaps the biggest

hurdle: convincing the corporate office in Japan that their data management strategy deserved global adoption. Only then

would the application of MDM principles achieve its full benefit. In the meantime, Schwartz reached out to Panasonic's

vendors in the U.S. and gained additional profits from the company's improved data. Panasonic was able to use the data to

reduce the amount of time that vendors such as Best Buy and Circuit City kept high-cost inventory, such as large-model

TVs, in stock from 35 to 7 days, thereby increasing their profit margins.

CASE VI QUESTIONS:

1. Evaluate Panasonic’s business strategy using the competitive forces and value chain models.

2. How did Panasonic’s information management problems affect its business performance and ability to

execute its strategy? What management, organization and technology factors were responsible for those

problems?

3. How did master data management address these problems? How effective was this solution?

4. What challenges did Panasonic face in implementing this solution?
