
Waterschoot, who also acts as database administrator in the company’s radar systems unit in Syracuse, N.Y., immediately alerted her boss and the unit’s legal counsel. But there was one other department that had to be notified, and fast. Waterschoot knew that if the document was still on the company’s Microsoft Mail server after 5 p.m., it would be backed up overnight and archived on tape, making it potentially recoverable by other employees. Furthermore, the server is shared by the prime contractor, Lockheed Martin, and other subcontractors, exposing the confidential data to even more potential competitors. Waterschoot and the legal counsel worked with their colleagues in IT to remove the document from the server and copy it to a diskette, which they returned to the competitor. “I just tried to do the right thing,” she says.

As E-business moves more and more business processes and transactions online, Waterschoot’s experience is a telling example of how information technology, and the people who manage it, are at the forefront of decisions with ethical implications. The debate over ethical standards in business isn’t new. What is new, or at least more apparent than ever, is IT’s central role in some of the most important business-ethics issues of the day: privacy, the ownership of personal data, and the obligations created by extended E-business partnerships. How have these controversies affected IT managers and others involved with technology? What ethical issues, if any, are business executives grappling with in connection with cutting-edge IT? And where do IT people go for guidance on ethically ambiguous situations? Far from self-evident, the answers may be critical to the development of the trust and integrity needed to succeed at E-commerce and online business.

Changes in technology and business processes can outpace companies’ ability to consider their ethical implications or to train employees to deal with them. Few companies have formal programs like Lockheed Martin’s, which requires its 140,000 employees to complete one hour of ethics training every year. “It’s traditionally been seen as an add-on: ‘Ethics is nice, but let’s get back to work,’” says David Gebler, a principal at the Working Values Group in Boston, a consulting firm that’s developed ethics training programs for Chase Manhattan Bank, Procter & Gamble, Prudential, Raytheon, and other companies. “You have to bring ethics into your business context. And E-business raises ethical issues that may have existed before, but not in such stark reality.”


One IT manager considers the quality of his work to have ethical implications. “The impact of the decisions I make on our company is scary,” says Frank Gillman, director of technology at Allen Matkins Leck Gamble & Mallory LLP, a large law firm in Los Angeles. Gillman says his decisions on anything from an outsourcing partner to a WAN vendor could be critical to the firm’s ability to operate and compete effectively. And that’s an ethical burden in itself. “IT people need professional training on more than just how to work on computers,” Gillman says. “We don’t do enough in that area. I wish we could do more.”

Many IT and business managers seem to take their cues about ethical conduct from the companies they work for. In an InformationWeek Research survey of 250 IT and business professionals, only 54% say they have a personal code for evaluating the ethical and moral implications of business decisions. Of those who do, 67% say it’s based on their company’s code of conduct; only personal experience polled higher (70%). An eye-opening 93% of all respondents say they agree with all aspects of their company’s ethical code, and 96% say their company adheres to its code.

“I’m pretty shielded from those [ethical] questions by our human resources department,” says James Underwood, manager of IS at Canon Information Systems Inc. He’s alluding to collecting Internet firewall log data that reveals which Web sites employees visit. The HR department “is responsible for what’s ethical and legal as far as what they do with that information, and I’m happy to let them do that,” Underwood says. “The question of whether they use it in an ethical manner is up to them.”
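The kind of report Underwood alludes to is usually built by aggregating firewall or proxy logs into per-employee visit counts. The Python sketch below shows one way that summary step might work; the log format, the field positions, and the mapping of IP addresses to user names are illustrative assumptions, not a description of Canon’s actual systems.

# Minimal sketch: summarize which sites each employee visited, from
# hypothetical space-delimited log lines of the form
# "<timestamp> <client_ip> <url>".
from collections import Counter, defaultdict
from urllib.parse import urlsplit

def summarize_web_usage(log_lines, ip_to_user):
    """Return {user: Counter({site: visit_count})}."""
    visits = defaultdict(Counter)
    for line in log_lines:
        fields = line.split()
        if len(fields) < 3:
            continue  # skip malformed lines
        _timestamp, client_ip, url = fields[:3]
        user = ip_to_user.get(client_ip, "unknown")
        site = urlsplit(url).hostname or url
        visits[user][site] += 1
    return visits

if __name__ == "__main__":
    sample_logs = [
        "2001-05-14T09:02:11 10.0.0.7 http://www.ebay.com/listing/123",
        "2001-05-14T09:05:42 10.0.0.7 http://www.ebay.com/listing/456",
        "2001-05-14T09:06:03 10.0.0.9 http://www.informationweek.com/",
    ]
    report = summarize_web_usage(sample_logs,
                                 {"10.0.0.7": "jdoe", "10.0.0.9": "asmith"})
    for user, sites in report.items():
        for site, count in sites.most_common():
            print(f"{user}\t{site}\t{count}")

The technical step is trivial; the ethical question the article raises is who gets to see the resulting report, which is exactly the line Underwood draws between IT and HR.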

A pragmatic view, to be sure, but is it a sound one? “IT people are the ones responsible for configuring technologies and systems that have ethical implications,” says R. Edward Freeman, business administration professor and director of the Olsson Center for Applied Ethics at the University of Virginia’s Darden School of Business, and co-editor of the Blackwell Encyclopedic Dictionary of Business Ethics (Blackwell, 1998). “They have to be more than the mechanics who keep it running. They need to understand that ethics is at the center of what they do.”

At a law firm where Frank Gillman formerly worked, one IT employee clearly didn’t understand that. The worker sold company-owned disk drives to support his cocaine habit. Theft and drug use are bad enough, but that’s not what horrified Gillman. To cover his tracks, the employee seriously compromised the firm’s IT integrity by removing the system’s data-mirroring capability, giving the system the appearance of still having the storage capacity provided by the missing drives. “Can you imagine what would have happened if the system crashed?” Gillman says.

In large part to address potential IT-related liabilities, both inside and outside a company, a growing number of businesses have high-level ethics executives or chief privacy officers to enforce company standards. “Our goal is to raise awareness, to be proactive and preventive rather than punitive,” says Tracy Carter Dougherty, Lockheed Martin’s director of ethics communication and training, part of a corporate-level office of ethics and business conduct that reports to the chief operating officer and CEO. Lockheed Martin recently disciplined an employee who E-mailed a chain letter to friends in the company and brought down a server that affected an entire business area. “When you hit that ‘Send’ key, there’s no getting it back,” says Dougherty. “You always have to be very mindful of the risks, and we depend on IT to tell us where the new risks are likely to be.”

One new area of risk has to do with the use of handheld devices such as cell phones or personal digital assistants while driving, which has been cited as a factor in a growing number of traffic accidents (see story, p. 46). Mike Vleisides, senior manager of application development at Aventis Pharmaceuticals Inc. in Parsippany, N.J., spearheaded a “pull off the highway” policy for the company’s 3,500 field sales reps seeking to download data while in their vehicles, which Vleisides says accounts for about 90% of their working hours. “Our company would rather have our sales people pull over and spend 30 seconds using the devices safely than risk accident, injury, or worse by using them while attempting to drive,” he says.

Another area of risk, perhaps the riskiest, has to do with collecting personal data. In the InformationWeek Research survey, 80% of respondents say their companies collect customer data. Yet only 60% say their companies have a publicly displayed policy on the privacy of the customer data they collect. Just 6% of those surveyed say their companies sell data to third parties, though the percentage jumps to 11% among companies with revenue of $1 billion or more. Health care companies, which collect what may be the most sensitive customer data, have the highest percentage (9%) of companies selling data among the five industries surveyed. Overall, 95% of respondents say their companies always adhere to their privacy policies, and virtually everyone says their customers know when specific types of data are being collected.

“We will treat customer information in a way that our customers expect it to be treated,” says Robert Beason, outgoing CIO of the Southern Co., a $23 billion gas and utility holding company in Atlanta. Beason says the company has turned down third-party offers to buy some of the data within its 7-terabyte data warehouse of information on 4 million customers. “It’s for our internal use, and we’re not going to sell information in the open marketplace without written approval from customers,” he says. “There has to be a business ethic that goes along with that.”

But what role should IT people play in determining that ethic? Like scientists, IT professionals are often accused of being more interested in results than ramifications. “When your job is building the best-performing database you can, you don’t always think about the ethical implications of how that data will be used,” says Ed Altman, a former CIO at Metro-Goldwyn-Mayer Studios Inc. and now director of business development at integrator and staffing firm Metro Information Services. “The very people helping to create the [data privacy] problem don’t realize how bad it is.”

Yes they do, says Kathy Komer, president-elect of the International DB2 User Group, a worldwide organization of users of IBM’s enterprise relational database. Or at least they’re aware of the controversy. “I don’t see anyone who takes managing databases lightly,” says Komer, a database administrator for a large Northeastern company. But the user group has no written ethical policy concerning data collection, nor does it advise companies on ethical considerations in dealing with personal data, Komer says.

For consumers, the line between well-targeted marketing and privacy invasion has always been a fine one. Some argue the consumer privacy issue, when compared with the actual capabilities of online marketing technology today, is overblown. It’s rarely possible and almost never cost-effective to segment customer data to the individual level. In the InformationWeek survey, 65% of respondents say they segment customer data by product line, 46% by region, 41% by frequency of purchases, and 33% by profitability. “All this fuss about privacy policies is the political correctness of the 21st century,” says admitted contrarian Peter Fader, professor of marketing at the University of Pennsylvania’s Wharton School of Business. “Let the market dictate what’s good and bad. As technology advances, consumers also get smarter and more skeptical.”


Most everyone in E-business agrees that questionable ethical moves that compromise customer privacy for short-term marketing gain are bad for business in the long run. “Online business is entering a more mature phase, and the issue of who the customer trusts becomes more of a competitive differentiator,” says IBM chief privacy officer Harriet Pearson.

Lands’ End Inc. believes that its renowned customer loyalty depends heavily on trust, and the apparel retailer has one of the industry’s strictest online privacy policies. The company doesn’t send E-mail promotions to its customers except by request and never sells or trades online customer data.

Lands’ End’s security audits include not only hacking tests on its firewalls, but also ethical tests of IT and business employees in situations where data security could be compromised. “We test to make sure they make data available only to those who should see it,” says Linda Severson, director of business systems. “You have to have tests that continually challenge your security and privacy processes. Ethics has to become more of a way of life, not a one-time policy posting.”
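Severson’s “only to those who should see it” test is, at bottom, an access-control audit. The sketch below shows the kind of assertion such an audit might run; the roles, fields, and policy table are hypothetical, chosen only to illustrate the check, and are not Lands’ End’s actual systems.

# Minimal sketch of an access-control audit against a hypothetical policy table.
ACCESS_POLICY = {
    "customer_service": {"name", "order_history"},
    "marketing": {"aggregate_stats"},
    "it_admin": set(),  # runs the systems but should not read customer data
}

def can_view(role, field):
    """A role may see a field only if the policy explicitly grants it."""
    return field in ACCESS_POLICY.get(role, set())

def audit_access_policy():
    """Assertions an auditor might run every time the policy table changes."""
    assert can_view("customer_service", "order_history")
    assert not can_view("marketing", "order_history")  # marketers see only aggregates
    assert not can_view("it_admin", "name")
    print("access-policy audit passed")

if __name__ == "__main__":
    audit_access_policy()

Rerunning such assertions on every change is one way to make privacy “a way of life” rather than the one-time policy posting Severson warns against.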

Trust between workers and employers is another key issue putting IT managers in the midst of ethical decisions. Most companies forbid employees from using company computers to access Web sites with material that is pornographic, violent, or hate-related. Dow Chemical Corp. fired 50 employees last year at a Freeport, Texas, facility for violating that rule. In the InformationWeek survey, more than half of the companies monitor their employees’ use of the Web (62%) and E-mail (54%). Among companies with revenue of more than $1 billion, those figures jump to 77% and 70%. And consistent with respondents’ overwhelming agreement with their corporate policies, most IT people believe such monitoring is ethical.

“Speaking for myself, I think employees’ Web-usage logs should be available,” says Dave Austin, human resources IS specialist at manufacturer Leggett & Platt Inc. in Carthage, Mo. “If an employee is doing a poor job and the manager can see that person has visited eBay 85 times in the past week, the manager should be able to say that’s not acceptable and must stop. And that should be explained to every employee up front.” But Austin also says reasonable personal use of the Net should be allowed, given that many employees devote long hours and weekend time to their jobs.

Another question is, who should this data be available to? Canon Information Systems’ Underwood says he’s been approached by department managers seeking a peek at the Web-surfing habits of certain employees. “I said, ‘We have the information, but you need to go to HR to get it.’”

When accounting firm Clifton Gunderson LLP in Peoria, Ill., started generating monthly reports on Web site usage for its HR department, chief technology officer Matthew Camden, who studied business ethics while in graduate school at Loyola University, says he realized the technology manager producing the reports might be tempted to warn people whose names appeared on the list. “I said, ‘You may see people on the list who sit next to you, but you can’t do anything about it,’” Camden recalls. IT people need help in determining the proper ethical responses to ambiguous situations, he says, and IT and business managers need to provide that guidance. “It’s not enough to have a rule; you have to do what you can to make people follow it.”

Because of IT professionals’ access to sensitive data, they must often do more than ensure compliance with company policy. At the Allen Matkins law firm, an IT employee cleaning up logs in the firm’s contact management application noticed something amiss. In a space for comments and notes usually left blank, one attorney, unaware it was a shared application, had keyed in two credit card numbers, a savings account personal ID number, and the access code for his home security system. Director of technology Gillman told the employee to notify the attorney immediately.
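Incidents like this are often caught by periodically scanning shared free-text fields for card-number-like strings. The sketch below combines a simple pattern match with the standard Luhn checksum; the sample text is invented, and in practice a person would still confirm each hit, as the Allen Matkins employee did, before anyone is notified.

# Minimal sketch: flag card-number-like strings in a free-text note field.
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(digits):
    """Standard Luhn checksum used by payment card numbers."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def flag_possible_card_numbers(note):
    """Return substrings of a note that look like valid card numbers."""
    hits = []
    for match in CARD_PATTERN.finditer(note):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 16 and luhn_valid(digits):
            hits.append(match.group())
    return hits

if __name__ == "__main__":
    print(flag_possible_card_numbers(
        "Reminder: card 4111 1111 1111 1111, alarm code 0420"))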

Call it CYA ethics. “Whenever there’s a leak of information, one of the fingers of suspicion will be pointed at IT,” Gillman says. “The more you act like you can be trusted, the less you’ll be targeted.” Guarding data privacy takes on even more significance when supply-chain partners share information online outside company walls, as Lockheed Martin’s Waterschoot knows. In industries such as automotive, high-tech manufacturing, aerospace and defense, and many others, collaborators on one project can often be competitors on another. “You need to make darn sure that the information being exchanged online is the right information,” says ethics director Dougherty.

More suppliers are designing key components for competitors and sharing those designs online, says Michael Bauer, a partner in CSC Consulting’s manufacturing practice and co-author of E-Supply Chain (Berrett-Koehler Publishers, 2000). “I hear a lot of emphasis on shortening cycle times and finding tools for security, but I haven’t seen a lot of awareness or programs about the responsibilities of partners in an electronic supply chain,” Bauer says. “The problem is human beings, not because they’re malicious, but because they can be careless or ignorant about ethical implications.”

That may be the key: Most IT managers and executives agree there needs to be more training in ethics, especially now that IT has taken a central role in doing business. Indeed, thinking of business and ethics, or IT and ethics, as opposing forces may be a false dichotomy. “The whole idea of positioning ethics and profits as a trade-off is like asking me if I want a heart or a lung,” says the University of Virginia’s Freeman. “Well, I’m partial to both of them.”

The Upshot

IT is at the center of important ethical issues:

Most companies today monitor E-mail and Web use, but who should have access to that data?

How involved should IT be in setting policies on collecting and selling personal data?

Many managers take ethical cues from their companies’ policies, but should companies provide training in ethics?

Essay topics

Reflections on the given topic.

The ethics of data

Reprinted from: Eileen Colkin, InformationWeek, May 14, 2001, Issue 837.

As the debate over data privacy grows, many IT professionals who manage the information find themselves in the middle of the controversy.

Steve Hoberman works for one of the largest process-manufacturing companies on the East Coast. Hoberman is a data architect in the company’s IS division. He generates reports for the company’s marketing and sales executives based on information in its data warehouse. He considers himself an expert in database design and has a master’s degree in IT from Carnegie Mellon University.

You might think Hoberman would know all there is to know about his company’s data policies, or at least he’d be generally aware of what the company does with its data. But ask him whether his company sells data to third parties. “No,” he answers right off, but then, thinking about it, “I guess I don’t know for sure.” Does he care? “Not really,” he says. “I’m an enabler. I solve people’s problems.” Hoberman says he translates management’s need to interpret data into a tool executives can use to help the business. “Beyond that point,” he says, “I don’t care.”

As an IT professional, Hoberman isn’t unusual in his lack of awareness of his company’s data policies, nor is he unusual in his insistence that he doesn’t need to know those policies. As the debate over data privacy grows louder and more acrimonious, IT professionals increasingly find themselves in the middle of the controversy, whether they know it or not—and a disturbing number don’t know it. IT is what makes the collection, manipulation, and dissemination of data possible. And while data has been collected and sold since the invention of the abacus, the furious pace at which that industry has grown over the past 10 years can be laid directly at the feet of the technology industry.

What do professionals in the “data industry”—data marketers and vendors of database and data-mining technology—think of the ethical implications, if any, of what they do? What about IT professionals who specialize in data storage and management? Many data marketers equate their ethical obligations with following the letter of the law. As for data technology vendors, by and large they feel removed from the issue. Many IT professionals, even some directly involved in creating marketing databases, profess an ignorance, willed or not, of any implications of what they do beyond their immediate business obligations.

The data industry has come under harsh review. There is a raft of federal and local laws under consideration to control the collection, sale, and use of data. American companies have yet to match the tougher privacy regulations already in place in Europe, while personal and class-action litigation against businesses over data privacy issues is increasing. Privacy advocates, educators, and industry observers say it’s time for the data industry, and the IT community in general, to embrace the issue and drop the duck-and-cover mentality that pervades the controversy. “This whole area is a minefield,” says Brian Staff, marketing VP at database supplier Informix Corp., which was recently acquired by IBM.

Earlier this year, N2H2 Inc. learned about the politics of the privacy debate the hard way. The Seattle company, which provides 40% of the Internet filtering software used in U.S. schools, decided last year to enter a new business: the sale of aggregated data. In a partnership with marketing powerhouse Roper Starch Worldwide, N2H2 began marketing the data, called Class Clicks, that its filtering tools collected on the Web site usage trends of elementary and high school students. The data contained no names or personal information and complied with the new federal Children’s Online Privacy Protection Act.

Yet N2H2’s new line of business brought such loud howls of protest from online privacy advocates that the company scrapped the effort in February. “We went above and beyond the call to make sure there was no way to trace anything back to a school or an individual,” says Ken Collins, N2H2’s director of analytic services. “It was all aggregated data, but it still triggered a bunch of flags in public perception. It was a confusing and chaotic mess.”
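The “all aggregated data” defense Collins offers usually comes down to two steps: report only counts grouped by coarse attributes, and suppress any group small enough to point back to a school or an individual. The sketch below illustrates that general pattern; the field names and the suppression threshold are assumptions for illustration, not a description of how Class Clicks was actually built.

# Minimal sketch of aggregate-and-suppress reporting.
from collections import Counter

MIN_GROUP_SIZE = 50  # assumed suppression threshold

def aggregate_site_visits(records):
    """records: iterable of dicts like {"grade_band": "9-12", "site": "example.com"}.
    Returns {(grade_band, site): count}, dropping any cell below the threshold
    so that small, potentially identifying groups never appear in the report."""
    counts = Counter((r["grade_band"], r["site"]) for r in records)
    return {cell: n for cell, n in counts.items() if n >= MIN_GROUP_SIZE}

if __name__ == "__main__":
    sample = ([{"grade_band": "9-12", "site": "example.com"}] * 120 +
              [{"grade_band": "K-5", "site": "example.org"}] * 3)
    print(aggregate_site_visits(sample))  # the three-student cell is suppressed

As N2H2 discovered, though, technically careful aggregation does not by itself settle the public-perception question.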

There’s no doubt that data marketers feel under scrutiny. “If we don’t get it right, and we allow abuses to happen, the whole industry will pay the price for years,” says Paul Gustafson, VP of business development and product management for IQCommerce Corp., which builds the IT behind online promotions for companies such as Unilever and Johnson & Johnson. “There’s an awful lot at stake, and we’re a long way away from having all the answers.”

Many companies in the consumer information business describe ethical business practices primarily in terms of complying with existing laws, such as the 31-year-old Fair Credit Reporting Act or the recently ratified Gramm-Leach-Bliley Act that regulates consumer financial data. “We try to balance how we use information while complying with laws and regulations and doing things in an ethical manner,” says Rich Crutchfield, executive VP at Equifax Corp. in Atlanta, the world’s largest provider of credit data.

“There’s a tremendous amount of federal, state, and contract law out there dealing with privacy,” says David Lee, an executive VP at ChoicePoint Inc., which compiles public-record information for insurance carriers, the FBI, and the U.S. Marshals, among other customers. “We view ourselves almost as a regulated industry.” At ChoicePoint, chief privacy officer Michael de Janes is also the company’s general counsel.

Clearly, business and government leaders aren’t satisfied with how data privacy has been handled so far. The growth of a management position known as the chief privacy officer is an attempt by companies—across many industries, not just those in the data business—to indemnify themselves against potential liability over data issues, both internal and external.

As well they should. Along with the privacy laws already on the books, there are 50 bills pending in Congress concerning privacy and more in state and local governments.


Data marketers are keenly aware of the growing momentum behind those legislative efforts, and what it might mean for their industry. “We want to avoid heavy-handed regulation with unintended consequences,” says John Ford, chief privacy officer at Equifax. “Why use a vise grip when a pair of tweezers will do?”

One of the most controversial of the new privacy laws is the Health Insurance Portability and Accountability Act. Former President Clinton signed the bill into law in 1996, but Congress never devised specific rules governing medical data, so that onerous task was deferred to the Department of Health and Human Services. The department released 1,500 pages of rules in December (available at www.hhs.gov/ocr/regtext.html), Congress ratified them last month, and companies have two years to comply.

Patients are promised the ability to access their medical records; previously, that was allowed in only 28 states. They can also correct inaccuracies in their medical files. Health care entities covered under HIPAA must receive written consent from patients to use their medical data. Health care companies must also hire a privacy officer and train employees in how to handle the sensitive data. Those who misuse data face up to 10 years in prison and $250,000 in fines.

HIPAA won’t affect some data collection methods. Medical Marketing Service in Wood Dale, Ill., and A. Caldwell List Co. in Atlanta aren’t using official records or under-the-table schemes to gather the information they sell to data marketers. They get it voluntarily from ailment sufferers who respond to direct-mail or online questionnaires that promise coupons, discounts, and samples in exchange for a bevy of personal data. “We not only get ailment information, but also data on college degrees, income, age, hobbies, address and phone number, if they have an American Express or Visa card, and whether they plan to travel or buy [specific] things in the next six months,” says Tori Weathersby, senior sales executive at A. Caldwell. The millions who respond are informed of the marketing possibilities on the questionnaires themselves.

HIPAA also doesn’t cover dot-coms, so when an individual fills out a health care assessment on a medical Web site, that information is fair game for any marketing efforts. “There’s a false sense of security that consumers and patients would have at an E-health dot-com,” says Paul Tang, chief medical information officer at the Palo Alto Medical Foundation, a health care provider and medical research group in northern California. Tang is also chairman of the
