R:Base Was Pure Dopeness

Posted in Uncategorized on October 20, 2014 – 3:07 pm
Comments (0)

When DOS 2.0 was brand-new in 1983, I was using Lotus 1-2-3 to maintain a mailing list. It was easier than using EDLIN and the DOS Sort utility, but it still seemed like a clumsy, error-prone approach.

That’s when I decided to get acquainted with my first database package — Microrim’s R:base. My reaction was love at first prompt.

That early experience with R:base gave me some convictions about the proper way to configure a database. Those convictions persist to this day, although I was often forced to use other tools because of a client’s preference.

When I recently had the chance to get reacquainted with R:base 4.0 under OS/2, I was happy to find that an old friend had kept all its good points while adding some useful new features.

I wasn’t aware of my R:base-inspired principles until I ran across other packages that didn’t uphold them. That became quite apparent when I finally had to learn dBASE, which was required by a consulting project that specified the then-new Clipper compiler for low-cost distribution of dBASE applications.

Not a true DBMS

I was dismayed to discover that dBASE, though billed as a database manager, was really just a record-oriented, BASIC-like interpreted language that worked with separate files of data, indices, formats and the like. That’s what it was then and that’s what it still is today.

I was spoiled by R:base, which was ahead of its time in treating a database as a conceptual whole. A database in R:base was — and still is — represented by exactly three files, no matter how many relational tables the database might contain. By contrast, trying to create a dBASE application was a pick-up-sticks exercise of looking through the source code to identify every required file.

Advocates of the dBASE approach might argue that certain tables, such as supplier information, relate to several different applications — so it makes sense to keep such a table in a separate file that any application can use. But this method ignores the first requirement of a database: to carefully collect and protect data.

When a data table is left to fend for itself in an unguarded file system, who knows what can be done to it by some other well-meaning program? With data in separate files, records are only as safe as the least carefully written application that uses them.

I was spoiled by R:base, which makes data-validation rules an integral part of table definition — not merely a hoped-for feature of every application.

Applications that use the same data should be treated as part of the database, rather than the data being treated as a mere accessory.

But as I said, I didn’t appreciate these niceties of good database practice until I encountered tools that didn’t have them. What made me such a fan of R:base in the first place was the product’s elegant approach to supporting new users who had yet to learn the SQL-like command language.

All you really had to learn was one word, “Prompt.” This put you into a top-level dialog box, which provided a tree of interactive menus, each defined by which database was open and what tables and columns it contained.

At each point, you saw a template of the command you were building, which became more complete as you responded to the prompts. Finally, you got the chance to select the menu option “Execute” and see the results.

It was the most painless process I’ve ever seen for learning this kind of thing: novices could quickly learn the simple cases, advanced users could easily get help with new features, and anyone could use the prompted mode until they were comfortable typing queries on their own.

Today’s OS/2 R:base is multithreaded, fully SQL-compliant and has all sorts of other refinements. To me, though, it’s still that great product that taught me how to do a database 10 years ago, and it still does it better than most.

IBM Still Powers The IT Industry

Posted in Uncategorized on October 5, 2014 – 2:54 pm
Comments (0)

IBM is on a roll, at least as far as PCs are concerned. Granted, the company has just announced the Mother of all Corporate Losses, and John Akers has joined the ranks of presidents without portfolio. Nevertheless, there’s one part of the company that appears to have gotten its act together and is competing effectively — the Personal Computer Co.

The descent into the netherworld of non-competitiveness took a while: After all, IBM has always been a force, and what might have killed other companies just weakened Big Blue.

Yet from 1988 to the end of the third quarter of 1992, it was pretty darn hard to justify buying a Blue box, and fewer and fewer customers did. Prices ranged from merely exorbitant (50 percent more for the same configuration) to outrageous (250 percent more for similar systems).

Worse, the Entry Systems Division seemed to be run according to a single principle: “Ignore the competition and do things the same old way.”

Change was inevitable

Well, after seeing not only its market share drop precipitously, but its actual unit volumes as well, IBM started to change. The usual management changes occurred, of course, but more important was the development of a fundamentally new approach to the PC business, one that could make IBM competitive with Dell, Gateway and Compaq.

Compaq’s reinvention was a major motivation. After watching its nearest competitor make a quantum leap in its business model, IBM saw the writing on the wall: Change or die.

The most obvious result of this change is in the product line. The new products are actually competitive in features and, for the first time, price. It’s clear that IBM hasn’t achieved this by flushing quality, either.

Two product sets that stand out are the ThinkPad notebooks and the PS/ValuePoints. As a diehard notebook user, I find the ThinkPad 700C contains features that measure up well both in the labs and in the field. This product was clearly driven by competition from the world-class machines made by Toshiba, Compaq and NEC.

On the desktop, the ValuePoint line and its companion, the PS/1, are nothing to dismiss. In fact, finding a 486-based PS/1 was no small chore around Christmas. The ValuePoint hits the mark by combining the features and price necessary to sit on corporate America’s desks.

IBM generally had a good feature set in the past, but price was always its Achilles’ heel. So how did the price come down so much? This involves a peek behind the scenes of the new Personal Computer Co.

Although many observers tend to focus solely on product and technology engineering, a large part of IBM’s resurgence to a position of PC strength is based on the business and financial engineering going on there. At the root of this is the separate nature of the Personal Computer Co. When the decision was made to allow the business units to act more independently, this was the level playing field the PC folks needed. You wouldn’t believe the overheads the PC company was saddled with just for being part of IBM.

Bob Corrigan once made a great comment: “I told them I wouldn’t pay any overhead for the corporate jets. Besides, I never get to use them anyway.” It isn’t just corporate jets, either. While IBM doesn’t publish information on the extent of overheads or other product-cost issues, I’d guess some systems carried $500 more in non-product cost and overhead allocations than comparable machines from its competitors.

This responsiveness to the market is also paying big dividends for customers. One clear manifestation is the flexible merchandising programs the company offers on its product lines. You, the customer, can decide whether you want on-site, depot, or time-and-materials repairs. Software bundles are also broad and not limited to one operating system or set of applications. In fact, IBM is going to be doing some really exciting things with the non-hardware elements of buying a PC that should raise the bar for the Dells, Compaqs and Gateways.

This reincarnation is good for customers in another way. IBM, rather than being the easily beatable benchmark it’s been, is tough now. This means that other suppliers who want to compete will have to build even better systems and offer ever more attractive support and services.

Raising the general level of quality and performance for the entire industry is certainly a welcome sight, and one key benefit of a strong IBM PC offering.

Chipmakers Have Come A Long Way, Baby

Posted in Uncategorized on September 28, 2014 – 11:06 am
Comments (0)

The U.S. semiconductor industry has staged a startling comeback from its dark days of the mid-1980s.

It was in 1986 that American semiconductor companies, which had once enjoyed a 70 percent share of worldwide sales, watched their market share slip below 40 percent and their number of dynamic RAM manufacturers dwindle from 11 to two. Japan quickly capitalized on the erosion of the U.S. semiconductor industry and vaulted to the top.

“People seriously thought there wasn’t going to be a semiconductor industry in the United States by the end of [the 1980s],” said Dan Hutcheson, president of VLSI Research Inc., a San Jose, Calif., market-research firm. “Now, the semiconductor industry is one of the fastest-growing sectors of the U.S. economy.”

In 1992, American semiconductor companies once again surpassed Japan in worldwide sales by a margin of 43.8 percent to 43.1 percent, according to figures from VLSI Research.

How did the United States turn it around? Observers and analysts cite a series of events — both political and otherwise — that helped revive America’s semiconductor industry, including a U.S.-Japan trade agreement; the formation of the Semiconductor Manufacturing Technology Institute (Sematech), a U.S. semiconductor consortium; and Japan’s current economic hardships.

The turnaround started six years ago when the U.S. semiconductor industry acknowledged that it was second to the Japanese in the worldwide market.

“That was the wake-up call for us,” said Tom Beerman, president of the Semiconductor Industry Association, a trade organization also based in San Jose. “The U.S. industry as a whole realized it had to get its act together in terms of quality, better relationships with its suppliers and a whole variety of other areas.”

At the root of the U.S. decline, according to Beerman, were several “disturbing trends” that needed to be addressed. The first: Japan’s practice of dumping semiconductors into the U.S. market, which contributed to the loss of billions of dollars. Japanese producers were selling some forms of common memory chips at prices below cost in an effort to buy market share.

Trade pact helped

In an effort to halt this practice, the United States and Japan formed a trade agreement in 1986 that prohibited Japanese firms from illegally dumping semiconductors into the U.S. market. But the pact also gave the United States greater access to Japan’s burgeoning semiconductor market, which had essentially been closed to the United States, Beerman said.

“That trade agreement was a significant factor in our ability to come back against Japan,” he said. “Companies once again had the confidence to invest in U.S. semiconductor manufacturers.”

Another galvanizing force in the U.S. comeback, according to observers, has been Sematech. The U.S. government-sponsored consortium was formed in 1987 in an effort to save the nation’s semiconductor industry.

“The real clear intervention by Sematech helped improve the quality of the equipment and develop more learning about higher yields,” said VLSI Research’s Hutcheson. “Eventually, those things tied together and drove the U.S. semiconductor industry.”

Finally, Japan’s struggling economy and its own semiconductor strategy played a role. Japan’s semiconductor strategy revolves around the mainframe computer business, according to Hutcheson, while the United States molded its semiconductor industry around the PC.

“Look where we’d be if we had built ourselves around IBM’s and Digital Equipment Corp.’s [mainframe and minicomputer platforms]. We’d have completely shot ourselves in the foot,” Hutcheson said. “[IBM and DEC] certainly aren’t the pillars of strength that they used to be five to 10 years ago.

“Today, we’re seeing companies like Intel Corp., Advanced Micro Devices Inc. [AMD] and National Semiconductor Corp. feeding from more capital investment in semiconductors,” he said. “We’ve returned to a more competitive manufacturing base and a more focused market.”

Nowhere is the comeback more evident than in the financial success of Intel, which in January reported $429 million in earnings on sales of $1.9 billion for its fourth quarter ended Dec. 26, more than doubling earnings from the year-ago quarter. National Semiconductor, AMD and Cyrix Corp. also posted strong earnings in 1992.

Analysts project continued strong growth for the U.S. semiconductor industry, not only in the microprocessor segment, but also in the complex logic arena, which includes such technologies as digital-signal processing, digital speech compression and digital filtering.

In fact, according to Drew Peck, an analyst with investment bank Donaldson, Lufkin & Jenrette Inc. in New York, the future of the U.S. semiconductor market lies in digital-signal processors. This is an area where companies such as Motorola Inc., Analog Devices Inc. and LSI Logic Corp. are carving out a niche for themselves, said Peck.

“Digital-signal processing components may surpass microprocessors in unit volume and perhaps market size by the end of [the century],” Peck said.

While the U.S. semiconductor industry looks toward the future, it also keeps a wary eye on the not-so-distant past and, in particular, 1986. “It was quite a catastrophe,” said Peck.

Oracle “Glues” It All Together

Posted in Uncategorized on September 15, 2014 – 10:44 am
Comments (0)

Oracle Corp. customers will see the benefit of the company’s new Glue middleware immediately: Glue’s architecture is built to take advantage of the optimization features of the Oracle database server. But while Glue has a strong architecture, its success will depend on its acceptance and use by other software vendors.

PC Week Labs examined a beta version of Oracle’s middleware, a software layer that allows front-end software to talk to back-end databases. Glue takes middleware a step further by including E-mail and personal digital assistants (PDAs) as data sources that can be linked into the corporate data network.

Oracle’s short-term plan for Glue, which is set for release in March, is to use the product as an API (application programming interface) that is optimized for the Oracle database server and easily implemented from any front-end software.

In the long term, Oracle will pit Glue directly against Open Database Connectivity (ODBC), Microsoft Corp.’s middleware solution, and the Integrated Database API, middleware technology from a consortium of vendors that includes Borland International Inc., IBM and Novell Inc.

Oracle officials said they plan to release a driver for an ODBC connection in late 1993, which will give Glue users the ability to connect to popular databases such as Sybase Inc.’s Sybase and IBM’s DB2. In the interim, however, users will have to look to other middleware solutions to meet their database connection needs.

PC Week Labs found many of Glue’s Structured Query Language (SQL) commands easy to use and more compact than comparable commands used with Microsoft’s ODBC.

For example, ODBC required six lines of code to select records from an employee database and load them into a Visual Basic application, whereas Glue required only one line.

One reason Glue uses less code for tasks is its use of “containers,” memory areas into which Glue will put data selected from the database.

With ODBC, a memory area must be defined and then loaded with data, while a single Glue command automatically selects data and loads it into a memory area.
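
For readers who never coded against early ODBC, a rough sketch of the call sequence being contrasted here may help. The column’s comparison was done from Visual Basic; the sketch below uses the ODBC 1.x C API instead, with error checking omitted, and the data source name, table and login are illustrative placeholders rather than details from the Labs’ tests. Glue’s own one-line equivalent isn’t reproduced because its syntax doesn’t appear in the article.

/* Sketch of the multi-step ODBC sequence the column contrasts with a
 * single Glue command. Uses the deprecated ODBC 1.x allocation calls to
 * mirror the era; "emp_dsn", "scott"/"tiger" and the employee table are
 * placeholders, and all return-code checking is omitted for brevity. */
#include <stdio.h>
#include <sql.h>
#include <sqlext.h>

int main(void)
{
    HENV   henv;
    HDBC   hdbc;
    HSTMT  hstmt;
    char   name[64];
    SQLLEN cbName;

    SQLAllocEnv(&henv);                                /* 1. environment handle   */
    SQLAllocConnect(henv, &hdbc);                      /* 2. connection handle    */
    SQLConnect(hdbc, (UCHAR *)"emp_dsn", SQL_NTS,      /* 3. connect to a source  */
               (UCHAR *)"scott", SQL_NTS,
               (UCHAR *)"tiger", SQL_NTS);
    SQLAllocStmt(hdbc, &hstmt);                        /* 4. statement handle     */
    SQLExecDirect(hstmt,                               /* 5. run the query        */
                  (UCHAR *)"SELECT name FROM employee", SQL_NTS);
    SQLBindCol(hstmt, 1, SQL_C_CHAR, name,             /* 6. bind a result buffer */
               sizeof(name), &cbName);

    while (SQLFetch(hstmt) == SQL_SUCCESS)             /* pull rows one at a time */
        printf("%s\n", name);

    SQLFreeStmt(hstmt, SQL_DROP);                      /* tear everything down    */
    SQLDisconnect(hdbc);
    SQLFreeConnect(hdbc);
    SQLFreeEnv(henv);
    return 0;
}

Each allocate-connect-bind step above is the kind of housekeeping that, as the column describes it, Glue’s container model rolls into a single command that both selects the data and loads it into memory.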

Oracle Glue’s API consists of 71 SQL commands, grouped into a database subset (the Oracle SQL command set) and mail and PDA subsets.

PC Week Labs successfully loaded a Microsoft Excel spreadsheet with data from an Oracle database. Using only four Glue SQL commands embedded in an Excel macro, we were able to connect to the server, select a subset of data and disconnect from the server.

We were also able to use Microsoft’s Visual Basic application development environment to select and display data, both from an Oracle database and from a Sharp Electronics Corp. Wizard PDA.

The Glue PDA commands allowed us to interact directly with the Sharp Wizard.

While the Wizard was in PC Link Mode and connected to our PC via a cable, we were able to read, record-by-record, from the Wizard and load this data into a Visual Basic form.

A middleware connection to various mail systems is not as important as providing connections to database servers, but it is nice to know that Oracle’s New Technology division is doing some advanced research in the area.

Glue currently works only with Oracle Mail, which is not surprising since Oracle Mail is built as a front end to the Oracle database.

In tests, the Labs was able to extract mail messages from Oracle Mail through a Visual Basic application with only a few Glue commands.

Oracle will have to devote a lot of development time to deliver connections to other mail systems, and the company will more than likely wait for a messaging API such as Vendor-Independent Messaging or Messaging Application Programming Interface (MAPI) to become mainstream.

Glue’s ability to connect Oracle Server to PDAs, or palmtops, is another area that might not have as much use today as it will in the future.

Testing methodology

PC Week Labs tested Oracle Glue’s ability to connect Oracle Server with a variety of front ends, including Microsoft’s Excel. We also examined Glue’s E-mail capabilities as well as its ability to connect to PDAs.

We tested Oracle Glue on a Compaq Computer Corp. 386/25, a 25MHz 386-based PC with 16M bytes of RAM and a 200M-byte hard-disk drive.

We ran version 6.0 of the Oracle database on a Sun Microsystems Inc. SPARCstation 1 running Unix.

Oracle Mail was run on a Sequent Computer Systems Inc. S81, a 486DX-based, 25MHz machine running Sequent’s Dynix 3.11 operating system. The workstation was connected via Ethernet using TCP/IP as the protocol.

Tough Times For Windows Developers

Posted in Uncategorized on September 5, 2014 – 10:34 am
Comments (0)

 “Survival of the Fitters.”

I don’t know whom to credit for that wonderful line. I first heard it from Fred Snow, a vice president with the distributor TechData, but he says he picked it up somewhere else.

In any case, I’ve been stealing the line with glee, for not only is it a nice play on words, it’s a perfect description of the PC marketplace these days. It describes the effects of last year’s downward pricing pressure on PC hardware vendors and of the market’s swing to Windows applications on software vendors.

That phrase came to me again last month. I’d gone off into the woods, so to speak, for a week, to get away from my daily routine and think about where management of the corporate computing function is headed.

“Survival of the Fitters” indeed. How better could we describe the upheavals in IS management over the past decade — and especially the last two or three years?

So many of the themes that have defined business computing during the ’80s and ’90s are really variations on that idea.

With the advent of PCs, savvy IS managers steered a course away from shared-logic systems toward putting power on the desktop.

With the advent of usable LAN technology, smart IS people relinked those desktops in a different kind of grid — what a friend of mine who runs IS at a southeastern bank calls a “power grid” — via LANs.

With the increasing power of servers and superservers, and greater throughput across the network, IS staffers are now going through the process of downsizing and rightsizing, delivering to their users levels of power and information access they’ve never had.

In sum, the fitters have survived.

Culture clash

That’s been at least as profound a cultural change as it has been a technical one. IS people in the ’60s and ’70s were not exactly known for their flexibility. Anyone who worked with a typical corporate MIS department circa 1970 remembers all too well the inflexibility that seemed almost genetic: Was there, somewhere on an org chart of the IS department, perhaps a box marked Director of Intransigence?

The advent of desktop systems and LANs didn’t change the attitudes of some IS people, of course. A few are still installed in positions where they can be genuinely dangerous to their companies’ health; others remain in their departments but have been shuffled off to safer, meaningless, “administrivia” jobs.

Many more of the old IS hard-liners (read: manifest non-fitters) have retired. And some downsized themselves: Seeing the handwriting on the wall, they exited their corporate jobs for IS positions at smaller companies.

In other words, the non-fitters did not survive.

Make no mistake: This has been a harshly Darwinian period. Those who adapted, survived; those who did not, disappeared. Not since the advent of the modern corporate IS department in the 1950s had there been such a challenge to IS staffs; never have so many IS people been swept out by the winds of change.

On my retreat, I was struck that today many IS people find themselves in the early stages of a similar encounter. But the contest isn’t between flexibility and inflexibility — who would argue today for inflexibility as a virtue? — so much as it is between embracing diversity and resisting or complaining about it.

We can no longer flee from mixed-hardware environments — PCs, Macs, workstations, mainframes, minis — or mixed-software environments — Windows, OS/2, NT, NetWare, Univel connected to NetWare, etc.

And we’ve got to stop bitching about them.

When I write a follow-up to this column in the issue of Feb. 3, 2003, it’s going to be about those IS managers and staffers who embraced diversity and, by making its mastery the hallmark of their departments, not only survived but prospered in the 1990s.

And about those who, baffled by an exploding universe of computing hardware and software standards, tried to bar the door — and have disappeared.

Reviewing Classic DEC Hardware

Posted in Uncategorized on August 25, 2014 – 6:57 am
Comments (0)

Digital Equipment Corp.’s upcoming desktop systems, based on its new Alpha processors, will be the dream machines of the Windows NT world, according to an examination of a preproduction unit by Geekstir.

Expected to be released when Microsoft Corp.’s Windows NT is announced in the second quarter, the Alpha AXP 21064-based system examined by the Labs uses a minitower case and standard PC components and will cost between $7,000 and $10,000, depending on configuration. DEC also has plans for both lower- and higher-end Alpha systems.

Side-by-side comparisons with a 25/50MHz 486DX2-based system running our test release of Windows NT were no contest. The Alpha system, though still far from its final, tuned form, was considerably faster than the 486 PC. We compared the Alpha and 486 systems by running simultaneous generations of fractal images on each, using the fractal demonstration program included with the Windows NT Software Development Kit (SDK).

The test was heavily floating-point-math intensive, stressing one of Alpha’s strong features.
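
To make concrete why a fractal generator is such a good floating-point workout, here is a minimal Mandelbrot-style sketch in C. It is not the SDK’s demonstration program, whose source isn’t reproduced here; it simply shows that nearly all the work is double-precision multiplies and adds, exactly the sort of load that plays to Alpha’s strengths.

/* Minimal Mandelbrot sketch: an illustration of the kind of
 * double-precision inner loop a fractal demo spends its time in,
 * not the Windows NT SDK sample itself. */
#include <stdio.h>

#define MAX_ITER 256

static int mandel_iterations(double cr, double ci)
{
    double zr = 0.0, zi = 0.0;
    int i;
    for (i = 0; i < MAX_ITER; i++) {
        double zr2 = zr * zr, zi2 = zi * zi;
        if (zr2 + zi2 > 4.0)             /* the point has escaped the set    */
            break;
        zi = 2.0 * zr * zi + ci;         /* a handful of multiplies and adds */
        zr = zr2 - zi2 + cr;             /* per iteration, all in doubles    */
    }
    return i;
}

int main(void)
{
    int x, y;
    for (y = 0; y < 24; y++) {           /* tiny 60-by-24 character plot */
        for (x = 0; x < 60; x++) {
            double cr = -2.0 + x * (3.0 / 60.0);
            double ci = -1.2 + y * (2.4 / 24.0);
            putchar(mandel_iterations(cr, ci) == MAX_ITER ? '#' : ' ');
        }
        putchar('\n');
    }
    return 0;
}

Run through any C compiler, the program prints a small character-mode rendering of the set; the point of the sketch is only that the arithmetic is almost entirely floating point, which is why the benchmark favored Alpha’s fast FPU.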

PC Week Labs’ test system ran at 125MHz, short of the 150MHz expected in release-level systems. The NT Alpha compiler — and therefore the applications we tested — still needs much performance tuning.

DEC expects to offer higher-end 200MHz Alpha systems by the end of the year and has discussed plans for even higher clock rates. With the exception of Sequent Computer Systems Inc.’s multiprocessing machines, we would be surprised if Alpha does not end up the fastest NT platform on the market. No other vendor in the microprocessor market has been able to achieve quantity shipments of products running at such speeds.

VMS-based Alpha systems running at 150MHz have been available since November. An Alpha version of the OSF/1 implementation of Unix is expected toward the end of the year.

Power under the hood

Some of the more interesting characteristics of the Alpha test system are found “under the hood.” Removing the cover revealed a gargantuan chip, measuring almost 2.7 inches square, covered by a heavily finned heat sink. The heat sink is attached to two threaded pillars rising out of the chip package itself.

512K bytes of high-speed static RAM cache surround the chip, and eight sockets for standard 72-pin single in-line memory modules are accessible nearby.

We were impressed with the design of the case, which did an admirable and quiet job of cooling a system that generates a great deal of heat. A large fan sucks air through vents in the back of the system, across the memory subsystem and processor, and out the front. Even when we ran the AXP system at 125MHz, the Alpha chip remained cool to the touch.

Although the same case is used for some of DEC’s 486-based systems, it has been designed to accommodate high-frequency chips while still receiving a Federal Communications Commission Class B certification.

The standard system configuration contains 16M bytes of RAM, expandable to 128M bytes. The system we tested had 32M bytes of RAM and a 1G-byte SCSI (Small Computer System Interface) hard drive.

The system also had six Extended Industry Standard Architecture (EISA) slots, containing both EISA and ISA adapters. An Adaptec Inc. 1740 SCSI adapter and a Compaq Computer Corp. QVision video adapter were included in our system.

At the Comdex trade show last fall, DEC demonstrated a Creative Labs Inc. Sound Blaster ISA sound card in a similar Alpha system.

Interestingly enough, DEC will supply QVision as an option in EISA-based Alpha systems.

DEC is also considering plans to support Intel Corp.’s Peripheral Component Interconnect (PCI) local bus in future systems, including plans to integrate a PCI controller in low-cost implementations of the Alpha processor. However, DEC has no plans to implement the Video Electronics Standards Association’s VL-Bus specification in any Alpha systems, officials from the Maynard, Mass., manufacturer said.

Other features of the system were pleasingly conventional, including two serial ports, a parallel port, and PS/2-style mouse and keyboard ports. The form factor of the motherboard was standard mini-AT size, making system design easy for vendors that wish to license Alpha system boards from DEC (assuming they can cool the board and maintain Class B certification in another box).

DEC officials said they plan to ship a final Alpha compiler for NT late this month or in early March. They also are pursuing talks with Microsoft to include Alpha development tools in the Windows NT SDK.

Inclusion of such tools in the standard SDK, combined with aggressive marketing to developers, could enhance Alpha’s credibility as a mainstream NT platform by facilitating Alpha versions of NT applications and development tools. The ease with which an Alpha version can be created (generally just a recompilation) and distributed on CD-ROM should also encourage Alpha versions of NT software.

IBM Shifts Its Sales Force Into High Gear

Posted in Uncategorized on August 5, 2014 – 6:45 am
Comments (0)

While all eyes focused on the management shake-up at IBM last week, the struggling computer giant was zooming in on another target: the vaunted IBM sales force.

IBM is quietly working to further split its “Blue Suit” sales team along product, geographic or vertical-market lines, sources said last week. The moves continued amid a week of turmoil following John Akers’ announced plans to resign as the company’s CEO.

Much like the increased autonomy being given to IBM’s product units, dividing the sales force will give IBM’s developers more freedom to sell their own products instead of relying on one group to peddle everything from Token-Ring adapters to mainframes.

The single sales force — once a driving force behind IBM’s success — has become an albatross because it lacks the training and motivation to sell customers PC- and LAN-based products instead of higher-profit mainframes and minicomputers, observers said.

“The local IBM marketing guys … had to have a broad knowledge of everything and not a specific knowledge of anything,” said Steven Verne, a PC specialist at Tree Top Inc., a Selah, Wash., fruit-juice company. “It just hurt our confidence in them.”

IBM won’t disband the direct-sales force entirely because the group’s relationship with large corporate accounts is “one of our fundamental strengths,” said David Thomas, general manager of marketing for IBM U.S. in White Plains, N.Y.

But Thomas acknowledged the need to focus more on products, saying his greatest challenge in 1993 is to increase the number of “marketing specialists” within the sales force. Those specialists will be trained to sell specific products, he said.

IBM U.S. is now devoting one-third of its sales force to account representatives with overall responsibility for a customer’s needs, one-third to the growing ranks of product-specific marketing specialists and one-third to sales representatives whose job is to sell services, said Thomas.

The segmentation of the sales force has been under way for two years and will continue to evolve, an IBM spokesman said.

Sources said the sales force could also be restructured to focus on vertical markets, a move taken last month by Integrated Systems Solutions Corp., IBM’s systems-integration subsidiary.

Several IBM business units have already begun narrowing the focus of their sales teams, a move that IBM watchers see as a natural progression.

“The more autonomous you become, the more the lines of business will want to control their own destiny,” said Bill Grabe, a former top executive in the IBM U.S. marketing and services organization, which runs the direct sales force. For example, Pennant Systems, a wholly owned subsidiary of IBM that makes high-end printers, has had its own sales force since early 1992.

The IBM Personal Computer Co. is represented by an unspecified number of marketing specialists known as “fighter pilots,” officials said.

And the Networking Systems line of business is rolling out nearly 500 marketing specialists who have received specific training in IBM’s networking products, said Steven Joyce, manager of APPC market enablement at Networking Systems in Research Triangle Park, N.C.

The job of those salespeople will be to sell products such as IBM’s APPC (Advanced Program-to-Program Communications) protocol. “We have done a miserable job of teaching [the sales force] about how to sell APPC,” said Joyce. To address that, the APPC developers have stepped up efforts to teach IBM sales reps — and customers — about the protocol.

Joyce’s 32-person “market-enablement” group talks up APPC at trade shows, runs a CompuServe bulletin board and lets vendors bring their products to the Research Triangle Park development site for compatibility testing.

IBM’s Toronto software laboratory also sends marketing teams to trade shows and distributes a customer newsletter about its C Set/2 C compiler and debugger, said Development Manager Hester Ngo. “Products like ours always received the least amount of attention from the IBM [sales force],” Ngo said, “because they always focused on hardware, [which] brought in the most incentives.”

One customer said IBM “missed the boat” with its reliance on a single sales force.

“What’s important to the customer is they get top-quality service,” said Lee Batson, computer systems manager for Habif, Arogeti and Wynne, an Atlanta accounting firm. “I don’t care if I have three or four IBM sales reps, as long as … they know their products and their industries.”