Forget Paying for Expensive Data Recovery Ever Again

If you’re reading this page, it’s probably safe to assume you’re in a bit of a pickle. When someone finds the need to look for data recovery services, it usually means something unfortunate has happened to their data.

Be smart when choosing a data recovery provider.

If you’ve been looking around for a while now, you’ll have noticed that there’s a wide range of possibilities for data recovery. Most of them are expensive, some are affordable, and others are almost unbelievably cheap.

But be warned, venturing into the world of data recovery unprepared is asking for trouble. Do your research and try to find out why each company is charging what it does. If you put in the time now, you may be able to find a deal that’s actually what it claims to be: cheap data recovery.

Unfortunately, failing to perform due diligence can quickly make matters worse. For example, if you send a drive off to one of these companies you may discover they are unable to help you. They’ll inevitably send it back, but now you have a new problem on your hands.

Now you’ve not only wasted your time, but you’ve likely made life more complicated for yourself. Because your drive has already been inspected, the next company you approach will likely charge a “second attempt” or “second assessment” fee to recover the data.

What’s more, the reason your drive was sent back is that the recovery had already been deemed too difficult to do cheaply. You now face an expensive repair, made worse by the fact that you are seeking a second attempt from another company.

However, if you perform a few simple checks Continue reading

Posted in Computer | Leave a comment

ATM Networks Were Killer Apps For Their Time

After about three years of being sold on the idea of ATM networking, we’re still waiting for it to arrive in earnest. The myth of what ATM will do eclipses the reality of what ATM is currently capable of doing.

Asynchronous transfer mode hardware still doesn’t support voice traffic, so the goal of data/voice network integration remains in the distance. And though MPEG has been emerging as a standard video-transmission format, the H.320 and JPEG video standards are still unsupported.

Scalability in network size is also not really available, outside of a choice between 155M-bps OC-3 and 100M-bps TAXI (Transparent Asynchronous Transmitter/Receiver Interface). Investing in large-scale ATM switches is not practical until all of the ATM standards have stabilized.

There have been quite a few announcements of WAN-capable ATM switches, but vendors are waiting for ATM to become more available from the public carrier services. LAN ATM Continue reading

Posted in Admin Tips | Leave a comment

Early Quality Control Problems Were Major Issues For PC And Auto Manufacturers

Compaq finds its stocks contaminated with faulty power supplies. Disney gets eaten alive by crashing “Lion King” CD ROMs. Intel makes Pentiums that can’t handle division. IBM short circuits with defective ThinkPad power adapters. H&R Block and Intuit stir a taxpayers’ revolt. It almost sounds like deja vu–to the days when poor quality sent U.S. automakers skidding out of control. The PC industry continues to cruise, but the recent spate of product flaws raises a crucial issue: Is the pressure to pump out more, faster, cheaper forcing vendors to fall beneath their own wheels by breaking the speed limits on quality?

No one is saying poor quality is about to besmirch the PC industry to the extent that it tainted Detroit. “Most of these companies are more focused on quality than in the past,” says Roy Bauer, a former IBM executive who’s now an examiner for the Malcolm Baldrige National Quality Award. Nevertheless, “they’re caught up in the explosion in demand, and when that happens you try to take shortcuts.” Adds Michael Slater, editor of the Microprocessor Report: “Everyone is trying to shave a penny here or there, and quality costs money.”

But poor quality costs even more. Intel ate $475 million to write off its wayward Pentium chips. Intuit took a charge of $1.3 million for its glitchy MacInTax and TurboTax. And pity poor Golden Systems. With only $29 million in annual revenue last year, the Simi Valley, Calif., company had to book a $5 million Continue reading

Posted in Computer | Leave a comment

Reasons to Hire a Professional to Fix A Broken Hard Drive

It is definitely bad news if your hard drive broke while you were investigating why the disk was failing in the first place. This is why, instead of attempting to fix a broken hard drive yourself, it is much better to hire a professional to look into it. Although fixing your hard drive can be really costly, you will at least be able to ensure that the problem gets a sure-fire solution. You also will not end up buying a new drive, especially if it turns out the issue was a simple one made worse by attempting to resolve it without consulting an expert. If you hire a professional for the job, you can be sure you will spend less time trying to figure out where the problem lies. Experts already know how to fix your broken hard drive, so why risk it?

You will also be given the right diagnosis and some preventive measures so you will not encounter the same problems with your disk again. To make sure your broken hard drive is fixed successfully, you may also want to get some recommendations from friends. You may also read customer reviews or check important forums and the overall ratings of data recovery companies (some tips are here) so you will have an idea whether you are dealing with experts. You can surely bring your hard drive back to life when you choose a professional to do the job.

How Does Data Recovery Software Work?

What should you do when some of your key data is lost? Should you go ahead and send your hard drive off for a data recovery service? The answer, in some cases, may be no. That route only makes sense when a hard drive has a physical problem that is stopping you from accessing your files. But if the issue is a logical one, such as files being deleted or the hard drive being accidentally formatted, the best way to go is data recovery software. Luckily, there is cheap, and even free, data recovery software you can get in just a click.

How They Work

Data recovery software works by recovering formatted partitions along with the original file names and storage paths. There is a lot of software you can try, but only a few data recovery tools out there can be trusted, and some are even free. Besides, one does not really have to spend much on data recovery unless the data is very important. All you have to do is launch the downloaded software and select the files you wish to recover. Once you have selected the disk where the data was lost, the software will scan it to find the files. Some solid tools are written about here.
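As a rough illustration of that scanning step, here is a minimal sketch of signature-based “file carving,” one technique recovery tools use on a raw disk image. The byte patterns are real JPEG markers, but the function is a toy, not any particular product’s algorithm.

```python
# Toy file carver: scan raw disk bytes for JPEG start/end signatures and
# extract anything that looks like a complete file. Real recovery tools
# also rebuild file-system metadata to restore names and paths.

JPEG_HEADER = b"\xff\xd8\xff"   # bytes that mark the start of a JPEG
JPEG_FOOTER = b"\xff\xd9"       # bytes that mark the end of a JPEG

def carve_jpegs(image_bytes):
    """Return every byte run that looks like a complete JPEG file."""
    recovered = []
    pos = image_bytes.find(JPEG_HEADER)
    while pos != -1:
        end = image_bytes.find(JPEG_FOOTER, pos)
        if end == -1:
            break  # truncated file; nothing more to carve
        recovered.append(image_bytes[pos:end + len(JPEG_FOOTER)])
        pos = image_bytes.find(JPEG_HEADER, end)
    return recovered

if __name__ == "__main__":
    # Simulate a "formatted" disk: file contents survive amid junk bytes.
    disk = b"\x00" * 16 + JPEG_HEADER + b"photo-data" + JPEG_FOOTER + b"\x00" * 16
    files = carve_jpegs(disk)
    print(len(files))                    # 1
    print(files[0][:3] == JPEG_HEADER)   # True
```

Carving recovers file contents even when directory entries are gone, which is why a quick format is often survivable while a physically damaged drive is not.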

Should You Hire A Professional For Server Data Recovery?

There is nothing as challenging as finding a solution for recovering data from a failing server array. The good news is that data recovery is definitely possible in the right hands, but you need to be sure the hard drive recovery service company is certified by the server’s manufacturer. As an example, if you have a failed drive or a logical problem with your Dell server, check this page for Dell PowerEdge data recovery. Professional-level servers require professional hard drive recovery, and this is why it is best Continue reading

Posted in Admin Tips, Servers | 3 Comments

Samsung Did Well To Diversify Early

With memory chips accounting for more than 50 percent of Samsung Electronics’ profits last year, its masters in Seoul, South Korea, have decided it’s too dependent upon the highly cyclical DRAM business. So the one-trick pony is racing into new lines of business and new technologies that will have a major impact on the U.S. market. By this summer, Samsung’s trickle of flat-panel displays will become a flood. Flash memory products will appear–in volume–later this year. The company expects to sell nearly $1 billion worth of specialty memory chips, including SRAMs and EDO RAMs, in 1995. And last month’s agreement to make a $377 million investment in AST will change the PC market in the United States and Asia (see Close Up, right).

It’s not just Samsung Electronics that’s going through a transformation. The company’s parent, the $54 billion Samsung Group, is also diversifying and moving quickly to give foreign operating units more autonomy. Samsung Group Chairman Lee Kun Hee–invariably referred to as Chairman Lee–has decreed that Samsung will add automobiles, aerospace, transportation, and entertainment to its current repertoire of electronics, chemicals, and finance. Also on the capital budget: Samsung’s first fabs in the United States, Asia, and Europe. Although the facilities may be built under joint ventures, guess Continue reading

Posted in Computer | 3 Comments

Transaction Processing Creates Real Benefits

Old guard, new beat: TP monitors avert gridlock by policing client/server traffic

It’s an IT manager’s biggest nightmare. On a dark and stormy night, a WAN link suddenly crashed, taking down the national client/server reservation system for a company that wishes to remain anonymous. Harried travel agents kept pounding transactions into the system, expecting it to come up. When it did, the system was immediately slammed by a barrage of pent-up traffic. Designed to handle hundreds of transactions per second, the system choked on thousands and it crashed once again.

This company’s harrowing experience illustrates a dirty little secret about client/server applications. When they’re spun across the enterprise, many can’t handle the high-transaction traffic or unpredictability of complex, production-level environments. Increasingly, some companies like the one above are turning to a tried-and-true mainframe approach to keep their new client/server systems afloat: transaction-processing monitors.

Like an old-time traffic cop, a TP monitor ushers client requests to data services in a secure, reliable, and high-performance manner. If the reservation system had been outfitted with one, the gridlock never would have happened. These monitors “give you mechanisms so you’re not at the mercy of the [client/server] environment, you’re more in control of it,” says Rich Finkelstein, president of Performance Computing Inc., a consultancy in Chicago.

Although they’ve been in mainframe systems for decades, TP monitors are just beginning to show up on the client/server radar screen. Until recently, most client/server applications have been too small to benefit from TP monitors, or users didn’t realize their value as a replacement for buying immature middleware or constructing custom systems in a piecemeal fashion.

TP monitors such as IBM CICS (Customer Information Control System) Open and Novell Inc.’s Tuxedo have the bandwidth to handle functions such as load balancing, security, dynamic routing, and access to multiple databases, to name a few. Many IS shops believe that running a high-volume client/server application without a TP monitor is akin to crossing a highway blindfolded.

For applications that handle millions of transactions per day, hundreds of users, and multiple, interconnected servers, “you’d be crazy not to use one of the monitors,” says Michael Prince, director of IS at Burlington Coat Factory Warehouse Inc., in Burlington, N.J.

Prince should know. Three years ago, he set out to move the clothing company’s inventory system from an aging mainframe to a client/server environment. At the time, it was doubtful whether a single relational database could handle the 150G bytes’ worth of data needed to track more than a million items carried in 200 stores.

To balance the load and assure a good response time, he and his crew split the application into 17 Oracle Corp. databases on four servers from Sequent Computer Systems Inc. Each database corresponded to a Burlington merchandise division, such as ladies’ coats or men’s outerwear.

Despite the sound design, the system jammed whenever a single Oracle database fizzled.

In order to keep things moving, Prince installed a Novell Tuxedo TP monitor running on Unix. The monitor splits a single sales transaction–for example, the purchase of a suitcase and a bathrobe–and sends separate messages to the respective back-end databases. If one database is down, Tuxedo queues the message and delivers it once the server comes back up or reroutes the message to an available server. Meanwhile, the other database gets updated without delay.

“So [the system] is resilient and has high availability,” Prince says.
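The store-and-forward behavior Prince describes can be sketched as follows; the class and method names here are invented for illustration and are not Tuxedo’s actual API.

```python
# Sketch of a TP monitor splitting one sale into per-database messages,
# queuing any message whose back-end database is down and delivering it
# once that database returns. Illustrative only.
from collections import deque

class Database:
    def __init__(self, name):
        self.name, self.up, self.rows = name, True, []

    def apply(self, message):
        if not self.up:
            raise ConnectionError(f"{self.name} is down")
        self.rows.append(message)

class Monitor:
    """Route each item of a sale to its back end, queuing on failure."""
    def __init__(self, databases):
        self.databases = databases   # division name -> Database
        self.pending = deque()       # messages awaiting a live server

    def submit(self, sale):
        for division, item in sale:
            try:
                self.databases[division].apply(item)
            except ConnectionError:
                self.pending.append((division, item))  # deliver later

    def retry(self):
        for _ in range(len(self.pending)):
            division, item = self.pending.popleft()
            try:
                self.databases[division].apply(item)
            except ConnectionError:
                self.pending.append((division, item))

luggage, robes = Database("luggage"), Database("robes")
tp = Monitor({"luggage": luggage, "robes": robes})
robes.up = False
tp.submit([("luggage", "suitcase"), ("robes", "bathrobe")])
# The luggage update goes through immediately; the robe sale is queued.
robes.up = True
tp.retry()   # queued message now reaches the robes database
```

The key design point is that the healthy database is updated without delay; a single failed back end never blocks the whole transaction stream.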

Another popular way to use TP monitors in C/S environments is to glue together “three-tiered” architectures in which a mainframe doles out data, a server runs application services, and the client makes things easy for the end user. Seventeen percent of mission-critical client/server applications are three-tiered, and it’s a growing trend, according to Standish Group International Inc., a market-research firm in Dennis, Mass.

Unum Life Insurance Co. of America is one company moving in this direction. The Portland, Maine, firm built a document-management system that employs IBM CICS for OS/2 to bridge a 3090 mainframe, a client/server system, and mainframe dumb terminals. People sitting at the dumb terminals can generate requests through the mainframe to the C/S system, and CICS routes the transactions between the tiers and lets terminal users access LAN resources that were otherwise unavailable, according to Bill Cook, a senior programmer analyst at Unum.

“What it does is extend the ability for our users to take advantage of client/server applications through their hardware infrastructure,” Cook says.

If not TP, what?

All this complexity has scared off some IT folks from using TP monitors when an application has a single database, simple transactions, or few clients.

“A TP monitor is going to add complexity. If you can run with the capabilities of the database and the hardware, and satisfy your volume requirements, that’s what you ought to do first,” says Daniel Amedro, vice president of MIS for Regency Systems Solutions, Hyatt Corp.’s technology division, in Oakbrook Terrace, Ill.

For example, Hyatt is upgrading its client/server reservation system to take advantage of Informix Software Inc.’s multithreaded database engines, which can meet the volume demands of roughly 1,000 booking agents without using a TP monitor, Amedro says.

Others favor TP-monitor alternatives such as RPCs (remote procedure calls) and stored database procedures. However, RPCs are not well-suited for load balancing and application-partitioning tasks, and stored procedures are limited to the database for which they were designed.

Mark Marcus, manager of advanced applications at Holiday Inn Worldwide headquarters in Atlanta, learned about the shortcomings of RPCs when he tapped them to handle decision-support system transactions between SCO Unix and UnixWare-based clients at hotel sites and Solaris-based servers in Atlanta. “That was real low-level, Unix-type coding. Now we have a TP monitor [Tuxedo] doing it out of the box, so we have a lot [fewer] proprietary processes,” Marcus says.

Likewise, stored procedures, which are built as database-application modules, are limited beyond load balancing and reliability functions. They can’t access resources outside the database, so users end up doing things in the client side, which can ultimately bog down performance, Finkelstein says.

Until client/server systems can mimic the mainframe’s transaction-processing predictability and reliability, most agree that TP monitoring can help. “It’s a concept that’s proven itself to be invaluable for decades. Trying to do transaction processing without a transaction monitor in any type of volume probably doesn’t make sense,” Prince says.

On the Transaction Processing Beat

A TP monitor helps large-scale C/S applications run more smoothly by directing the flow of transaction traffic from the client to the server in a reliable, secure, and high-performance manner. Among its jobs:

Data integrity: to guarantee that updates happen on all relevant servers or no servers in the event one server goes down. The monitor can also queue a transaction until the downed server is back on-line, assuring that the transaction occurs only once.

Security: to ensure that any given user or transaction is granted access only to specific resources.

Application partitioning: to minimize data traffic between the client and server by allowing users to partition applications to split the processing work between both platforms.

Load balancing: to distribute a transaction load across multiple servers to achieve higher throughput.
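The load-balancing job in the list above can be sketched under the simplifying assumption of plain round-robin rotation; production monitors weigh actual server load rather than just taking turns.

```python
# Toy round-robin load balancer: assign each incoming transaction to the
# next server in rotation so no single server absorbs the whole stream.
from itertools import cycle

def balance(transactions, servers):
    """Return a mapping of server -> transactions assigned to it."""
    assignments = {s: [] for s in servers}
    rotation = cycle(servers)
    for txn in transactions:
        assignments[next(rotation)].append(txn)
    return assignments

result = balance(["t1", "t2", "t3", "t4", "t5"], ["srv-a", "srv-b"])
# srv-a gets t1, t3, t5; srv-b gets t2, t4.
```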

Posted in Servers | Leave a comment

Client/Server Apps Still Make For Corporate Advantages

Foraging for corporate gold? If you haven’t paid heed to business rules, you’re burying your head in the sand. These rules need tending like any other company asset.

At their most basic level, business rules are the tenets that reside in the management brain trust. In their finished form, business rules can be a line of code such as “Value=Qty*Price*(1 - LOOKUP Order Discount)” on a database server, credit-limit guidelines on a server in the sales department, or commission structures housed on the old PDP-11 at corporate headquarters.

Given the complexity of client/server environments, companies can no longer afford to have these rules scattered throughout an organization. One solution: a three-tier architecture that splits the rules apart from the application logic and hands developers more flexibility in managing these new resources.
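To make the idea concrete, the Qty*Price discount rule quoted above could live in its own rule layer as a standalone function, apart from application logic; the discount table below is invented for illustration.

```python
# Hypothetical business-rule layer: the lookup table and rule function live
# apart from application code, so the rule can change without touching it.
ORDER_DISCOUNT = {"retail": 0.0, "wholesale": 0.25}  # invented lookup table

def order_value(qty, price, customer_class):
    """Business rule: Value = Qty * Price * (1 - looked-up order discount)."""
    return qty * price * (1 - ORDER_DISCOUNT[customer_class])

print(order_value(10, 2.0, "wholesale"))  # 15.0
```

Centralizing the rule this way means a changed discount schedule is a one-line edit in the rule tier, not a hunt through scattered application code.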

“When you begin to build systems, it’s key to think of business rules as the building blocks,” said Hugh Ryan, director of new-age architectures for Andersen Consulting Inc., in Chicago.

“They are the asset of the application,” echoed Continue reading

Posted in Servers | Leave a comment

Outside Project Managers Can Turbocharge Collaboration

You know the old saying, “love is blind.” But the truth is that couples who don’t establish clear roles are headed for a wake-up call reminiscent of the infamous Al and Peg Bundy of the “Married with Children” sitcom. Al’s lack of physical and fiscal prowess is a constant frustration to his wife, while Peg’s irresponsible spending and refusal to play homemaker leave him cold.

The Bundys could have used a matchmaker.

What’s this have to do with client/server development? Everything. When building enterprise systems, either partner — the users or the developers — can easily be seduced by vague promises of increased productivity only to be crushed when reality settles in.

That’s where a matchmaker, or outside project manager, can play a role. It’s that person’s job to take a cold, hard look at the lovebirds and make sure they understand what they’re getting into.

To get a feel for this kind of relationship, PC Week recently spent a few days with project-manager trainees from BSG Corp. The Houston-based Continue reading

Posted in Admin Tips | 1 Comment