Posts Tagged encryption

FTP “Lack of Security” Exposed

Posted on Monday, 24 January, 2011

FTP was designed as an easy mechanism for exchanging files between computers at a time when networks were new and information security was an immature science. In the 1970s, if you wanted to secure a server from unwanted access, you simply locked the computer room door. User access to data was controlled by the basic User ID and password scenario. (At right is a reminder of how much technology has advanced since the 1970s: the Apollo Project CSM Simulator computers and consoles, photographed December 11, 1975. Photo courtesy of NASA.)

The Internet did not yet exist and the personal computer revolution was still a decade away.

Today, the security of business file transfers is of paramount importance. The exchange of business records between computing systems, between enterprises, and even across international borders has become critical to the global economy.

Yet the original native FTP facility of TCP/IP wasn't designed for the requirements of the modern, globally connected enterprise. FTP's basic security mechanisms, the User ID and password, were made obsolete long ago by advances in network-sleuthing technology, hackers, malware, and the proliferation of millions of network-attached users.

Risks associated with using native (standard) FTP include:

  • Native FTP does not encrypt data.
  • A user's name and password are transferred in clear text at login and can easily be captured by anyone monitoring the network.
  • FTP scripts and batch files leave User IDs and passwords in the open, where they can easily be harvested.
  • FTP alone does not meet compliance regulations (for example, HIPAA, SOX, and state privacy laws).
  • Over an FTP connection, transferred data can "stray" to the wrong remote computer and never arrive at its intended destination, leaving it exposed for third parties or hackers to intercept.
  • Conventional FTP does not natively maintain a record of file transfers.
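The clear-text credential problem in the first two bullets is easy to see on the wire. The sketch below (a hypothetical helper, not part of any FTP library) builds the exact bytes an RFC 959 client sends on the control channel at login; anyone sniffing the connection reads the password verbatim:

```python
# Sketch of FTP's login exchange as it appears on the wire.
# RFC 959 clients send USER and PASS as plain ASCII commands,
# so a packet capture exposes the credentials verbatim.

def ftp_login_bytes(user: str, password: str) -> bytes:
    """Raw control-channel bytes an FTP client transmits at login."""
    return f"USER {user}\r\nPASS {password}\r\n".encode("ascii")

wire = ftp_login_bytes("alice", "s3cret")
print(wire)  # the password is right there in the byte stream
```

No decryption step is needed because there is no encryption step: the protocol itself ships the secret in the clear.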

The first step is to examine how FTP is being used in your organization. The next is to identify how your organization needs to manage and secure everyone's file transfers. The final step is to evaluate what type of managed file transfer product your company needs.

For more information, download our white paper, Beyond FTP: Securing and Managing File Transfers.


Posted on Wednesday, 6 October, 2010

According to a survey of 155 Qualified Security Assessors (QSAs) conducted by the Ponemon Institute, 60 percent of retailers lack the budgets to be fully compliant with the PCI DSS standards. As an example, the annual audit cost for a major retailer can be as high as $225,000.

According to the Ponemon Institute survey, restricting access to card data on a “need-to-know basis” (PCI DSS Requirement #7) is still the most important PCI DSS requirement, but also the most difficult to achieve.

QSAs reported that the three most common business reasons for storing cardholder data are:

  • Handling charge-backs
  • Providing customer service
  • Processing recurring subscriptions

To service these customers' requirements, the credit card data must remain available to the various software applications. These industry processes therefore require merchants to implement methods of protecting cardholder data from theft.

Encryption: the Best Technology

QSAs find the most significant threats to cardholder data are in merchant networks and databases. They believe firewalls, encryption for data at rest, and encryption for data in motion are the top three most effective technologies for achieving compliance.

Sixty percent of QSAs believe encryption is the best means to protect card data end-to-end. Forty-one percent of QSAs say that controlling access to encryption keys is the most difficult management task their clients face.

Getting a Handle on PCI Issues

So what’s the best way to both satisfy the requirements of PCI and still make secured data transparent to applications?

The strategy QSAs recommend is to lock down the cardholder data with technologies that:

  1. Restrict access to the data
  2. Encrypt the data
  3. Manage and control the encryption keys

These recommendations point to a need to make encryption and encryption-key access an integral part of the overall information system.

But too many organizations use ad hoc encryption and decryption utilities that slow processing and often leave decrypted data in the open. Worse, without an integrated encryption-key management process there is effectively no security at all: unsecured encryption keys, just like data, can be lost, stolen, and misused. Access to those keys should be managed as an integral part of the overall security of the operating system.
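To make the separation of data, cipher, and keys concrete, here is a minimal Python sketch. The HMAC-based XOR keystream is a toy construction for illustration only (a real deployment would use a vetted cipher such as AES-GCM); the point is that the key comes from its own access-controlled source and never sits alongside the data:

```python
import hashlib
import hmac
import secrets

def generate_key() -> bytes:
    # In a real system this key would live in a key-management store
    # with its own access controls -- never alongside the data.
    return secrets.token_bytes(32)

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream derived with HMAC-SHA256 in counter mode.
    # Illustration only -- use a vetted cipher (e.g. AES-GCM) in production.
    blocks = []
    counter = 0
    while sum(len(b) for b in blocks) < length:
        blocks.append(hmac.new(key, counter.to_bytes(8, "big"),
                               hashlib.sha256).digest())
        counter += 1
    return b"".join(blocks)[:length]

def crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = generate_key()
ciphertext = crypt(key, b"4111-1111-1111-1111")
assert crypt(key, ciphertext) == b"4111-1111-1111-1111"
```

Applications call `crypt` without ever handling the key material directly, which is exactly the "restrict access, encrypt, manage the keys" pattern the QSAs recommend.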

The point is that the QSAs' three recommendations go beyond the basic requirements of the PCI standard: they actually secure the credit card data at the host and ensure that the data isn't misused, whether at rest or in transit.

Linoma Software’s data encryption suite Crypto Complete successfully addresses these QSA PCI requirements by providing data encryption and key management services that can be integrated seamlessly with IBM i (iSeries) applications.

Building on PCI-DSS V2

Industry security analysts still complain that PCI-DSS should be a true security standard aimed at protecting cardholder data, and that Version 2.0 doesn't deliver that value. Consequently, we need to analyze what the QSAs are recommending and then build on PCI-DSS Version 2.0 to implement the best possible security for our customers' credit card data.

Thomas Stockwell

Thomas M. Stockwell is one of Linoma Software's subject matter experts and a top blogger in the industry. He is Principal Analyst at IT Incendiary, with more than 20 years of experience in IT as a Systems Analyst, Engineer, and IS Director.


Who Insures the Insurer?

Posted on Monday, 2 August, 2010

Do insurance companies maintain Data Security Breach Insurance?

On June 23, 2010, more than 200,000 Anthem Blue Cross customers received letters informing them that their personal information might have been accessed during a security breach of the company's website. Customers who had pending insurance applications in the system are being contacted because information was viewed through an online tool that allows users to track the status of their applications. Social Security and credit card numbers were potentially exposed. It's one more tumble in a cascade of security breaches that can have terrible consequences for the customers and clients of such a large insurance company.

And of course, this raises an ironic question: do insurance companies maintain their own liability insurance in the event that their information systems are compromised? As absurd as it may seem at first glance, it's really not a laughing matter. According to the Ponemon Institute, the average cost of a security breach now exceeds $200 per client record. At that rate, Anthem Blue Cross's breach of more than 200,000 records last month created a liability as great as $40 million.

Moreover, there’s a ripple effect to organizations that do business with insurance companies that suffer such an information security breach.  Each Personnel Department that delivers private employee information to an outside service supplier has an inherent responsibility and liability to its employees.

We all know that private information transferred between companies should travel by a secure, confidential method. Yet too many small and medium-sized companies are still using simple FTP (File Transfer Protocol) software that has been proven susceptible to network hackers. By the time these organizations realize their vulnerability, it's often too late, and many of these FTP transfers happen below the radar of their IT departments. How does that happen?

Often, personnel data is off-loaded from the main information systems to PCs, where it is left "in the open" on desktop or laptop hard drives. After the transfer, this residual data is frequently unprotected and subject to theft or secondary security flaws. Insurance agents, whose job is to facilitate the processing of this data with their insurance providers, can also suffer from such breaches. The loss of an agent's laptop, through theft, accident, or routine use of USB thumb drives, poses additional liability.

There are two readily available strategies to help prevent these kinds of security failures. The first is to use data encryption technology that not only encrypts the data but also records, in a secure log, when, where, and by whom the sensitive data was moved from the main information database. Linoma's Crypto Complete offers precisely this kind of encryption capability, and IT professionals should examine it as a viable, highly configurable resource for protecting the company's information assets.
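The "secure log" half of that first strategy can be as simple as an append-only audit trail recording who moved what, when, and from where. The sketch below is a generic illustration; the field names and helper are assumptions for this example, not any product's actual log format:

```python
import io
import json
import time

def log_data_movement(logfile, who: str, host: str, action: str, dataset: str):
    # Append one audit record per movement of sensitive data.
    # Keeping the log append-only and access-controlled (or feeding it
    # to a SIEM) protects the trail itself from tampering.
    entry = {
        "when": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "who": who,
        "where": host,
        "action": action,
        "dataset": dataset,
    }
    logfile.write(json.dumps(entry) + "\n")

# Example: record an encrypted export of payroll data.
audit = io.StringIO()
log_data_movement(audit, "jsmith", "hr-server-01", "encrypt+export", "payroll.csv")
print(audit.getvalue().strip())
```

One JSON line per event keeps the trail easy to search when an auditor asks exactly who touched the data and when.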

The second strategy is to use a secure method of transfer for the data itself, ensuring that the information is never left in a vulnerable state on an individual's personal computer. By removing direct FTP access from employees' PCs and channeling every transfer through a secure corporate server, IT can close off this avenue of network hacking. Linoma's GoAnywhere Director is precisely the means of achieving secure FTP transfers between companies.

The Anthem Blue Cross breach was the result of a faulty security scheme in the design of its customer-service solution. But it is not the only potential failure of data security that can impact customers and business partners. And, unfortunately, it is just one of the 356 million reported breaches that have occurred in the US over the last five years.

So who insures the insurer when a data security breach occurs?  The real answer is IT itself.  And helping IT achieve a better result will be the subject of this blog over the next few months.

Thomas Stockwell


Copyright ©1994 - 2015 Linoma Software  |  All rights reserved