Posts Tagged Sensitive Data

How Managed File Transfer Changed My Life

Posted by on Tuesday, 24 January, 2012
In addition to being one of Linoma Software’s expert bloggers, Daniel Cheney is also in the IT trenches, and it was there that he first discovered the impact that switching to a managed file transfer solution had on his daily work life.
_ _ _ _ _ _ _ _ _ _

As a technology administrator at a major healthcare administration company, I found that sending and receiving sensitive files between various systems was a daily grind and a consistent source of stress. We were using PC-based freeware FTP tools and the built-in FTP functions on the IBM iSeries. The best we could do with scripting was to use CL command scripts to call the FTP function with hard-coded login information. RPG programs would then call the CL scripts to retrieve and send the needed files, but there was insufficient logging and alerting for such automated activities.
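For flavor, here is a minimal Python sketch (with hypothetical host, account, and file names, not the author's actual CL source) of the kind of batch FTP script described above, with the login hard-coded in clear text and no logging or alerting:

```python
# Illustrative only: builds the command list a batch FTP client would replay,
# the way a CL script hard-codes its login and PUT. Host/paths are made up.
def build_ftp_script(host, user, password, local_file, remote_file):
    return "\n".join([
        f"open {host}",
        user,          # credentials embedded in clear text
        password,      # anyone who can read the script can read these
        "binary",
        f"put {local_file} {remote_file}",
        "quit",        # no success/failure check, no log, no alert
    ])

script = build_ftp_script("ftp.example.com", "ACCT01", "secret",
                          "claims.dat", "/inbound/claims.dat")
print(script.splitlines()[0])  # -> open ftp.example.com
```

The CL approach described above behaved much the same way: the credentials lived in the source, and nothing recorded whether the transfer actually succeeded.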

The biggest headache for me was that these scripts, and the resulting file transfers, had to be error-free and reliable. Add to that the pressure of knowing how critical exchanging files is to the operation of the business, and the challenge of having a single person responsible for its success, and it all became an unrealistic expectation. On top of this, because most of these files were sent over the Internet, and because of the inadequate tools we had at hand, the security of our FTP processes was insufficient.

I knew it was time to find a better solution, and after evaluating the available managed file transfer products for the IBM iSeries, I selected GoAnywhere™ Director from Linoma Software.

Our installation of GoAnywhere Director made a huge difference almost immediately.

First, Director provides all the security protocols we need, including SFTP, FTPS, and standard FTP with PGP encryption. It also has powerful scripting functions for HTTP and HTTPS sessions, which automate logins to partner sites for file transfers.

Director makes it possible to automate all of the company’s file transfers on a schedule, with a log of the path and time of every transaction. Alerts are automatically sent to us if there are any problems or, if we wish, every time a transfer succeeds. Responsibility can be distributed to various departments as needed, whether to receive these alerts or to initiate the transfers when ready.

The simple-to-navigate web interface makes it easy for any user to view, verify, change, and execute these file transfers. The scripting is easy for the average user to set up. And whenever we run into challenges with our file transfer processes, Linoma support has been extremely effective at helping me reach a successful execution.

I know how frustrating it can be to initiate, monitor, and track the ever-increasing number of file transfers my company requires, especially without an all-in-one tool like managed file transfer. It amazes me how many IT people still don’t realize there’s a better way to do things, one that gives them more control and more time to devote to all the other projects demanding their attention. I know managed file transfer, and specifically GoAnywhere Director, changed my life at work. I hope more of my IT colleagues discover the advantages soon.

Daniel Cheney

Daniel has been the IT Director at a healthcare company for the last 12 years and a longtime beneficiary of GoAnywhere Director and the IBM i platform. He is also a freelance writer for various technical and social media projects.


Managed File Transfer Streamlines HIPAA/HITECH Complexity

Posted by on Monday, 9 May, 2011

Managed File Transfer (MFT) systems are great for policy enforcement, access authentication, risk reduction, and more. But for HIPAA and HITECH requirements, MFT shines as a work-flow automation tool.

MFT as the B2B Enabler

It shines because Managed File Transfer systems are actually automation platforms that help companies streamline the secure transfer of data between business partners. How? By removing many of the configuration steps traditionally required for complex Business-to-Business (B2B) processes, MFT keeps those processes straightforward and manageable.

Transferring patient information is a difficult challenge that many healthcare institutions face. Data standards were supposed to simplify this communication between healthcare institutions and their partners. But ask any technical professional about the underlying variability of data formats, and you’ll hear a tale of confusion and complexity.

Nightmares of Compliance

The HITECH provisions that strengthen HIPAA mandate the security and privacy of healthcare records and strongly encourage the use of data encryption. These records may travel among various healthcare-related partners, including hospitals, clinics, payment processors, and insurers. Each partner may require its own unique data format, and each may prefer a different encryption technique or transport protocol.

Given these differing requirements, adding each new trading partner has traditionally required in-house programming or manual processes, which has become hugely inefficient. Furthermore, if the new partner connection is not implemented properly, it can create errors that lead to data exposures. Any exposure could move the healthcare institution out of HIPAA/HITECH compliance and cost it severely.

Simplifying and Integrating Information Transfer

A Managed File Transfer (MFT) solution can significantly reduce the potential for errors and automate those processes. With a good MFT solution, any authorized person should be able to quickly build transfer configurations for each healthcare business partner, selecting strong encryption and secure transport methods (e.g., OpenPGP, SFTP, FTPS, HTTPS) based on the partner’s requirements so that HITECH requirements are met. At the same time, an MFT solution creates a visible audit trail to ensure that compliance is sustained.
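The idea of per-partner transfer configurations can be sketched generically. The following is a hypothetical illustration (not GoAnywhere’s actual API): each partner declares its preferred transport and encryption, and one dispatcher honors them and records an audit entry.

```python
# Hypothetical per-partner transfer configuration table. Partner names,
# formats, and protocol choices are invented for illustration.
PARTNERS = {
    "hospital_a": {"transport": "SFTP", "encryption": None},       # encrypted in transit
    "insurer_b":  {"transport": "FTPS", "encryption": None},
    "payer_c":    {"transport": "FTP",  "encryption": "OpenPGP"},  # encrypt file, then send
}

def plan_transfer(partner, filename):
    """Return the ordered steps for one transfer, ending with an audit entry."""
    cfg = PARTNERS[partner]
    steps = []
    if cfg["encryption"]:
        steps.append(f"encrypt {filename} with {cfg['encryption']}")
    steps.append(f"send {filename} via {cfg['transport']}")
    steps.append(f"log transfer of {filename} to {partner}")  # audit trail
    return steps

print(plan_transfer("payer_c", "claims.x12"))
# -> ['encrypt claims.x12 with OpenPGP', 'send claims.x12 via FTP',
#     'log transfer of claims.x12 to payer_c']
```

The point of the design is that onboarding a new partner means adding one configuration entry, not writing new transfer code.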

But, perhaps just as important, a good Managed File Transfer solution is constructed as a modular tool that can be easily integrated into existing software suites and workflow processes. In fact, a good MFT is like a pluggable transfer platform that brings the variability of all kinds of B2B communications under real management.

Now extend the MFT concept beyond the healthcare sector into manufacturing, finance, distribution, and so on. Suddenly MFT isn’t a niche utility but a productivity and automation tool with myriad uses in B2B environments.

A Day-to-day Technical Solution

Perhaps this is why Gartner has identified Managed File Transfer as one of the key technologies that will propel businesses in the coming years. It’s more than just a utility suite: it’s a system that can be used over and over as an integral part of an organization’s solutions for automating and securing B2B relationships. In other words, MFT isn’t just for specialized compliance requirements; it’s a linchpin of efficient B2B communications technology that can bring real cost savings to every organization.

Healthcare Case Study Utilizing a MFT Solution: Bristol Hospital Takes No Risks with Sensitive Data

Thomas Stockwell

Thomas M. Stockwell is one of Linoma Software's subject matter experts and a top blogger in the industry. He is Principal Analyst at IT Incendiary, with more than 20 years of experience in IT as a Systems Analyst, Engineer, and IS Director.


Encrypting Files with OpenPGP

Posted by on Monday, 11 April, 2011

When our users send a file over the Internet, there are really just a few things that seem important to them at the time:

a) Is the file complete?

b) Is it being sent to the right place?

c) Will it arrive intact?

And, if the data is sensitive:

d) Will the intended recipient (and only that recipient) be able to use it?

That’s where encryption comes in: By scrambling the data using one or more encryption algorithms, the sender of the file can feel confident that the data has been secured.

But what about the file’s recipient? Will she/he be able to decode the scrambled file?

Encryption, Decryption, and PGP

For years, PGP has been one of the most widely used technologies for encrypting and decrypting files. PGP stands for “Pretty Good Privacy,” and it was developed in the early 1990s by Philip Zimmermann. Today it is considered one of the safest cryptographic technologies for signing, encrypting, and decrypting texts, e-mails, files, directories, and even whole disk partitions.

How PGP Works

PGP encryption employs a serial combination of hashing, data compression, symmetric-key cryptography, and, finally, public-key cryptography. Each step uses one of several supported algorithms. The resulting public key is bound to a user name and/or an e-mail address. Current versions of PGP employ both the original “Web of Trust” authentication method and the X.509 specification of a hierarchical Certificate Authority method to ensure that only the right people can decode the encrypted files.
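As a toy illustration only (tiny textbook RSA numbers and a SHA-256 keystream stand in for real OpenPGP algorithms, the hashing/signing step is omitted, and none of this is secure), the compress-then-encrypt pipeline looks like this in Python:

```python
import hashlib
import os
import zlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data against a SHA-256 counter keystream."""
    out, counter = bytearray(), 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Tiny textbook RSA keypair: n = 61 * 53 = 3233, e = 17, d = 2753.
N, E, D = 3233, 17, 2753

message = b"PATIENT RECORD 001 " * 10
session_key = os.urandom(16)                         # one-time symmetric key

compressed = zlib.compress(message)                  # 1. compress
ciphertext = keystream_xor(session_key, compressed)  # 2. symmetric-encrypt data
wrapped = [pow(b, E, N) for b in session_key]        # 3. wrap key with public key

# Recipient: unwrap the session key with the private key, then reverse.
recovered_key = bytes(pow(c, D, N) for c in wrapped)
plaintext = zlib.decompress(keystream_xor(recovered_key, ciphertext))
assert plaintext == message
```

Real PGP uses algorithms such as AES and full-size RSA or ElGamal keys, but the order of operations is the same: compression first, then a fresh symmetric session key for the bulk data, then public-key wrapping of that session key for the recipient.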

Why are these details important for you to know?

Growing Pains for PGP

PGP has gone through some significant growing pains, including a widely publicized criminal investigation by the U.S. Government. (Don’t worry! The Federal investigation was closed in 1996 after Zimmermann published the source code.)

One result of PGP’s growing pains has been fragmentation: earlier versions of the technology sometimes cannot decode files produced by the more recent versions deployed within various software applications. This versioning problem was exacerbated as ownership of the PGP technology was handed from one company to another over the last 20 years.

And yet, because PGP is such a powerful tool for ensuring privacy in data transmission, its use continues to spread far more quickly than other commercially owned encryption technologies.

Fragmentation and the Future of PGP

So how is the industry managing the issue of PGP fragmentation? The answer is the OpenPGP Alliance.

In January 2001, Zimmermann started the OpenPGP Alliance, establishing a working group of developers who are seeking the qualification of OpenPGP as an Internet Engineering Task Force (IETF) Internet Standard.

Why is this important to you? By establishing OpenPGP as an Internet Standard, fragmentation of the PGP technology can be charted and, to a large degree, controlled.

This means that an encrypted file destined for your system will use a documented, standardized encryption technology that can be appropriately decrypted. This standardization helps ensure privacy, interoperability between different computing systems, and a clear path for securely interchanging data.

The OpenPGP Standard and Linoma Software

OpenPGP has now reached the second stage in the IETF’s four-step standards process and is currently seeking Draft Standard status. (The standards document for OpenPGP is RFC 4880.)

Linoma Software uses OpenPGP in its GoAnywhere Director Managed File Transfer solution. Just as importantly, Linoma Software is an active member of the OpenPGP Alliance, contributing to the processes that will ensure that OpenPGP becomes a documented IETF Internet Standard. This will ensure that your investment in Linoma’s GoAnywhere managed file transfer software remains current, relevant, and productive.

For more information about OpenPGP and the OpenPGP Alliance, go to http://www.openpgp.org. To better understand how OpenPGP can help your company secure its data transfers, check out Linoma Software’s GoAnywhere Director managed file transfer (MFT) solution.

Thomas Stockwell

Thomas M. Stockwell is one of Linoma Software's subject matter experts and a top blogger in the industry. He is Principal Analyst at IT Incendiary, with more than 20 years of experience in IT as a Systems Analyst, Engineer, and IS Director.


Who is Protecting Your Health Care Records?

Posted by on Monday, 7 March, 2011

Patient Privacy in Jeopardy

How important is a patient’s privacy? If your organization is a health care facility, the instinctive answer is “Very important!” After all, privacy is the foundation upon which the doctor/patient relationship is built. Right?

But the real answer, when it comes to patient data, may surprise you. According to a study released by the Ponemon Institute, “patient data is being unknowingly exposed until the patients themselves detect the breach.”

The independent study, entitled “Benchmark Study on Patient Privacy and Data Security” and published in November 2010, examined the privacy and data protection policies of 65 health care organizations in light of the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009. HITECH requires health care providers to implement stronger safeguards for patient data and to notify patients when their information has been breached.

Patient Data Protection Not a Priority?

According to the study, seventy percent of hospitals say that protecting patient data is not a top priority. Most at risk are billing information and medical records, which are not being adequately protected. More significantly, there is little or no oversight of the data itself: patients are often the first to detect breaches and end up notifying the health care facility themselves.

The study reports that most health care organizations do not have the staff or the technology to adequately protect their patients’ information. The majority (67 percent) say that they have fewer than two staff members dedicated to data protection management.

And perhaps because of this lack of resources, sixty percent of the organizations in the study had more than two data breaches in the past two years, at a cost of almost $2 million per organization. The estimated cost to our health care systems is over $6 billion per year.

This begs the question: Why?

HITECH Rules Fail to Ensure Protection

HITECH encourages health care organizations to move to Electronic Health Records (EHR) systems to help better secure patient data. And, indeed, the majority of the organizations in the study (89 percent) said they have either fully implemented EHR or plan to do so soon. Yet the HITECH regulations to date do not seem to have diminished security breaches at all, and the Ponemon Institute’s study provides a sobering evaluation:

Despite the intent of these rules (HITECH), the majority (71 percent) of respondents do not believe these new federal regulations have significantly changed the management practices of patient records.

Unintentional Actions – The Primary Cause of Breaches

According to the report, the primary causes of data loss or theft were unintentional employee action (52 percent), lost or stolen computing device (41 percent) and third-party mistakes (34 percent).

Indeed, it would seem that, with the use of EHR systems, technologies should be deployed to prevent these unintentional breaches. And while 85 percent believe they comply with the loose legal privacy requirements of HIPAA, only 10 percent are confident that they can protect patient information when it is used by outsourcers and cloud computing providers. More significantly, only 23 percent of respondents believed they were capable of restricting physical access to data storage devices and servers.

The study lists 20 commonly used technology methodologies encouraged by HITECH and deployed by these institutions, including firewalls, intrusion prevention systems, monitoring systems, and encryption. The confidence these institutions place in each technology is also listed. Firewalls are the top choice for both data breach prevention and compliance with HIPAA. Also popular for accomplishing both are access governance systems and privileged user management. Respondents favor anti-virus and anti-malware for data breach prevention and, for compliance with HIPAA, encryption of data at rest.

The Value of Encryption

The study points to the value of encryption technologies, for both compliance purposes and the prevention of unintended disclosure, and this value is perceived as particularly high by the study’s participants: 72 percent see encryption as a necessary technology for compliance, even though only 60 percent currently deploy it for data breach prevention. This identified need for encryption falls just behind the use of firewalls (78 percent) and access governance (73 percent).

Encryption of data at rest is one of the key technologies that HITECH specifically identifies: an encrypted file cannot be accidentally examined without the appropriate credentials. In addition, some encryption packages, such as Linoma’s Crypto Complete, monitor and record when and by whom data has been examined. These safeguards permit IT security to audit the use of data so that, should an intrusion occur, the scope and seriousness of the breach can be assessed quickly and confidently.

So how important is a patient’s privacy? We believe it’s vitally important. And this report from the Ponemon Institute should make good reading to help your organization come to terms with the growing epidemic of security breaches.

Read how Bristol Hospital utilizes GoAnywhere Director to secure sensitive data.

Thomas Stockwell

Thomas M. Stockwell is one of Linoma Software's subject matter experts and a top blogger in the industry. He is Principal Analyst at IT Incendiary, with more than 20 years of experience in IT as a Systems Analyst, Engineer, and IS Director.


1.800.949.4696  |  sales@linomasoftware.com  |  privacy policy
Copyright ©1994 - 2012 Linoma Software  |  All rights reserved