Archive for the ‘DLP’ Category

What is the True Cost of a Data Breach?

Wednesday, May 14th, 2014

This week, TITUS released an infographic that contains some sobering figures about the true cost of a data security breach. While lost data can mean lost intellectual property (which is hard to value accurately), it almost certainly includes fines, expensive customer communications, lawsuits, and the cost of re-evaluating technology and policy.  All told, the Ponemon Institute’s 2014 Cost of a Data Breach Study pegs the cost of a lost record in the US at $195 – up from $188 in 2013.
(more…)

The Evolution of Classification

Wednesday, October 23rd, 2013

Last week, the lead whitepaper in TechTarget’s Daily Top 5 was titled, How to Tackle Information Classification – published by the Jericho Forum. Naturally, I was interested to see what it had to say and eagerly downloaded it, only to find that it was originally published in January 2009 – almost 5 years ago. Despite its age, the whitepaper is a solid introduction to information classification, its benefits, and its challenges. In particular, it provides confirmation that classification is the linchpin of successful security in a “de-perimeterised environment.” But there were a few areas where it was a bit, shall we say, “stale.” The Jericho Forum whitepaper identified some problems which, in the years since it was published, have been successfully addressed.

Let’s look at the three main problem areas that the Jericho Forum whitepaper identified: (more…)

Cloud Data Security…Are You Worried about the Cost?

Friday, February 24th, 2012

According to a survey by the research group Ponemon Institute, recently cited in an InformationWeek article, 91% of federal IT workers are either somewhat or very familiar with the Office of Management and Budget’s Cloud First initiative; however, 69% believe that the initiative’s requirement to move three services to the cloud within 18 months is too fast. In fact, 71% of respondents said that pressure to move to the cloud creates security risks for their organizations.

The government’s move to cloud computing has continued to gain momentum throughout the Obama administration. Numerous efforts, including Cloud First and the FedRAMP security authorization initiative, have been set up to accelerate that move. And at TITUS we are working with various government agencies to better understand their cloud data security requirements.

(more…)

Why Isn’t My DLP Investment Paying Off?

Wednesday, January 4th, 2012

It’s a common scenario: a large organization invests millions of dollars in a DLP solution, only to leave it in “watch mode” because the rate of false positives is too high to enable full blocking. The result is a DLP investment that becomes a white elephant: a promising technology that does not pay off in actually preventing data loss.

The problem often begins with an over-reliance on automated scanning to prevent data loss. The DLP system is expected to automatically identify all sensitive content, which requires IT administrators to translate business processes and policies into automated rules for every data loss scenario. This is an impossible task, which usually results in overly restrictive rules that block non-sensitive data (false positives) or overly permissive rules that mistakenly release sensitive data (false negatives).
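To make this concrete, here is a minimal sketch of a purely automated, content-scanning rule of the kind described above. The pattern and function names are hypothetical, not drawn from any particular DLP product; the point is simply how a rule tuned to one data format produces both false positives and false negatives.

```python
import re

# Hypothetical automated DLP rule: flag anything that looks like a
# US Social Security Number written in the common dashed format.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def is_sensitive(text: str) -> bool:
    """Return True if the automated rule would block this content."""
    return bool(SSN_PATTERN.search(text))

# False positive: a part number that merely resembles an SSN gets blocked,
# disrupting a legitimate business task.
print(is_sensitive("Order part 123-45-6789 from the catalog"))   # True

# False negative: a real SSN written without dashes slips through.
print(is_sensitive("Customer SSN: 123456789"))                   # False
```

Tightening the pattern to cut the false positive tends to widen the false negative, and vice versa – which is why rule-only approaches so often end up stuck in “watch mode.”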

The impact of false positives can be just as detrimental to the business as the data loss caused by false negatives. False positives disrupt business agility and productivity, and can impact collaboration, innovation, and business growth. Worse, false positives can actually lead to increased data loss, as users look for alternative, less secure methods to get around restrictions and carry out their business tasks.

The best way to address this problem is for organizations to classify their information appropriately. The sensitivity of each piece of information must be identified, or ‘classified’. Information classification is crucial for proper handling, and for the ultimate security of an enterprise’s information. Classification provides context to unstructured data such as email and business documents, making it possible for DLP solutions to know how to protect your organization’s sensitive information. (more…)
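As a sketch of the idea, once content carries a classification label, a DLP gateway can make a simple, reliable decision from that metadata instead of guessing from content alone. The labels and policy table below are assumptions for illustration, not any vendor’s actual scheme.

```python
# Hypothetical policy: map each classification label to a handling rule.
# Unknown labels fall back to the most restrictive behavior.
POLICY = {
    "Public": "allow",
    "Internal": "block-external",
    "Confidential": "block-external",
}

def dlp_decision(classification: str, external_recipient: bool) -> str:
    """Decide whether to release content based on its classification label."""
    action = POLICY.get(classification, "block-external")
    if action == "allow":
        return "allow"
    # Restricted content may circulate internally but not leave the organization.
    return "block" if external_recipient else "allow"

print(dlp_decision("Public", external_recipient=True))        # allow
print(dlp_decision("Confidential", external_recipient=True))  # block
```

Because the decision keys off an explicit label supplied when the content was created, the rule is unambiguous – no pattern matching, and far less room for false positives.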

Top Data Security Blog Posts for 2011: Data Classification, Mobile Security, Data Security and Compliance, Data Loss Prevention, and Cloud Data Security

Wednesday, December 28th, 2011

As 2011 draws to a close, I thought it would be interesting to provide a list of the most popular data security articles on this blog. Here are the topics and articles that were most popular with our readers:

1) Data Classification

More and more commercial organizations have started to see data classification as the foundation of their information protection strategy. We wrote several articles about this trend, including an article that described how to implement a data classification policy in 5 simple steps, and an article that recommended best practices for defining a data classification scheme. Readers were also interested in how to use classification software to bulk classify, mark, and label large numbers of files.
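For readers curious what bulk classification looks like in practice, here is a minimal sketch. Real classification tools embed labels in document metadata; the CSV manifest used here is purely an assumption for illustration, as are the function and file names.

```python
import csv
from pathlib import Path

def bulk_classify(root: str, label: str, manifest: str) -> int:
    """Record a classification label for every file under root.

    Writes a simple path/label manifest and returns the number of
    files labeled. The file list is collected up front so the
    manifest itself is never counted.
    """
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    with open(manifest, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "classification"])
        for path in files:
            writer.writerow([str(path), label])
    return len(files)

# Example: label everything under a shared drive as Internal.
# count = bulk_classify("/shares/finance", "Internal", "labels.csv")
```

The same loop structure extends naturally to stamping labels into document properties or applying visual markings, which is where dedicated classification software adds value over a script.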

2) Mobile Security

Mobile security has become a hot topic, especially with the trend toward consumerization of mobile devices. (more…)