Saturday, November 27, 2010
Information Assurance in Estonia
David Haydter
When MSIA graduate David Haydter returned to his home in Virginia after Residency 2010, he mentioned in an email that he’d received an award for his IT/IA work in Estonia. We asked him for details, and here’s his story.
As for the award, there were three parts to it. As the Information Management Officer (IMO) at American Embassy Tallinn, Estonia, I supervised six people and was ultimately responsible for all IT, IT security, and communications at the Embassy. The Department of State uses software to monitor all its subnets (over 400 of them), collect metrics, and determine the overall network risk and health of each subnet. Tallinn consistently ranked number one out of all the Department's domestic offices and foreign missions for its network risk score. This means our patches are applied immediately, our virus definitions are up to date, our security templates are current, and permissions, user accounts, registry settings, and the like are all exactly where they should be. My staff deserves the credit, and I wrote them a separate award, which was presented by our Ambassador.
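[ed. The Department's scoring methodology is not public, but a metrics-driven subnet risk score of the kind David describes can be sketched in a few lines of Python. All of the weights, thresholds, and metric names below are invented purely for illustration.]

```python
# Hypothetical illustration of a weighted subnet risk score.
# Weights, thresholds, and metric names are invented; the Department
# of State's actual scoring methodology is not public.

from dataclasses import dataclass

@dataclass
class HostMetrics:
    days_since_last_patch: int
    days_since_av_update: int
    template_compliant: bool   # security template matches baseline
    registry_compliant: bool   # registry settings match baseline

def host_risk(h: HostMetrics) -> float:
    """Return a 0.0 (clean) to 1.0 (worst) risk score for one host."""
    score = 0.0
    score += 0.4 * min(h.days_since_last_patch / 30, 1.0)  # stale patches
    score += 0.3 * min(h.days_since_av_update / 7, 1.0)    # stale AV definitions
    score += 0.2 * (0.0 if h.template_compliant else 1.0)
    score += 0.1 * (0.0 if h.registry_compliant else 1.0)
    return score

def subnet_risk(hosts: list[HostMetrics]) -> float:
    """Average host risk across the subnet; lower is better."""
    return sum(host_risk(h) for h in hosts) / len(hosts)

if __name__ == "__main__":
    subnet = [
        HostMetrics(1, 0, True, True),
        HostMetrics(3, 1, True, True),
    ]
    print(f"subnet risk: {subnet_risk(subnet):.3f}")  # near 0 = well managed
```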
The second part was for the installation of a new messaging (a.k.a. telegrams or cables) system, which replaced our legacy system from the early 1990s. Embassy Tallinn was the only mission to successfully install, learn, and administer the application, and train its users, without flying in outside support. The third part of the award was not IT or IA related. The award was a Superior Honor Award, which had to be approved in Washington (as opposed to other awards, which can be approved at the Embassy level) and signed by an assistant secretary. Normally, it would then be presented by our Ambassador. In this case, Secretary Clinton was in town for a NATO summit, and the Ambassador arranged for her to present it to me.
I was in Estonia for three years, from 2007-2010. It is a beautiful country and has come a long way since the fall of communism. It sits against the Baltic Sea only 50 or so kilometers south of Finland and has a magical old town full of medieval castles. The winters are very cold, but I know Vermont gets its fair share too! Being so far north, the days are extremely short in the wintertime (only about five hours of daylight) but long in the summer. The sun doesn't set until around 11:00 p.m. in June and July, and it never gets completely dark. The song festival happens every five years; it just happened this year and was a great experience. [ed. Learn more at http://www.estemb.org/estonia/estonian_song_and_dance_festival ]. There's also a lot of knitting. There are several shops, mostly in old town, that sell knitted items to the many tourists who come to town.
There was no base, just the Embassy in the middle of the city. We lived among the Estonians and were immersed in their culture. Very few State Department missions (embassies, consulates, U.S. missions) have a base-type setup. Our neighbors are typically citizens of the host nation, and we eat, shop, and play just as they do. This is what makes the Foreign Service so interesting.
Last Week’s Quiz Question
In what year was the Ticonderoga moved to the Shelburne Museum?
Answer: 1955
Winner: William R. Lampe
This Week’s Quiz Question
In what year did Norwich University start the first Civil Engineering program in the US?
Past winners |
Andrey N. Chernyaev: 5 wins |
Bill Lampe: 4 wins |
Matt Bambrick: 3 wins |
Dianne Tarpy: 2 wins |
Scott Madden: 2 wins |
Sam Moore |
Autumn Crossett |
Gil Varney, Jr. |
Glen Calvo |
Thomas Reardon |
Sherryl Fraser |
Srinivas Chandrasekar |
Marc Ariano |
Linda Rosa |
Joanna D'Aquanni |
Srinivas Bedre |
Christian Sandy |
Joseph Puchalski |
Ken Desforges |
William Westwater |
Saturday, November 6, 2010
Business Continuity—Fact or Fiction?
John Orlando, Program Director
I’ve always been bothered by the nagging suspicion that the body of knowledge in the business continuity field is more fiction than fact. Let me explain.
Think about people’s opinion of the shape of the world. For most of human history people believed that the world was flat. The view that the world is round didn’t come into vogue until relatively recently (OK, technically it’s more of a sphere, but you get the idea).
We consider the change in belief from flat-earth to round-earth to constitute a move from ignorance to insight. This is because we consider the belief that the world is round(ish) to better match the reality of the shape of the world than the belief that the world is flat. Science is “world-guided” in this way: the objects it studies are out there in the world, and so there is a fact of the matter against which its beliefs can be judged.
This does not mean that all scientific beliefs are true. In fact, most scientific beliefs eventually turn out to be false. Consider the aforementioned belief that the world was flat. The fact that everyone believed the world was flat at one time did not make the world flat; it just meant that a lot of people were wrong about the shape of the world. The truth or falsity of scientific beliefs is not determined by how many people hold them, but rather by their match with reality.
But can the same be said for business continuity beliefs? For instance, we are told that there are four parts to the emergency management process: mitigation, preparedness, response, and recovery. But does the four-part division represent a real division in the emergency management process, or just an arbitrary categorization? Does the emergency management process naturally fall into four parts, rather than five or three, or is it more like the birthday cake that is cut into eight pieces because eight people happened to show up at the party? Is there a sense in which we can say that someone who believes the emergency management process has three, or five, steps is wrong, or are they just not buying into the categorization system that everyone else adopts?
Now, even if business continuity concepts are created, rather than discovered, they would still have value. Conventions facilitate discussion by giving everyone the same language. But the absence of solid evidence supporting business continuity practice hampers the profession’s growth. Years ago it was predicted that insurance companies would provide discounts to organizations with business continuity plans, thus creating incentives to develop continuity plans (and to hire continuity practitioners), but that did not materialize because insurance rates are dictated by actuarial tables. An insurance company might provide you with a discount on your policy for not smoking because statistics show that by not smoking your total healthcare outlays will be X% lower than if you smoked, and the company can pass the savings on to you. But no similar statistics exist for business continuity programs.
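To make the actuarial logic concrete, here is a back-of-the-envelope sketch in Python. Every figure in it is hypothetical, which is precisely the point: without real statistics linking continuity programs to reduced losses, the key variable is a guess.

```python
# Back-of-the-envelope actuarial illustration. All figures are
# hypothetical: no real statistics exist relating business continuity
# programs to loss rates, which is exactly the profession's problem.

annual_outage_probability = 0.05   # chance of a disruptive event per year
loss_per_event = 2_000_000         # uninsured loss without a BC plan
assumed_loss_reduction = 0.40      # *assumed* benefit of a BC plan

expected_loss_without_plan = annual_outage_probability * loss_per_event
expected_loss_with_plan = expected_loss_without_plan * (1 - assumed_loss_reduction)
justifiable_discount = expected_loss_without_plan - expected_loss_with_plan

print(f"Expected annual loss, no plan:   ${expected_loss_without_plan:,.0f}")
print(f"Expected annual loss, with plan: ${expected_loss_with_plan:,.0f}")
print(f"Discount an insurer could justify: ${justifiable_discount:,.0f}")
# An insurer can only offer this discount if assumed_loss_reduction is
# backed by actuarial data, and for BC programs it is not.
```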
Lacking evidence to support its practices, the BC profession is primarily intuition-driven rather than evidence-driven. While intuition is not a bad thing when faced with a lack of evidence (after all, it’s all you’ve got to go on), intuition can be misleading. For instance, at one time doctors treated illnesses by the accepted method of bloodletting. We may laugh today, but at the time this made more intuitive sense in their world-view than belief in germs too small to see. It was only after doctors actually compared the recovery rates of people treated with bloodletting with those treated without it that they discovered, much to their surprise, that the method didn’t work so well after all.
I don’t think that business continuity practice is on par with bloodletting (at least I hope not), but proving that to others is another matter. If we can’t provide the evidence to show that business continuity is a world-guided science, then it becomes hard to justify the expertise of practitioners to the outside world.
This is why we are developing the Norwich University Business Continuity Research Institute. The institute will sponsor research that puts the business continuity field on a firm foundation. Some of the research will compare different methodologies to determine which works best. Other research will simply try to establish that business continuity programs pay off for an organization in the long run.
Our plan is to contract with faculty, students, and alumni to do much of the research, but also to commission outside investigators. We will disseminate the results through face-to-face talks, online classes, social media, and other channels.
Mostly we want to foster an atmosphere of open and critical investigation, one that will ground the field’s common beliefs in evidence and reasoning and expose where those beliefs are mere conventions without firm backing.
The goal is to put business continuity on par with other accepted professions in society. Only then will the field gain legitimacy in the eyes of the outside world.
Please take this as an open invitation to join us in this journey.
Last week’s Quiz Question
Question: In what year did the current Norwich University library open?
Answer: The Kreitzberg Library opened in 1993.
Winner: William R. Lampe
This week’s quiz question
In what year was the Ticonderoga moved to the Shelburne Museum?
Past winners |
Andrey N. Chernyaev: 5 wins |
Matt Bambrick: 3 wins |
Bill Lampe: 3 wins |
Dianne Tarpy: 2 wins |
Scott Madden: 2 wins |
Sam Moore |
Autumn Crossett |
Gil Varney, Jr. |
Glen Calvo |
Thomas Reardon |
Sherryl Fraser |
Srinivas Chandrasekar |
Marc Ariano |
Linda Rosa |
Joanna D'Aquanni |
Srinivas Bedre |
Christian Sandy |
Joseph Puchalski |
Ken Desforges |
William Westwater |
Wednesday, October 20, 2010
Why do we make poor risk-based decisions?
By Donald Holden, CISSP-ISSMP
Everybody makes decisions about accepting risks, often without knowing the actual probability of an adverse outcome. In business, we talk about security being concerned with the management of risk. When we make critical decisions that affect the privacy, safety, or security of individuals or corporate assets, do we base those decisions on objectively determined probabilities or on our perception of risks and rewards, in other words, our gut feeling? Our perception of risk can change when the adverse impact does not happen, even though the actual probability has not changed. People who have been warned to evacuate for a hurricane that then misses them tend to minimize the risk of future hurricane warnings. Or if you own a house, have you installed a burglar and fire alarm? If so, did you do so after a break-in or fire? The probability of a future break-in is the same before and after the event that may have prompted you to install an alarm. Two academic studies provide valuable insight into how we react to perceived risk rather than calculated or statistical risk.
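That last point, that an event does not change the underlying odds, is easy to verify with a short simulation: when break-ins are independent events with a fixed yearly probability, the chance of one next year is the same whether or not one happened this year. The probability used below is invented for illustration.

```python
# Sketch: independent events don't become more or less likely after
# they occur. The break-in probability per year is invented for illustration.

import random

random.seed(42)
P_BREAK_IN = 0.03          # hypothetical yearly break-in probability
TRIALS = 1_000_000

after_break_in = [0, 0]    # [years observed, break-ins the following year]
after_quiet = [0, 0]

for _ in range(TRIALS):
    this_year = random.random() < P_BREAK_IN
    next_year = random.random() < P_BREAK_IN
    bucket = after_break_in if this_year else after_quiet
    bucket[0] += 1
    bucket[1] += next_year

print(f"P(break-in next year | break-in this year) = {after_break_in[1]/after_break_in[0]:.4f}")
print(f"P(break-in next year | quiet this year)    = {after_quiet[1]/after_quiet[0]:.4f}")
# Both converge to 0.03: the event itself doesn't change the odds,
# only our perception of them.
```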
The first study analyzed how we lower our perceived risk after we have experienced a near-miss or close call. A near-miss is defined as an “event that could have been a failure but for luck or chance.” The second study looked at how we postpone mitigating known critical risks in pursuit of short-term gains. Or, in the words of the authors, we have a “psychological bias toward short-term maximization instead of long-term planning—a psychological bias all humans share.” Understanding and then overcoming these human biases in decision making is necessary to improve the safety and security of our corporate and personal environments.
The first study, concerning near-misses, was published in Management Science[1] and discussed in the McDonough School of Business (Georgetown) Magazine[2]. Two professors, Robin Dillon-Merrill and Catherine Tinsley, from Georgetown’s McDonough School of Business looked at the impact of near-misses on how we make decisions using perceived risk rather than calculated or statistical risk. They examined how near-misses in the American Space Shuttle Program led to the Columbia catastrophe. Foam insulation had fallen during previous shuttle liftoffs; some had struck the heat tiles but caused no major damage. The risk that foam insulation could damage the tiles had been calculated, but the experience with near-misses affected the decisions that led to the Space Shuttle Columbia catastrophe in 2003, in which the foam did cause major damage to the tiles. Basically, the professors’ research, and subsequent experiments with people playing a simulation containing near-misses, showed that when people have a near-miss, they see it as successfully avoiding the adverse impact rather than as a near failure that could happen again with an adverse outcome. Surprisingly, the researchers found that participants in the experiments did not actually believe there was a reduction in the calculated or statistical probability of the adverse impact. It was their perception of risk that was lowered; this was an emotional, not a rational, reaction to the near-misses.
In the article Masters of Disasters[3], two professors at Wharton’s Risk Management and Decision Processes Center describe a computer simulation in which participants are told that they have a house and a bank account with $20,000 earning 10% interest, and are warned that an earthquake could occur at any moment and that 3 to 5 mild-to-severe quakes will happen during the game. Players could spend money on structural improvements to the house or continue to earn interest on the money left in the bank. Although players initially spent some money on improvements, they postponed major improvements, thinking that a severe quake would not happen in the next few minutes of the game. By taking these risks, all players lost everything when a quake did happen. Their initial perception of risk seems to change based on the non-occurrence of the quake and the opportunity to earn interest. The Quake players, who included both students and corporate executives, found “a sense of security from observing the flimsiness of one another’s houses. If everyone around you has a house of straw, having a straw house yourself seems somehow safer.” Does this sound similar to how businesses look at security risks?
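The trade-off the players faced can be sketched in a toy simulation. Only the $20,000 stake, the 10% interest, and the 3-to-5-quakes rule come from the article's description; the number of rounds, the cost of reinforcement, and the total-loss damage rule are assumptions made for illustration.

```python
# Toy version of the "Quake" trade-off: earn interest or pay to reinforce
# the house before a quake hits. Round count, reinforcement cost, and the
# total-loss damage rule are invented; only the $20,000 stake, ~10%
# interest, and 3-5 quakes per game come from the article's description.

import random

random.seed(1)
ROUNDS = 50
QUAKE_ROUNDS = set(random.sample(range(ROUNDS), random.randint(3, 5)))
REINFORCEMENT_COST = 5_000
RATE_PER_ROUND = 0.10 / ROUNDS   # ~10% interest spread across the game

def play(reinforce_at: int) -> float:
    """Final balance for a player who reinforces at the given round."""
    balance, reinforced = 20_000.0, False
    for r in range(ROUNDS):
        if not reinforced and r >= reinforce_at and balance >= REINFORCEMENT_COST:
            balance -= REINFORCEMENT_COST
            reinforced = True
        balance *= 1 + RATE_PER_ROUND
        if r in QUAKE_ROUNDS and not reinforced:
            return 0.0   # unreinforced house in a quake: total loss
    return balance

print(f"reinforce immediately: ${play(0):>9,.2f}")
print(f"postpone to round 40:  ${play(40):>9,.2f}")
# With 3-5 quakes over 50 rounds, a quake almost always strikes before
# round 40, so the postponer typically ends with nothing.
```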
The authors of the near-miss study recommend treating the near-miss events not as successes but as failures using counterfactual thinking; that is, imagining how an outcome could have turned out differently, and how the antecedents that led to the event might have been different. Just as we try to learn from our mistakes and failures, we need to learn from the near-misses before they become failures. We should recognize that our experience with near-misses usually causes us to reduce our perception of risk. When we combine this tendency with our preferences for short term gain at the expense of longer term impacts, we can see how a range of risk-based decisions are more emotional than rational. Recognizing this tendency in ourselves and others can help us make better and more rational risk-based decisions that affect safety and security.
Last week’s quiz question
What fact about South Hall makes it unique among Norwich University buildings?
Answer: It is LEED certified as a “green” building.
Winner: William Westwater
This week’s quiz question
In what year did the current Norwich University library open?
Past winners |
Andrey N. Chernyaev: 5 wins |
Matt Bambrick: 3 wins |
Dianne Tarpy: 2 wins |
Bill Lampe: 2 wins |
Scott Madden: 2 wins |
Sam Moore |
Autumn Crossett |
Gil Varney, Jr. |
Glen Calvo |
Thomas Reardon |
Sherryl Fraser |
Srinivas Chandrasekar |
Marc Ariano |
Linda Rosa |
Joanna D'Aquanni |
Srinivas Bedre |
Christian Sandy |
Joseph Puchalski |
Ken Desforges |
William Westwater |
Monday, September 27, 2010
Health Providers Beware of the New HITECH Act
Tim Trow, MSIA student
The Health Information Technology for Economic and Clinical Health Act, more commonly known as the HITECH Act, is part of the American Recovery and Reinvestment Act of 2009. The act puts some teeth into the HIPAA regulation of 1996. The HITECH Act provides general and specific incentives for health organizations to adopt electronic health record (EHR) systems. With these incentives come increased privacy and security protections for consumers and potentially increased liability for those that are not in compliance.
There are three main components to the new HITECH Act. They include:
1. Enforcement: Civil penalties have been increased under the new act. These penalties can exceed $250,000, with repeated violations extending to $1.5 million. The new act also allows a state attorney general to bring an action on behalf of his or her residents. In addition, HHS is now required to conduct periodic audits of covered entities and business associates.
2. Notification of breach: HITECH imposes data breach notification requirements for unauthorized uses and disclosures of PHI, similar to existing state data breach laws. These requirements show how seriously the new act treats the protection of PHI and the reporting of known breaches.
3. Business associates: Under the HITECH Act, business associates are now directly "on the compliance hook," since they are required to comply with the safeguards contained in the Security Rule. Most software vendors providing EHR systems will likely qualify as business associates.
Companies and health providers should take a serious look at their current status with regard to HIPAA and, more specifically, the new HITECH Act. There are real incentives for health organizations that decide to comply with the new HITECH Act. Health providers can start by performing a gap assessment of their current environment against the HIPAA regulations and the HITECH Act. A gap assessment will provide a roadmap for addressing any deficiencies and should also include an evaluation of the current information security program, which should address the three key components outlined above. A third-party business associate program should be established to address and manage key business partners. In addition, a formal data breach policy and process needs to be developed and supported by the organization’s leadership team. Lastly, legal and executive management need to understand the consequences and risks associated with not complying with HIPAA and the new HITECH Act.
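One practical starting point is to treat the gap assessment as a structured checklist against the Security Rule's administrative, physical, and technical safeguard categories. The sketch below is only an illustration; the individual items and their statuses are placeholders, not an authoritative HIPAA/HITECH checklist.

```python
# Minimal gap-assessment sketch. The three safeguard categories follow
# the HIPAA Security Rule's broad groupings; the individual items and
# statuses are placeholders, not an authoritative checklist.

controls = {
    "Administrative safeguards": {
        "Risk analysis performed": True,
        "Workforce security training": False,
        "Business associate agreements updated for HITECH": False,
    },
    "Physical safeguards": {
        "Facility access controls": True,
        "Workstation security policy": True,
    },
    "Technical safeguards": {
        "Access controls / unique user IDs": True,
        "PHI encrypted at rest and in transit": False,
        "Audit logging enabled": True,
    },
}

print("Gap assessment roadmap (deficiencies to remediate):")
for category, items in controls.items():
    for name, in_place in items.items():
        if not in_place:
            print(f"  [{category}] {name}")
```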
Last week’s Quiz Question
Question: What is the statue of on the top of the Vermont state capitol dome?
Answer: Agriculture (or Ceres)
Winner: Scott Madden
This week’s quiz question
What fact about South Hall makes it unique among Norwich University buildings?
Past winners |
Andrey N. Chernyaev: 5 wins |
Matt Bambrick: 3 wins |
Dianne Tarpy: 2 wins |
Bill Lampe: 2 wins |
Scott Madden: 2 wins |
Sam Moore |
Autumn Crossett |
Gil Varney, Jr. |
Glen Calvo |
Thomas Reardon |
Sherryl Fraser |
Srinivas Chandrasekar |
Marc Ariano |
Linda Rosa |
Joanna D'Aquanni |
Srinivas Bedre |
Christian Sandy |
Joseph Puchalski |
Ken Desforges |