Tuesday, June 23, 2009

INFOSEC Program - My academia overview presentation

Information Security Program high-level slides
A repeatable process, customizable and adaptable for any organization.

Friday, June 19, 2009

Enterprise Risk Conference - campIT

The conference included presentations and panel discussions, as well as topics volunteered by the audience: a framework for a repeatable risk assessment, how to involve the business and gain executive-level understanding, endpoint security for wireless, distributing the load under budget cuts, a governance model for best practices, collaboration on security risks, and data ownership and classification.


Some key takeaways included:
  • The malware and threat landscape has grown exponentially, particularly over the last couple of years; risk grows proportionally, requiring ever more resources, time, and effort just to combat it reactively
  • Five steps to managing IT risk: risk awareness, business impact quantification, solution design, IT and business alignment, and then building and managing the solution
  • The basis for information security is policy documentation and implementation that leverages the organization’s culture, identifies stakeholders, draws on regulatory and compliance standards, and partners with internal and external auditors
  • Metrics that matter rest on automated, repeatable, (re)producible measurements with identified owners (see the sketch after this list). With those in place, smarter decisions can be made, regulatory readiness can be demonstrated, the effectiveness of risk management can be measured, and deviations and weaknesses become visible. The consensus was that free tools such as Splunk can be your jumping-off point
  • Cloud computing does have advantages in certain situations, such as anti-virus solutions that leverage the broader community’s identification of and response to security vulnerabilities; but a SIM/SIEM solution would not be preferred, since you would be pushing very large volumes of log data into the cloud. And cost reduction is not always realized once you factor in security control of, and visibility into, your information and traffic
  • Security gaps made known to auditors can advance any agenda; and partnership with the business, and finance departments in particular, will propel the risk management framework and practice [getting them to really understand risk in dollars and productivity means funding]
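The metrics point begs a small, concrete illustration. Below is a minimal sketch, in Python and entirely my own construction rather than anything presented at the conference, of the kind of automated, repeatable metric the panel had in mind: failed logins per source, parsed from a syslog-style auth log. The log path and line format are assumptions for illustration.

    # Minimal sketch of an automated, repeatable security metric:
    # failed SSH logins per source IP, from a syslog-style auth log.
    # The log path and line format are assumptions for illustration only.
    import re
    from collections import Counter

    LOG_PATH = "/var/log/auth.log"  # hypothetical location
    FAILED = re.compile(r"Failed password for (?:invalid user )?\S+ from (\S+)")

    def failed_logins_by_source(path=LOG_PATH):
        counts = Counter()
        with open(path) as log:
            for line in log:
                match = FAILED.search(line)
                if match:
                    counts[match.group(1)] += 1  # key on the source IP
        return counts

    if __name__ == "__main__":
        for source, count in failed_logins_by_source().most_common(10):
            print(source, count)

Run on a schedule and trended over time, even something this crude yields an owner-assignable, reproducible number, which is exactly the property the panel was after; tools like Splunk simply do this at scale.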
You’ve just been enlightened... at least to a small degree, so future posts will take a deeper dive into some of these key topics. But contemplate this security analogy [as mentioned in the presentation]: how good would your security be if a single member did not adhere to policies?
I caught on to an analogy that would be perfect for a security awareness article. A group of individuals sits in a boat, and one person decides to drill a hole under his own seat, allowing water to start seeping in. While this individual may be working within his own confines, the end result is clear: the boat sinks with everyone on board. Of course, the real world is just that simple (to recognize, or for that matter to identify and remediate), right? So awareness, business integration, and appropriate sponsorship are the trifecta of any successful security practice and implementation (akin to the well-known people, process, and technology).

To conclude, the vendors/sponsors provided trinkets as usual, along with noteworthy conversation from colleagues who run the same circles (yet in varying sectors). But I must note one item that I found contradictory to my own experience (at least in today's environment): the BITS Shared Assessments movement and its current industry acceptance. While the overall premise is sound, I disagree that an organization that has completed the exercise could merely present the relevant sections in response to an external audit and have that suffice. Has the industry really accepted a single version of a questionnaire? What about customized business and technical parameters and controls that need to be asked and answered? While I would assert that providing a SAS 70 Type II saves you some effort in having to answer all components, I doubt the industry as a whole is ready for a single, all-encompassing questionnaire. I can't wait for that time (but how would risk assessors make that extra cash?).

Tuesday, June 16, 2009

Mobile phone privacy

Coming June 18th, Connectivity will be offering a directory service listing 15 million UK mobile phone numbers (nearly a third of the population). Apparently these numbers were freely obtained during the normal course of business, or through surveys in which folks provided their mobile numbers. For a $1.20 charge, you can use a voice-automated service that will connect you to one of these mobile devices when you supply a name and town, with an option to leave a voice mail if the call does not connect (though the company claims the actual number will not be supplied).

Of course, a corresponding SMS web service will be offered as well, with a callback for matches.
Oh, but there is an opt-out option, though it may take up to a month to take effect. Is anything safe from spam or unwanted calls anymore? I wonder how they will ensure that teenagers and minors using mobile devices are protected.
A former consultant to Connectivity, now with Privacy International, has expressed concern about the way in which these numbers were collected and the way they are now destined to be used.

The point being: don’t provide any more information than you absolutely have to, even if they say it’s “required”, and certainly don’t volunteer any data in surveys or otherwise.

This is sure to add to concerns about the already threatened and often violated Data Protection Act... some figures show that about 18% of companies are not sure whether they have illegally released personal information to third parties and/or failed to store information securely.

The 8 principles of the DPA, from http://www.ee.ic.ac.uk/dpa/principles.html:
When processing personal information, the following 8 principles must be complied with; data must:
1. be obtained and processed fairly and lawfully, and shall not be processed unless certain conditions are met
2. be obtained for a specified and lawful purpose, and shall not be processed in any manner incompatible with that purpose
3. be adequate, relevant and not excessive for those purposes
4. be accurate and kept up to date
5. not be kept for longer than is necessary for that purpose
6. be processed in accordance with the rights of data subjects under the Act
7. be kept safe from unauthorised access, accidental loss or destruction
8. not be transferred to a country outside the European Economic Area, unless that country has equivalent levels of protection for personal data

So, here’s a site that claims to handle the opt-outs: the Telephone Preference Service (TPS)

Monday, June 8, 2009

T-mobile hacked again?

Apparently an anonymous hacker posted information related to T-Mobile servers on Saturday, claimed to have confidential customer information, financial records, and proprietary operating data, and then set out to sell the information to the highest bidder. The wireless giant is once again in the news: in 2005, Nicholas Jacobsen was charged with unauthorized access to its network after a U.S. Secret Service agent uncovered that he had data on 16 million U.S. subscribers. He later pleaded guilty to a single felony charge of intentionally accessing a protected computer and recklessly causing damage (spanning some two years). Since then, in 2006 the SSNs of about 45,000 customers were lost, and in 2008 a disc containing data on 17 million customers went missing.


Coincidentally, Deutsche Telekom, the T-Mobile parent, wants to dump the business based on its recent profit warning, citing a 21% drop in UK revenue and weakness in the U.S. But then again, they’re preparing for 4G with download capability of 7.2 Mbps, so you can hack at light speed ;)
Related incidents? Is the hack just a hoax? Perhaps, but what brand damage has already occurred? What compliance or regulatory requirements have they failed to meet; or, better yet, were they compliant and still breached? Do you think breached customers will get their notifications any faster now, as compensation for their lost personal data?
Another story sure to unfold.

Wednesday, June 3, 2009

PCI lawsuit

In the case Merrick Bank Corporation v. Savvis, Inc., the bank is taking on the QSA firm who certified the processor, CardSystems Solutions Inc.


It was only a matter of time... about 3 years ago, large CPA firms (including the practice I once led) dropped out of this boutique certification service offering. Call it a smart move, or just a keen sense of detailed analysis: how do you really attest that all the systems holding card numbers are compliant, even at a point in time, when you could have hundreds of POS (point-of-sale) terminals, backend servers, and networks internetworked? We all have SAS 70s, for what they’re worth, but a SAS 70 is an assurance over controls chosen by the company being audited and tested as prescribed. And yes, auditing is about sampling, and due diligence as defined in Sarbanes-Oxley; but PCI pledges certification and then posts the list of companies on a website.

The past meets the present: in 2005, 40 million credit cards of all brands were exposed through the payment-card processor as a result of a vulnerability in the processor’s card systems, resulting in, among many other things, a ballpark figure of $16 million in costs incurred by Merrick Bank (an acquiring bank for about 125,000 merchants). So, four years later, Merrick [and I must add they have always been on top of the PCI requirements and due diligence, from a best-practices as well as a contractual-obligation standpoint] has filed a lawsuit against Savvis for negligence regarding its audit of CardSystems, which had been certified under Visa's Cardholder Information Security Program (CISP), the predecessor to the DSS and ROC as we know them today.

Of course, proof means everything and good lawyers can be very convincing, but the certification process is sure to be analyzed, along with the jurisdiction, intent, and actual Negligence and Negligent Misrepresentation (Counts 1 and 2, respectively) at the time of the incident.

I talked about downstream impact before so what does this actually mean to all of us?
  • Immediate extensive scrutiny and testing by QSAs prior to issuing ROCs
  • More man hours (and additional charges) for any PCI assessment and audit leading to certification
  • But the real news is that perhaps PCI will issue more aggressive standards, not just clarifications still subject to QSA interpretation
Industry standards, certifications, and regulatory/governmental compliance are headed toward accountability, not just for the company but for the individuals asserting compliance. Executives already carry it for financial reporting, and HIPAA now extends it to Business Associates; but will auditors be assigned responsibility (when they are merely testing the controls)?

Definitely a story to follow from the U.S. District Court for the Eastern District of Missouri, one that will continue to evolve the synergy between information security and the law... and further support company spending on IT for the "sake" of PCI.

Monday, June 1, 2009

Web Application Firewall (layered security) optional requirement

A spin-off of my most recent post...

I think we can all conclude that (all things considered) code review is the best method, whether static analysis at compile time, dynamic analysis at run time, or both. But what if an immediate fix (without necessarily monkeying with the code) is required, with the long-term strategy still in mind? That’s when a WAF might be most appropriate. The trade-off lies in deciding when the WAF can actually be put into BLOCKING mode, once all known vulnerabilities and fixes have been applied, so that the application experiences no noticeable business interruption.

Alright, I’m not dispelling the importance, benefit, or necessity of code review at all, but let’s take a quick peek solely at WAFs and Dynamic Profiling technology. The theory is a behavior-based approach, unlike the static, manual fixes of code review under normal application cycles. While developers can write code to protect against known vulnerabilities and present-day attack methods, that will not necessarily secure the application against tomorrow’s threats. Gee whiz, am I going off on an IDS/IPS and vulnerability/PEN-testing tangent?
The premise is alike in that understanding the attack path and source leads to better code review, or better detection.
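To make the dynamic-profiling premise less abstract, here is a toy sketch, my own construction and not any vendor’s actual algorithm: during a trusted learning period, record what a request parameter normally looks like, then flag values that deviate.

    # Toy illustration of behavior-based (dynamic) profiling:
    # learn what a parameter "normally" looks like, then flag deviants.
    # My own construction for illustration; not any vendor's algorithm.

    class ParamProfile:
        def __init__(self):
            self.max_len = 0
            self.seen_chars = set()

        def learn(self, value):
            """Update the profile during a trusted learning period."""
            self.max_len = max(self.max_len, len(value))
            self.seen_chars.update(value)

        def is_anomalous(self, value):
            """Flag values longer than anything seen during learning,
            or containing characters never observed then."""
            return len(value) > self.max_len or not set(value) <= self.seen_chars

    profile = ParamProfile()
    for normal in ["alice", "bob42", "carol_smith"]:  # learning phase
        profile.learn(normal)

    print(profile.is_anomalous("bob_smith"))      # False: fits the profile
    print(profile.is_anomalous("' OR '1'='1"))    # True: never-seen characters

Real products do this statistically over thousands of requests, but the premise is the same: detect deviation from learned normal rather than match known-bad signatures.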

As it turns out, selecting a WAF is much like selecting an IPS vendor (for in-line as well as out-of-band monitoring), yet less like selecting a firewall product. So, here’s what you need to know:
Throughput and latency [actually how much, not if] top the list of requirements; but deployment flexibility, including software- or appliance-based options and support for virtual hosting environments, is particularly important. WAFs run in the following modes (see the sketch after this list for the most common one):
Passive – just listening
Bridge – sends TCP resets when malicious activity is detected
Router – acts as a routing hop; not really recommended as a routing function
Reverse-proxy – most common, operating at Layers 4-7
Embedded – within the application (typically for very small, non-complex deployments)
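Since reverse-proxy is the most common mode, here is a bare-bones sketch of one in Python, a deliberately naive stand-in for a real product, that inspects each request and either just logs it (monitoring) or rejects it (blocking), which is exactly the trade-off described at the top of this post. The backend address and the single rule are made-up assumptions.

    # Bare-bones reverse-proxy WAF sketch: inspect, then forward or block.
    # Deliberately naive; the backend address and single rule are made up.
    import re
    import urllib.request
    from http.server import BaseHTTPRequestHandler, HTTPServer

    BACKEND = "http://127.0.0.1:8080"  # hypothetical application server
    BLOCKING = False                   # start in monitor mode; flip when tuned
    SQLI = re.compile(r"('|--|\bunion\b|\bor\b\s+1=1)", re.IGNORECASE)

    class WafProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            if SQLI.search(self.path):
                self.log_message("suspicious request: %s", self.path)
                if BLOCKING:
                    self.send_error(403, "Request blocked by WAF")
                    return
            # Forward the (clean, or merely logged) request upstream.
            with urllib.request.urlopen(BACKEND + self.path) as upstream:
                status = upstream.status
                body = upstream.read()
            self.send_response(status)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), WafProxy).serve_forever()

Notice how unattractive BLOCKING = True is until the rules have been tuned against real traffic; that is the business-interruption trade-off in a nutshell.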

So, look into SSL accelerators, web-cache integration features, and HTML compression, beyond sound-but-simple policy rules.
But the sweet spot of attack characteristics that any WAF should be able to tackle includes: input validation and invalid requests, injection flaws, buffer overflows, cross-site scripting (GET and POST), broken authentication, cookie poisoning, forceful browsing, parameter tampering, and of course SQL injection.
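Cookie poisoning, to pick one from that list, is easy to illustrate. A common countermeasure (a general technique, not quoted from any WAF vendor) is to sign cookie values with an HMAC so tampering becomes detectable; here is a minimal sketch, with an obviously placeholder key:

    # Detecting cookie poisoning by signing values with an HMAC.
    # General technique shown for illustration; the key is a placeholder.
    import hashlib
    import hmac

    SECRET_KEY = b"replace-with-a-real-secret"  # placeholder, not for real use

    def sign_cookie(value):
        mac = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
        return value + "|" + mac

    def verify_cookie(cookie):
        """Return the value if the signature checks out, else None."""
        value, _, mac = cookie.rpartition("|")
        expected = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
        return value if hmac.compare_digest(mac, expected) else None

    cookie = sign_cookie("role=user")
    print(verify_cookie(cookie))                           # role=user
    print(verify_cookie(cookie.replace("user", "admin")))  # None: poisoned

The same verify-before-trust idea generalizes to parameter tampering and forceful browsing: never accept client-supplied state at face value.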

SSL termination and decryption is an option often enabled when the WAF is asked to inspect packets encrypted over HTTPS/SSL (hopefully in limited cases), and it is particularly significant there. Remember, a good WAF architecture will also consider both REQUEST and REPLY vulnerabilities.

In conclusion, a WAF is complementary to all other application security practices and processes... PCI 6.6 actually positions the two [WAF or code review] as alternatives for a single solution when, in fact, layered defense would tell you otherwise. So, I would speculate that once the industry adopts one of the solutions, it won't be long until the other is also a requirement (not just a best practice). Oh, and if you thought an IDS/IPS implementation was a cakewalk, you're in for a treat, but do push through.