Use this discussion area to explore topics around ILM2.0-based practices and to post blog entries, news, opinions, or questions for discussion. Use the "Ask the Experts" discussion board to explore problems and solutions.
"ITIL v3 or ITSM don't go far enough.
They don't give us a practical methodology for implementation.
Service management with ILM2.0 based practices fills that gap. "
Source: Bob Rogers, 2010
10 November, 2010
By Mark Cox
Security must evolve to support organizations' transition from virtualized data centers to private cloud computing infrastructures, according to analyst firm Gartner. While the fundamental principles of information security will remain the same, the firm says the way in which organizations provision and deliver security services must change. Gartner predicts that by 2015, 40 percent of the security controls used within enterprise data centers will be virtualized, up from less than 5 percent in 2010.
"For most organizations, virtualization will provide the foundation and the steppingstone for the evolution to private cloud computing," said Thomas Bittman, vice president and distinguished analyst at Gartner. "However, the need for security must not be overlooked or 'bolted on' later during the transition to private cloud computing."
Bittman explained that whether securing physical data centers, virtualized data centers or private clouds, the fundamental tenets of information security - ensuring the confidentiality, integrity, authenticity, access, and audit of our information and workloads - don't change. There will, however, be significant changes required in how security is delivered. Whether supporting private cloud computing, public cloud computing, or both, security must become adaptive to support a model where workloads are decoupled from the physical hardware underneath and dynamically allocated to a fabric of computing resources.
"Policies tied to physical attributes, such as the server, Internet Protocol (IP) address, Media Access Control (MAC) address or where physical host separation is used to provide isolation, break down with private cloud computing," said Neil MacDonald, vice president and Gartner Fellow. "For many organizations, the virtualization of security controls will provide the foundation to secure private cloud infrastructures, but alone, it will not be enough to create a secure private cloud."
To support secure private cloud computing, security must have the following characteristics: it must be an integral, but separately configurable, part of the private cloud fabric, designed as a set of on-demand, elastic, and programmable services, configured by policies tied to logical attributes to create adaptive trust zones capable of separating multiple tenants. These, MacDonald explained, are the six necessary attributes of private cloud security infrastructure:
A Set of On-Demand and Elastic Services
Rather than security being delivered as a set of siloed security product offerings embodied within physical appliances, it needs to be delivered as a set of services available 'on demand' to protect workloads and information when and where they are needed. These services need to be integrated into the private cloud provisioning and management processes, and be made available to any type of workload -- server or desktop. As workloads are provisioned, moved, modified, cloned and ultimately retired, the appropriate security policy would be associated with the workload throughout its life cycle.
Programmable Infrastructure
The security infrastructure that supplies the security services must become 'programmable' -- meaning that the services are exposed for programmatic access. By definition, private and public cloud-computing infrastructure is consumable using Internet-based standards. In the case of programmable security infrastructure, the services are typically exposed using RESTful (Representational State Transfer) APIs, which are programming-language and framework independent. By exposing security services via APIs, the security policy enforcement point infrastructure becomes programmable from policy administration and policy decision points. This shift will enable information security professionals to focus their attention on managing policies, not programming infrastructure.
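To make "programmable" concrete, here is a minimal sketch of a policy administration point pushing a rule set to an enforcement point over REST. The host, path, payload schema, and tag names are all invented for illustration; real products define their own APIs.

```python
# Hypothetical sketch of a policy administration point pushing a rule set
# to an enforcement point over REST. The host, path, and payload schema
# are invented for illustration; real products define their own APIs.
import json
import urllib.request

POLICY_API = "https://policy-admin.example.com/api/v1/policies"  # assumed

policy = {
    "name": "pci-web-tier",                 # invented policy name
    "applies_to": {"workload_tag": "pci"},  # logical attribute, not IP/MAC
    "rules": [
        {"action": "allow", "port": 443,   "from_zone": "dmz"},
        {"action": "deny",  "port": "any", "from_zone": "untrusted"},
    ],
}

req = urllib.request.Request(
    POLICY_API,
    data=json.dumps(policy).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:  # would need a real endpoint
    print("Policy accepted:", resp.status)
```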
Policies That Are Based on Logical, Not Physical, Attributes and Are Capable of Incorporating Runtime Context Into Real-Time Security Decisions
The nature of the security policies that drive the automated configuration of the programmable infrastructure needs to change as well. As organizations move to virtualized data centers and then to private cloud infrastructure, increasingly, security policies need to be tied to logical, not physical, attributes. The decoupling and abstraction of the entire IT stack and movement to private and public cloud-computing models means that workloads and information will no longer be tied to specific devices, fixed IP or MAC addresses, breaking static security policies based on physical attributes. To enable faster and more-accurate assessments of whether a given action should be allowed or denied, more real-time context information must also be incorporated at the time a security decision is made.
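As a sketch of what such a policy might look like (all attribute and field names below are invented), the decision keys on logical attributes and runtime context rather than on an IP or MAC address:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    role: str            # logical attribute, e.g. "db-server"
    data_class: str      # e.g. "regulated" or "public"

@dataclass
class RuntimeContext:
    fully_patched: bool  # security posture checked at decision time
    geo: str             # where the fabric has placed the workload

def allow_attach_to_zone(w: Workload, ctx: RuntimeContext, zone: str) -> bool:
    """Decide zone membership from logical attributes plus runtime context."""
    if w.data_class == "regulated" and not ctx.fully_patched:
        return False  # runtime context can veto in real time
    if zone == "pci":
        return w.role in {"db-server", "app-server"} and ctx.geo == "us"
    return True

# The policy never mentions an IP or MAC address, so it survives live
# migration and re-provisioning across the cloud fabric.
print(allow_attach_to_zone(Workload("db-server", "regulated"),
                           RuntimeContext(True, "us"), "pci"))  # True
```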
Adaptive Trust Zones That Are Capable of High-Assurance Separation of Differing Trust Levels
Instead of administering security policies on a VM (virtual machine)-by-VM basis, security policies based on logical attributes will be used to create zones of trust -- logical groups of workloads with similar security requirements and levels of trust. As the policies are linked to groups of VMs and not physical infrastructure, the zones adapt throughout the life cycle of the VM as individual VMs move and as new workloads are introduced and assigned to the trust zone. Private cloud infrastructure will require security services that are designed to provide high-assurance separation of workloads of different trust levels as a core capability. Gartner estimates that by 2015, 70 percent of organizations will allow server workloads of different trust levels to share the same physical hardware within their own data center, except where explicitly prohibited by a regulatory or auditor compliance concern.
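A rough illustration of the grouping idea, using invented tags: zone membership is recomputed from a logical "trust" attribute, so cloning or migrating a VM never requires editing per-VM policy.

```python
from collections import defaultdict

# Invented inventory: each VM carries logical tags, not fixed addresses.
vms = [
    {"id": "vm-01", "trust": "pci",      "app": "payments"},
    {"id": "vm-02", "trust": "internal", "app": "wiki"},
    {"id": "vm-03", "trust": "pci",      "app": "payments"},
]

def build_trust_zones(inventory):
    """Group workloads into trust zones keyed by their 'trust' tag."""
    zones = defaultdict(set)
    for vm in inventory:
        zones[vm["trust"]].add(vm["id"])
    return zones

# Cloning a tagged VM changes nothing in the policy itself: the clone
# inherits the tag and lands in the same trust zone automatically.
vms.append({"id": "vm-04", "trust": "pci", "app": "payments"})
print(sorted(build_trust_zones(vms)["pci"]))  # ['vm-01', 'vm-03', 'vm-04']
```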
Separately Configurable Security Policy Management and Control
Security must not be weakened as it is virtualized and incorporated into cloud-based computing infrastructures. Strong separation of duties and concerns between IT operations and security needs to be enforceable within a private cloud infrastructure, just as within physical infrastructure and virtualized infrastructure today. This separation occurs at multiple levels. If software controls are virtualized, we should not lose the separation of duties we had in the physical world. This requires that virtualization and private cloud-computing platform vendors provide the ability to separate security policy formation and the operation of security VMs from management policy formation and the operation of the other data center VMs.
'Federatable' Security Policy and Identity
Private clouds will be deployed incrementally, not all at once. They will be carved out of existing data centers, where only a portion has been converted to a private cloud model. Ideally, private cloud security infrastructure would be able to exchange and share policies with other data center security infrastructure - virtualized and physical - and security controls placed across physical and virtualized infrastructure would be able to intelligently cooperate for workload inspection. Furthermore, security policies designed to protect workloads on premises would also ideally be able to be federated to public cloud providers. There are currently no established standards for this, although the VMware vCloud API is a start, as is work within the Distributed Management Task Force (DMTF) to extend Open Virtualization Format (OVF) to express security policy.
On Friday, 10 September 2010, the National Coordination Office for the Networking and Information Technology Research and Development (NITRD) Program issued a call for public comment on the draft 2010 Strategic Plan for NITRD. The draft developed by the NITRD agencies responds to a recommendation of the President's Council of Advisors on Science and Technology (PCAST) and is part of the NITRD Program's ongoing strategic planning process.
Archivists and records managers may be interested to note that the plan specifically mentions the need for continued research into records, data and information management and preservation issues. It is an entirely unprecedented development for these issues to be included in the President's strategic plan for NITRD. See for example:
Nor do we yet have a rationalized, robust information infrastructure for the long-term preservation, curation, federation, sustainability, accessibility, and survivability of vital Federal electronic records and data collections, such as those overseen by NARA. (pg. 17)
The plan also specifically highlights some of the collaborative advanced research that NARA's Center for Advanced Systems and Technologies (NCAST) is engaged in. See for example:
Information standards: Data interoperability and integration of distributed data; generalizable ontologies; data format description language (DFDL) for electronic records and data; data structure research for complex digital objects; (pg. 17)
Information management: Intelligent rule-based data management; increasing access to and cost-effective integration and maintenance of complex collections of heterogeneous data; innovative architectures for data-intensive and power-aware computing; scalable technologies; integration of policies (differential sensitivity, security, user authentication) with data; integrated data repositories and computing grids; testbeds; sustainability and validation of complex models; and grid-enabled visualization for petascale collections (pg. 18)
A primary function of cyber infrastructure is to provide for the safe, secure creation, transmission, storage, and retrieval of all kinds of digital information - including sensitive data belonging to individuals, private-sector organizations, and government. Ideally, both the creator and any recipients or viewers of nonpublic digital information should be authorized and should be able to access it securely; identify its origin and history, or provenance; authenticate its integrity (no one has tampered with the content); and maintain its confidentiality as required. (pg. 22)
The draft 2010 Strategic Plan for NITRD and instructions on how to provide comments can be found at this URL:
Comments are due no later than 5 p.m. EDT on 11 October 2010.
Sept. 1: Bob, Paul, and I met today to discuss developing this important paper. The plan of record is as follows:
a] Objectives: Develop an implementation-focused, practice-focused reference model for ILM2.0. It is not intended as a fully detailed implementation methodology, but as a framework that allows customization and hybridization as needed to match each organization's unique requirements. It will be analogous to a reference architecture with supporting practice methodologies, terminology, and design goals.
-- We have redefined ILM2.0 away from the original use of Information Lifecycle Management by the storage industry and aligned it with service management methods such as ITIL and ITSM as well as with definitions and thinking in use by the enterprise information management community.
-- Unlike enterprise information management or even "information asset management" as defined and espoused by the MIKE2.0 Community, we define ILM2.0 as a core component of an enterprise information management practice methodology.
- Original SNIA Definition: "…the policies, processes, practices, services, and tools used to align the business value of information with the most appropriate and cost effective infrastructure from the time information is created through its final disposition."
- New ILM2.0 Definition: "ILM2.0 is a service management methodology that uses the requirements for information to define the service objectives that the datacenter infrastructure is required to support."
Commentary: The ILM2.0 methodology turns the old thinking 180 degrees and, in doing so, solves the underlying complexity problem. Instead of "managing information," instead of "aligning the business value of information," ILM2.0 says: manage the infrastructure and its supporting services based on the requirements for the information and data over which it has stewardship. How else can you scale and automate a datacenter to cost-effectively apply business, compliance, legal, and information policies across trillions of information and data objects distributed within and beyond a global organization?
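As a toy illustration of this inversion (the class names and objective values below are invented, not drawn from the referenced papers), policy attaches to a handful of information classes, and the infrastructure is managed to the service objectives each class demands:

```python
# Illustrative sketch of the ILM2.0 inversion: requirements attached to an
# information class (a "big bucket") drive the service objectives that the
# infrastructure must deliver. Class names and values are invented examples.
INFO_CLASSES = {
    "financial-records": {"retention_years": 7, "rpo_minutes": 15,
                          "encrypted": True,  "tier": "protected"},
    "marketing-drafts":  {"retention_years": 1, "rpo_minutes": 1440,
                          "encrypted": False, "tier": "capacity"},
}

def service_objectives(info_class: str) -> dict:
    """Return the infrastructure service objectives for an information
    class; the datacenter is managed to these, never object by object."""
    return INFO_CLASSES[info_class]

# Billions of objects map onto a handful of classes, so policy is applied
# a few times rather than trillions of times.
print(service_objectives("financial-records"))
```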
-- This definition is espoused in "ILM2.0: The Next Phase" (2010) and in "Building a Terminology Bridge: Guidelines for Digital Information Retention and Preservation Practices in the Datacenter" (2009).
b] Utilize and reference work already developed, specifically, the following documents:
-- "ILM Maturity Model": The maturity model defines the metrics to allow organizations to gauge their internal implementation progress against the reference model. The Maturity Model also provides a simplified view of the functional requirements for a comprehensive implementation
-- "Storage Service Management: The Foundation for Information Lifecycle Management
": This document espouses the following: "Storage assets support business processes; it is the business process that dictates availability, business continuity, performance, security, and all other aspects of how storage is budgeted, provisioned, and used. The principles of information lifecycle management help to define those requirements in terms of the ITIL philosophy: (and across practice areas such as these:)
- Service Level Management
- Business/Service Continuity
- Availability Management
- Performance Management
- Capacity Management
- Financial Management for IT Services
- Incident Management
- Problem Management
- Change Management
- Release Management
- Configuration Management
-- "Building a Terminology Bridge: Guidelines for Digital Information Retention and Preservation Practices in the Datacenter
Please comment and sign up to contribute to this important work.
I came across a very interesting report recently. It was from AIIM Market Intelligence and was titled State of the ECM Industry 2010. AIIM is an industry organization, the Association for Information and Image Management; ECM is Enterprise Content Management.

The report (© AIIM 2010) polls the AIIM membership, so the respondents should all be ECM-aware. The survey results do not necessarily apply across all of IT, but a number of interesting findings came out.

Content management projects are being driven by a need to organize an organization's wide range of data and its repositories. "Content chaos" is the interesting term used. It is not just about regulatory compliance or cost savings. This implies that organizations are realizing that their information has value, and that the value can't be captured if the information can't be found easily. The value can come from improved collaboration and information sharing, not just data mining. This increases employee productivity and makes organizations more efficient.

Compliance is seen mostly as supporting litigation and contracts. This is in addition to regulatory mandates, but really it is another aspect of organizational efficiency. So it is interesting to see how the definition of compliance can vary.

Respondents with full ECM systems lack confidence in their non-email electronic information 11% of the time. Without an ECM system, this rises to 66%. For email, even those with ECM systems lack confidence 49% of the time. So ECM systems give people confidence in their data, but not in the case of email. It would be interesting to speculate about what is tricky with email. Is there just too much of it? You may have x-ray vision to search for needles, but is the haystack just too big?

Where there isn't a single ECM system for the organization, portals can provide a single point of access for end users. For 32% of the respondents, this was good enough. So many organizations could be happy with point content management systems behind a portal.

SharePoint is popular. Only 28% of the respondents don't have SharePoint or plan to implement it. Generally it is viewed as complementary to an ECM system, with 16% using SharePoint as their ECM system or in competition with an ECM system. So ECM systems need to be implemented with a plan for the information residing on SharePoint.

Many organizations don't allow social media or instant messaging use. Of those that do, 80% aren't archiving it. This creates a big gap in an organization's archived information.

Cloud storage for ECM systems remains problematical. Less than 10% of the respondents use or plan to use a private cloud. Less than 5% use or plan to use a public cloud. Clearly much needs to be done to get organizations comfortable with cloud-storage-based ECM.

There are many other pieces of information in the report. Mine it for what interests you.
Comment on my statement about managing information in a complex enterprise datacenter environment.
"Traditional approaches to information management, risk management, and GRC are too costly, complex, human dependent, and in the end work against methods used to solve large-scale complexity problems. In 2003, when we first launched ILM-based practice methods, I defined a complexity rule that states "If you want to solve a complexity problem, stop doing it." and its first corollary, "Automating a bad practice is still a bad practice." These declarations are more true than ever today. Stop trying to manage information and instead, manage the datacenter and its infrastructure based on the businesses requirements for its information and data assets.
To succeed in information management and governance, you need to implement an ILM2.0 framework!"
DOLPHIN WHITE PAPER: INFORMATION LIFECYCLE MANAGEMENT
Where data archiving is about performance, Information Lifecycle Management is about compliance. The distinction isn’t a break with the past – it’s an evolution.
Data archiving strategies are designed to improve performance and manage costs, moving static data from the online database to offline archives. Information Lifecycle Management (ILM) adds a new level to the performance‐cost dynamic: manage data in compliance with outside and corporate retention rules and business requirements; maintain ready access to data and documents; protect the organization from legal and business risk.
SAP calls Information Lifecycle Management the combination of “policies, processes, practices, and technologies used to align the business value of information with the most appropriate and effective IT infrastructure from the time information is conceived through its final disposition.” At its core, ILM is the practice of thinking and acting strategically about how data is managed in the organization: where it is stored, how quickly it can be accessed, how it is tracked, and how long it is retained.
The rising adoption of information lifecycle management for the mass of documents and records generated by a business is being driven by a range of business priorities, primarily:
1. Compliance: Ensuring regulatory compliance through management of the information lifecycle, from creation to record retention and finally, destruction.
2. Performance: Online database growth slows enterprise network and database performance. ILM strategies create a process for moving static ‘business complete’ records from the database to the archive -- freeing online disk space while maintaining seamless archive access for users.
3. Cost: SAP users typically experience database growth of 20%-30% per year (a quick sketch of the growth arithmetic follows this list). Data archiving strategies, in concert with ILM practices, move data from online storage to lower-cost archival systems, slowing the growth of infrastructure maintenance and reducing the total cost of ownership for data management.
4. Preparation: An ILM strategy enables the organization to plan for changes, including system upgrades and mergers and acquisitions, and to manage challenges like legacy system decommissioning and legacy data access.
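Here is the growth arithmetic from point 3 sketched in code. The 25% rate is drawn from the range quoted above; the starting size and the archive rate are illustrative assumptions, not figures from the paper.

```python
# Back-of-the-envelope growth arithmetic. The 25% rate is from the range
# quoted above; the 1,000 GB starting size and the 15% annual archive
# rate are illustrative assumptions, not figures from the paper.
size_plain = size_archived = 1000.0  # starting online size in GB (assumed)
GROWTH, ARCHIVE_RATE = 0.25, 0.15

for year in range(1, 6):
    size_plain *= 1 + GROWTH
    size_archived = size_archived * (1 + GROWTH) * (1 - ARCHIVE_RATE)
    print(f"Year {year}: no archiving {size_plain:7.0f} GB, "
          f"with archiving {size_archived:7.0f} GB")

# Without archiving the online database roughly triples in five years
# (1.25**5 ~= 3.05); with modest annual archiving it grows only ~35%.
```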
Here are my thoughts:
a] I think it completely confuses ILM with archiving. What it calls archiving is really tiering. Not one iota of this is about 'archiving' as properly defined.
b] The SAP credit really belongs to the SNIA (circa 2003). SAP, while never an active member of the ILM Initiative within SNIA, certainly monitored and adopted the concepts and principles we were espousing - which is great.
c] The attempt to explain what "ILM really is" is a good effort and to be applauded. However, it is yet another misleading platitude about information management. It promotes an approach that misses the fundamental principle we are trying to teach in ILM2.0 -- managing the infrastructure to meet the business requirements for the information, not managing the information itself. We need to invert traditional thinking and stop trying to 'manage information'. Here's the point: if I said you "can't really manage information in a large enterprise," would that get the idea across? Complexity, cost, security, training, risk management, and so on overwhelm our ability to manage information. Where we have to go is to automate practices that drive out complexity and cost. Doing that takes a big-bucket approach and a service-management-style process that begins with classification and requirements setting, not with generalities.