Agile Big Data and Many-Particle approach change Marketing and Sales effectiveness

Big data projects have a broad impact on organizations. Big Data implementation goes beyond what could normally be considered a new way to conduct data management to business alignment. With Big Data, the path from data sources to data intelligence changes drastically: the way data intelligence is designed and implemented definitively changes how data is accessed, ingested, distilled, processed and visualized. Big data projects meet agile implementation, shortening the data intelligence lifecycle by increasing service capability and adequacy to fast-growing datasets and fast-moving business. Accordingly, agile practice and the many-particle approach minimize data entropy together with data access time cycles everywhere, preserve data security and enhance user experience for instant business realignment.

Contents
Introduction
Data Topology and Agile Big Data
The Many-Particle approach
Conclusion
Acknowledgment
References

Introduction
The way to move from today's business data to Big Data intelligence can be a costly and time-consuming process that diminishes the tremendous advantage of the Big Data and Cloud paradigms. Today, information is still misaligned with the business despite the huge efforts of past business intelligence projects: companies still use only partial quantities of their real corporate data heritage. As a consequence, the data spectrum exploited is unpredictable, and aligning data and business is a long-term process. Agile Big Data aligns the data heritage and business data instantly. Continuous data ingestion and distillation drastically reduce the ETL process, so that intelligence runs on the “big data-lake” when needed. On-premise big data topology and functional data intelligence then play a crucial role in meeting profitability, customer affinity and fast-moving business goals. This paper introduces the business case for Big Data to avoid Marketing and Sales data entropy, reduce risks and increase the likelihood of an aware and successful Big Data implementation.

Data Topology and Agile Big Data
Documenting data evolution and updates has long been considered good practice in managing data. In the early days of the cloud paradigm, driven by the attraction of cost cutting, keeping a map of the company's data heritage became a great benefit, especially when services had to be subscribed in the cloud. Data models, a way to document the data heritage, evolved into MaaS (Model as a Service), which supports agile design and delivery of data services in the Cloud and makes the difference when planning a Big Data implementation project.

Considering data models doesn't mean structured data only. On-premise models map data coming from structured, semi-structured and unstructured sources. Data models map the defined services topology, whether services will be moved on-premise or into the cloud. Data models are still needed for early exploration analysis and for the “ab-initio” parameters that classify services and define their boundaries (personal cloud, financial parameters or healthcare positions, for example); data models (on SQL, NoSQL, vector or graph structures) essentially do not address the meaning of the data but identify the service classes before the data-lake is created. Of course, unusable data and unstructured or denormalized raw datasources converge into the data-lake as well. The more aware the on-premise topology is, the more secure and localizable big data usage becomes, both on-premise and in the Cloud. Further, the agile MaaS approach reveals the business processes affected, the operating requirements and the stakeholders.

Fig. 1 – Corporate Data-Lake and Agile Big Data approach

Accordingly, agile Big Data practice sets the link between on-premise data topologies and data intelligence, whether on-premise or in the cloud. Topology leverages the company's service assets toward specific business objectives and will determine the successful user experience requirements and the proper rapid realignment with respect to competitors.

This means that two crucial aspects have to be taken care of:

  • Data is the “compass” for understanding service capacity, stakeholders and the culture of the organization: big data agility is based on a data-driven approach. Therefore, minimize functional data behaviour in the incoming project setup. Use the MaaS topology to define data-driven project use cases. Data-driven project design defines the data ingestion architecture and the data landing into the data-lake, and assists in understanding the best policy for continuous data feeding. Do not disregard this aspect: accurate data feeding is the core of Big Data approaches;
  • Move data analysis and functional aggregation to the data intelligence applied on the data-lake. During ingestion and data landing, data treatments have to be minimized. The agile Big Data approach considers two zones: the in-memory zone, based on data topology and supported on-premise by MaaS, and the data intelligence zone, based on functional analysis and programming working on sparse data.
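The two-zone split above can be illustrated with a minimal sketch (Python; the landing path and field names are invented for illustration): raw records land in the data-lake untouched, carrying only their on-premise topology class as metadata, while functional distillation runs later, only when intelligence is actually needed.

```python
import json
import time
from pathlib import Path

# Hypothetical landing zone of the data-lake (not a prescribed layout).
LAKE = Path("data-lake/landing")

def ingest(record: dict, source: str, topology_class: str) -> Path:
    """Land a raw record untouched; only minimal metadata is attached.

    Functional analysis is deferred to the intelligence zone, so the
    record body is written exactly as received (no up-front ETL)."""
    LAKE.mkdir(parents=True, exist_ok=True)
    envelope = {
        "source": source,                   # where the data came from
        "topology_class": topology_class,   # on-premise MaaS service class
        "ingested_at": time.time(),
        "payload": record,                  # raw, untransformed content
    }
    out = LAKE / f"{source}-{int(time.time() * 1e6)}.json"
    out.write_text(json.dumps(envelope))
    return out

def distil(predicate):
    """Intelligence zone: functional filtering over the landed data,
    run only when analysis is needed."""
    for path in LAKE.glob("*.json"):
        envelope = json.loads(path.read_text())
        if predicate(envelope):
            yield envelope["payload"]
```

Note that the topology class attached at landing time is what later lets the intelligence zone select data by service class rather than by parsing record contents.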

Still, minimize any approach based on “ab-initio” technology and software development. The Big Data ecosystem provides excellent platforms, and the agile MaaS approach helps to defer the final technology choice. Further, agile MaaS practice helps to clarify success and failure zones and to set expectations over time. This happens because, once services have been set by the on-premise topology, a link has been stretched between the data heritage and the data intelligence. There are no constraints between the raw data (documented or not) and the user experience that will leverage functional and business alignment. In the middle only the data-lake exists, continuously changing and growing, continuously supplying information to the data intelligence ending.

The Many-Particle approach
Today, more than 70 percent of the world's information is unstructured, not classified and, above all, misused: we are witnessing the greatest Marketing and Sales data myopia since these disciplines exist. Still, there is no awareness of the Big Data benefits for service and/or product companies, nor of how product companies can change their services based on goods production: great amounts of data, growing exceptionally, high entropy, unknown correlations and limited data usage. The concept of on-premise topology introduces services as data-driven aggregation states applied to given parts of the data-lake. But this is what happens in many-particle system instability (a yottabyte is 10^24 bytes, roughly 2^80 in binary terms). Big data storage dimensions bring the data-lake close to a many-particle system. This vision destroys any traditional approach to Marketing and Sales.

If we consider the big data-lake, it contains fast-moving content ordered by data affinity and mass correlation. Depending upon dynamic data aggregation, data topologies may change by tuning the on-premise data mapping. Consider that data-lakes are mainly fed through:

– ingestion, distillation and landing from content-based sources (datasources, datasets, operational and transactional DBs);
– ingestion and distillation from collaborative feeding (dynamic collections of large amounts of information on users' behaviour coming from the internet, direct and/or indirect).

Collaborative ingestion can also be managed as content-based when the time needed by the data intelligence ending has no strict constraints, which defines a third, hybrid method.
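A toy routing function (Python; the `Feed` fields are assumptions, not a prescribed schema) makes the three feeding methods concrete: a collaborative feed whose intelligence ending tolerates delay is batched like a content-based one, which is the hybrid case.

```python
from dataclasses import dataclass

@dataclass
class Feed:
    name: str
    kind: str             # "content" (DBs, datasets) or "collaborative" (behavioural data)
    strict_latency: bool  # does the data intelligence ending need this feed quickly?

def ingestion_method(feed: Feed) -> str:
    """Pick one of the three feeding methods for a data-lake feed.

    Content-based sources are always batched; a collaborative feed with
    relaxed latency can be handled the same way, yielding the hybrid method."""
    if feed.kind == "content":
        return "content-based"
    if feed.kind == "collaborative" and not feed.strict_latency:
        return "hybrid"
    return "collaborative"
```

For example, a transactional DB extract routes to "content-based", a live clickstream to "collaborative", and overnight web-history collections to "hybrid".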

This brief introduction tries to explain that the data-lake maps ab-initio topologies to services, but it may also classify the ecosystems the services are defined in and applied to. Services live in ecosystems, and ecosystems depend upon data aggregation (why used, where used, how used, who uses); just like aggregation states, big data density changes dynamically. These changes are a consequence of the datasources ingested, user experiences, customer behaviours, ecosystem interactions and, of course, business realignment. Marketing and Sales should change accordingly. But since the data-lake may grow by 40 percent per year (in line with the estimated worldwide rate of information growth, taking into account that unstructured data is growing 15 times faster than structured data – source IBM®), there is no way for a marketing and sales organization to get any (predictive) control, even where data warehousing and/or sophisticated traditional data mining and analysis are in place.

In any case, data growth will be greater than ever in the next years, so the variance of data aggregation in the data-lake will rise exponentially: this means many opportunities could be lost and, again, further marketing and sales entropy. Ab-initio topology through the agile big data approach, and functional programming applied to the data-lake, supply the best answer for prescriptive analysis on many-particle big data systems. In fact, the data-lake allows work on data cross-aggregation optimization, customer experience and aggregation states for service realignment with respect to the business ecosystems. Still, the data-lake is an extraordinary real-time “what-if set” for prescriptive scenarios, data processing assumptions and data risk propensity.


Fig.2 – The Data-Lake is quickly becoming a Data-Sea with many-particle-like data behaviour and dimension

Banking and Goods Production are two typical examples of agile Big Data implementation. Both supply services. Both are trying to align offers and business changes instantly and proactively. Banking and Financial services play a strategic role in relationship management and profitability performance for corporate groups, client companies and commercial banking networks. This is why financial applications need to be rapidly synchronized to ecosystem fluctuation states, as ecosystem participants change their behaviour everywhere due to local and international business conditions. The functional big data paradigm working on many-particle data aggregation is prescriptive with respect to unpredictable service transitions: it agilely realigns ecosystem service directions over the on-premise data topology mapping.

Goods production may tune services as a consequence of user experience by, for example, executing more focused and less time-consuming recommender systems. Goods production companies are in the race to provide personalized technical and commercial services, greater client loyalty and prescriptive offers starting as soon as the clients interact with or navigate the company website. With agile big data and the many-particle approach, goods production potentially increases user similarity through massive data-lake aggregations. Fast-moving data aggregations constantly feed the functional data intelligence for service realignment, and topological correlations reposition on-premise data similarities.
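As an illustration of how aggregated interactions can drive user similarity in a recommender, here is a minimal cosine-similarity sketch in Python; the interaction vectors and user ids are invented, and a real system would of course operate at data-lake scale:

```python
import math

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity between two users' aggregated interaction
    vectors (keys are items/services, values are interaction counts)."""
    shared = set(u) & set(v)
    dot = sum(u[k] * v[k] for k in shared)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def most_similar(target: dict, others: dict) -> str:
    """Return the id of the user whose aggregation state is closest
    to the target user's, i.e. the best source of recommendations."""
    return max(others, key=lambda uid: cosine(target, others[uid]))
```

Here, refreshing the interaction vectors as new aggregations land in the data-lake is what keeps the similarity scores, and hence the offers, aligned with fast-moving behaviour.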

Two different paces, the same objective: be prescriptive, understand as early as possible which data aggregation state is the most proper along the data-lake instability, and then contiguously realign product offers and service configuration and, consequently, keep oversight of the ecosystems. On-premise topology, gauged on data-lake volume, data velocity and variety, allows Marketing and Sales to tune in to effective data aggregation and promptly adjust services to the ecosystem.

Conclusion
Client sentiment and user experience behaviour analytics allow rapid changes to product offerings and customer support, which in turn enhance customer fidelity and business improvement. However, data is growing exponentially and business alignment has to be provided in increasingly decentralized environments. The agile MaaS approach, based on data-driven raw volume, data velocity and variety together with on-premise services topology, is a relatively low-cost and light model. Topology does not influence data treatment: data remains intact, while services integrity and classification drive business, user experience and ecosystem alignment. Accordingly, the agile practice and many-particle approach we introduced minimize data entropy together with data access time cycles everywhere, preserve data security and enhance user experience through functional visualization realignment.

Acknowledgment
I sincerely thank Paolo La Torre for his precious feedback on contents and encouragement on publishing this paper. Paolo is working as Commercial, Technical and Compliance Project Supervisor for Big Data planning and engagement directions in finance and banking.

References
N. Piscopo, M. Cesino – Gain a strategic control point to your competitive advantage – https://www.youtube.com/watch?v=wSPKQJjIUwI
N. Piscopo – ID Consent: applying the IDaaS Maturity Framework to design and deploy interactive BYOID (Bring-Your-Own-ID) with Use Case
N. Piscopo – A high-level IDaaS metric: if and when moving ID in the Cloud
N. Piscopo – IDaaS – Verifying the ID ecosystem operational posture
N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data
N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
N. Piscopo – Applying MaaS to DaaS (Database as a Service) Contracts. An introduction to the Practice
N. Piscopo – MaaS applied to Healthcare – Use Case Practice
N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
N. Piscopo – CA ERwin® Data Modeler’s Role in the Relational Cloud
N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle

Disclaimer – This document is provided AS-IS for your informational purposes only. In no event will the contents of “Agile Big Data and Many-Particle approach change Marketing and Sales effectiveness” be liable to any party for direct, indirect, special, incidental, economical (including lost business profits, business interruption, loss or damage of data, and the like) or consequential damages, without limitation, arising out of the use or inability to use this documentation, regardless of the form of action, whether in contract, tort (including negligence), breach of warranty, or otherwise, even if advised of the possibility of such damages. Specifically, any warranties are disclaimed, including, but not limited to, the express or implied warranties of merchantability, fitness for a particular purpose and non-infringement, regarding this document's use or performance. All trademarks, trade names, service marks, figures and logos referenced herein belong to their respective companies/offices.

Integrated “App to App Synergy” is key to Cloud ROI and enterprise social media strategy

The Forrester report ‘Total Impact of Google Apps‘ provides a framework for identifying the ROI from switching from a legacy mail and collaboration platform to Google Apps.

Surveying 200+ organizations with over 1,000 employees, Forrester identified the main business benefits required to build an ROI framework for moving to a Cloud-based service, which essentially broke down into two main categories:

  • IT infrastructure cost savings
  • Staff productivity increases

The IT infrastructure savings provide the hard numbers for justifying the business case, enabling cost reduction in areas you would expect, like software licences, hardware and IT administrator costs, and deliver an ROI:

  • Break even within 1.4 months
  • 329% risk-adjusted ROI
  • A Net Present Value of over $10m following an investment of $400k
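Forrester's exact model is not reproduced here, but the headline figures are standard financial arithmetic. The sketch below (Python, with illustrative cash flows that are assumptions, not Forrester's data) shows how NPV, ROI and break-even are typically computed:

```python
def npv(rate: float, cashflows: list) -> float:
    """Net Present Value: cashflows[0] is at time zero (the investment,
    negative); later entries are yearly benefits, discounted at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def roi(gain: float, cost: float) -> float:
    """Simple ROI: net gain over cost (0.5 means 50%)."""
    return (gain - cost) / cost

# Illustrative figures only, not Forrester's actual model.
investment = 400_000
yearly_benefit = 3_900_000                       # assumed constant benefit
flows = [-investment] + [yearly_benefit] * 3     # three-year horizon
project_npv = npv(0.10, flows)                   # assumed 10% discount rate

monthly_benefit = yearly_benefit / 12
breakeven_months = investment / monthly_benefit  # months to recover the cost
```

With benefits of this rough magnitude, the investment is recovered within the first couple of months, which is why the report can describe the productivity gains as a "free bonus" on top.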

For many organizations the catalyst for the switch is a degrading email system whose performance issues have become intolerable given the mission-critical status email now has for the organization.

So making the move solves this burning need and simultaneously maximizes cost savings by avoiding future costs of maintaining on-site IT infrastructure, such as winding down the need for equipment like VPNs.

App to App Synergy

This immediate payback means that the additional ROI benefits from staff productivity are a ‘free bonus’ on top; however, it's important to note just how beneficial these are and how the business can exploit them.

Indeed Forrester then makes the critical point that even greater business impact is enjoyed through increases in staff productivity, highlighting features and benefits like more efficient document collaboration and more effective virtual meetings.

These are more intangible ROI benefits, not as easy to quantify in immediate dollar terms; however, by considering them within a broader context of process improvement, and identifying how these personal productivity gains are component parts of broader workflow enhancements, senior executives can start to relate to them in terms important to them.

For example:

  • Sales teams producing more client proposals faster
  • Quicker resolution of customer service issues
  • Improved technical documentation

A key mechanism to achieving this productivity improvement is what Google describe as ‘App to App’ synergy, referring to the SSO (Single Sign On) environment that Google offers for the suite of collaboration tools. They highlight how staff enjoy time-saving productivity boosts by accessing email, VoIP and video conferencing all from within the web browser.

Social Workflow – 9x Process Improvements

This may seem like a relatively unimportant technical feature in the grand scheme of enterprise applications, however when you consider that one of the primary issues IT faces is user adoption of new applications you can see just how key it actually is.

Indeed these principles are crucial for organizations also considering their enterprise social media strategy – How to internalize Web 2.0 tools like building your own private Linkedin type site or encouraging better knowledge sharing through staff use of blogs and wikis.

In this article the originator of this Enterprise 2.0 concept Andrew McAfee described how these new social media technologies faced the challenge that use of email was so entrenched that the new tools would need to be 9x more useful for them to switch, not just a little better.

Given just how much staff work within email on a day-to-day basis, it is unlikely they ever would switch, and so Google addresses the situation by “bringing the mountain to Mohammed”: they embed these new collaboration methods directly into email.

There is no need to switch out to different social media and collaboration apps, they are built direct into the email interface itself.

This blending together is key to unlocking the transformational power of these technologies.

In an earlier white paper that builds on Andrew’s Enterprise 2.0 work, ‘Harnessing the Wikipedia Effect’, I described how it enables ‘Knowledge Process Management’, referring to blending together the previously separate applications for Knowledge Management, those for Business Process Management and also the communications and collaboration apps.

The trend is also very effectively described in this article – Enterprise Apps Get Social.

What Google Apps is offering is this powerful effect distilled into an easily accessible online service, the essence of the business value of Cloud Computing.

GovCloud 2.0 – Harnessing Cloud Computing for Digital Government Transformation

Recently the USA Government announced a Digital Government strategy.

Our new white paper ‘GovCloud 2.0’ describes how Government Cloud Computing can provide an accelerating platform for implementing this policy, covering topics such as:

  • Business Transformation – Harnessing social media and collective intelligence models to reinvent government to be more collaborative.
  • Procurement Frameworks – Cloud procurement best practices, such as the UK’s G-Cloud.
  • Digital Identity – How the Identity Ecosystem will secure and streamline government workflows.
  • Shared services architecture – Applying Cloud Computing models to better share infrastructure costs between collaborating agencies.

Download 25-page PDF

The Media Cloud – Transformation of news and media

Issue one of our e-magazine TRANSFORM focused on Government, Issue two on E-Health.

The next installment will look at the media industry and the disruptive effect Cloud will have on media firms such as newspapers, magazines, the entertainment industry, PR firms and more, covering topics such as:

  • Cloud Gaming!
  • Case studies of media firms embracing and mastering the Cloud
  • Disruptive business model capability – How Cloud can enable new media business models
  • Integrated social media architecture
  • Cloud services showcase – Presentations of the latest relevant Cloud services innovations: Streaming, CDN capacity, etc.

This will also be the headline theme of one of our local Canadian Meet Ups, where entrepreneurs can meet with industry executives to explore these ideas in more detail.

How to Become a CIO (Chief Innovation Officer) – Part 2

Part 2 of our Cloud CIO series looks at Shared Services and Outsourcing, by focusing on an example case study of the Province of Ontario.

One of our key goals of the CCN is to showcase leading CIO pioneers of Cloud adoption in Canada, and Ontario is one example, led by visionary CIO Dave Nicholl.

Citizen Service Innovation

Furthermore this case study is very powerful because it also highlights how a shared services approach is an enabler of Part 1 of the series – Technology-enabled strategy.

For government this refers to improving citizen services through better online functionality, and in the case of Ontario they are utilizing powerful Search software (Sharepoint FAST) to enable government agencies to build Service Locator functions into their web sites.

This improves citizen access to government information, greatly improving their online experience and making agencies easier to deal with.

This is part of Ontario’s strategy to leverage Cloud Computing to improve online service delivery. As documented here Ontario is a flagship case study of the Microsoft Private Cloud platform, where CIO Dave Nicholl describes:

Cloud computing is the way of the future. This partnership was a first successful step in the direction toward a better way of using and delivering online services.

BPaaS – Business Process as a Service

This Private Cloud implementation demonstrates the relationship between new Cloud Computing architecture and the role they can play in facilitating new shared service outsourcing models.

Specifically Ontario is pioneering an approach known as BPaaS – Business Process as a Service.

They are using the software to power a ‘Service Location Finder’, a search engine capability that can match a request for ‘Drivers Licence’ to nearby offices that provide this service.

One central installation of FAST software is configured to provide multiple service finders to different departments, providing this specific business process as a shared service.

This means that each department not only avoids the hassles of IT installations and maintenance, but even further they avoid the complexities of software configurations too – Instead they simply consume service at the business process level.

No Wrong Door – Critically this helps unify information from across multiple different agencies, making online service quicker and easier for citizens.  A ‘No Wrong Door’ policy means that a citizen can search on any one of these different sites, but will always get a consistent answer then and there, not be shuttled between departments.

For example searching for Drivers Licence on any number of different sites will link them to the relevant services such as applications, renewals and related processes that are spread across multiple sites.
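The ‘No Wrong Door’ shared service described above can be sketched as a single index queried by every department site. The Python below is purely illustrative (office and service names are invented, and SharePoint FAST is a far richer search engine than a dictionary lookup):

```python
# One central index powers the 'Service Location Finder' for every
# department site, so any "door" returns the same answer.
# All department, service and office names here are invented.
SERVICE_INDEX = {
    "drivers licence": [
        {"service": "Licence application", "office": "ServiceOntario - Downtown"},
        {"service": "Licence renewal", "office": "ServiceOntario - North York"},
    ],
    "health card": [
        {"service": "Health card renewal", "office": "ServiceOntario - Downtown"},
    ],
}

def locate(query: str) -> list:
    """Match a citizen's query against the shared index.

    Every department site calls this same shared function, which is
    what implements the 'No Wrong Door' policy."""
    q = query.lower().strip()
    return [hit
            for key, hits in SERVICE_INDEX.items()
            if q in key or key in q
            for hit in hits]

# Same result regardless of which site runs the search:
assert locate("Drivers Licence") == locate("drivers licence")
```

Each department consumes `locate` as a service at the business process level; none of them installs or configures the search software itself.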

Conclusion

Shared services is a very well established model for consolidating the needs of multiple organizations into one service delivery unit, in a manner that saves money for all of them, and the type of activity for the more business-oriented CIO to engage in to drive cost reductions.

When key technologies like online Search are implemented this way, there is a double benefit of also increasing customer-centric innovation, a very powerful formula.

How to become a CIO (Chief Innovation Officer) – Part 1

On a previous blog I asked ‘Does Your City Have a CIO (Chief Innovation Officer)?’, referring to our recent Cisco telepresence session featuring Jay Nath, who has been appointed to this role for the City of San Francisco.

Of course the CIO initials also stand for Chief Information Officer, and this duality sharpens the focus on the question of how this role might evolve. There is lots of talk about how the CIO might become more strategic, more of a boardroom-level business leader capable of driving transformational change.

For example could it be the technology director who takes on this innovation role? Or is this simply too much of a stretch for a role that is dealing with operational level detail?

I would say yes, it’s certainly possible. Innovation isn’t a magic art, instead it’s like any other function of business, it’s repeatable via best practice models, and so it can be baked into the daily operations of the IT team, indeed doing so is key to achieving this business focus.

The Strategic CIO

For CIOs looking to develop a new presence at the boardroom table, it means elevating well above the level of installing and configuring servers and software, and instead being able to relate to the organization in strategic terms, achievable through capabilities in key areas:

  1. Technology-enabled Strategy – Growing the organization through IT
  2. Outsourcing and Process Automation – How to design and leverage shared services
  3. Portfolio Transformation – Managing change as an investment portfolio

1. The Cloud Strategy Map

The central backbone to this evolution is technology-enabled strategy: What role can IT play in significantly enhancing the business model for the organization – How can it help increase sales, improve efficiencies, reduce costs, etc.?

Proactively identifying these opportunities, and leading their implementation, is the primary hallmark of a strategic CIO, rather than always simply reacting to requests.

Robert Gold documents a repeatable maturity model for this in his article Enabling the Strategy-Focused IT Organization.

This covers the issues that arise that cause business management to perceive IT to be overly expensive and failing to align these costs with benefit to their business units.

Fundamentally Gold defines a scale where at one end IT is perceived and managed as a cost, and at the other it's leveraged as a strategic enabler, able to achieve competitive advantage. Central to this is the ‘Strategy Map’, a framework for achieving this strategic view of IT, and in our ongoing Cloud CIO series we'll explain how this can be tailored and adapted specifically for Cloud Computing.

Coming Next

In Parts 2 and 3 we’ll explore these other two areas: Outsourcing and Process Automation, and Portfolio Transformation.

Cloud Migration Management – White paper

Our Cloud Migration section is intended to provide a full suite of best practice procedures for planning and implementing migration of legacy IT and business processes to the Cloud.

This includes an updated white paper, Cloud Migration Management, that sets the initial CIO-level strategy, covering topics such as:

  • How moving to ITaaS should be linked to business process improvement
  • Achieving a level 5 BPM maturity model
  • Architecture-Driven Modernization

Read our white paper on Cloud Transformation best practices. (8-page PDF)

CIO as Innovator: Enabling the Strategy-Focused IT Organization

We are now in process of launching our flagship best practices publication, the Cloud CIO guide @ CloudCIO.info.

While the technology aspects of Cloud Computing are important, what is actually most noteworthy about the trend is the transformation of the role of the CIO, to become a board-level change agent specializing in innovation.

A headline example is Government. Traditionally you would consider the public sector to be the least innovative, to be the last to adopt any form of new and disruptive technology and certainly not to be a leader of innovation.

However the nature of the trend is illustrated by just how much the opposite is now true.

Digital Government: Future First Architecture

Recently the new Whitehouse CIO Steven VanRoekel announced their ‘Digital Government’ strategy, reflecting this broader perspective.

Highlighting this innovation focus, it has evolved Cloud First into what they now describe as a ‘Future First Architecture’ that brings together shared service Cloud Computing, XML Web APIs, mobile apps, Big Data, and more. It's nicely explained in this GovLoop guide.

The recent US Innovation Summit highlights this new role of CIO as innovator; as described in this article, US CTO Todd Park explains:

“Before coming to government, I spent almost my entire career as an entrepreneur,” Park told FedScoop before his presentation, “and in that time I've never been surrounded by as many who have that same entrepreneurial drive and spirit than I do right now in the federal government.”

The Cloud Strategy Map

Other industries and their CIOs can repeat the same effect, and there are structured methodologies to help with this process.

Robert Gold documents a repeatable maturity model for this in his article Enabling the Strategy-Focused IT Organization. This covers the issues that arise that cause business management to perceive IT to be overly expensive and failing to align these costs with benefit to their business units.

Fundamentally Gold defines a scale where at one end IT is perceived and managed as a cost, and at the other it's leveraged as a strategic enabler, able to achieve competitive advantage. Central to this is the ‘Strategy Map’, a framework for achieving this strategic view of IT, and in our ongoing Cloud CIO series we'll explain how this can be tailored and adapted specifically for Cloud Computing.

Cloud Transformation through ‘Future First’ Architecture

The backbone to the consulting services from the CBPN is the CTM best practice – Cloud Transformation Management, and the primary outputs from these engagements will be CTM Roadmaps.

CTM Roadmaps enable migration to Cloud Computing through Shared Service-based models.

Shared Service Roadmaps

The fundamental role of the CBPN is to catalogue Cloud Best Practices, especially those from the USA Government, where NIST and CIO.gov regularly produce the thought leadership materials defining Cloud designs.

Forbes.com recently reported that the USA Government has so far saved $5 billion from their ‘Cloud First’ initiative to drive uptake of Cloud Computing in the American public sector, with a full $12 billion still to come.

As well as reporting these savings they have also continued publishing their best practices underlying these advances, most notably defining Shared Services as a core ‘target architecture’ to direct these Cloud migrations towards.

In this White House paper ‘Federal IT Shared Services strategy‘ (16-page PDF) it explains how Shared Services is one of the key design foundations of the 25-point Cloud First plan that is achieving these massive cost savings.

It also explains that agencies are expected to produce Roadmaps for how they will migrate to shared service models as part of embracing Cloud:

Develop Roadmaps for Modernization & Improvement of Existing Services – Each Managing Partner will develop a roadmap for improvement of existing services. Agencies and OMB will work together to monitor progress toward these goals throughout the year.

Helping agencies produce these roadmaps quantifies how the Cloud channel ecosystem will evolve, involving consulting firms and systems integrators, as well as Cloud providers, where this group acts to:

  • Conduct Readiness Assessment – Analyze the existing IT estate, produce inventory reports of IT components.
  • Cloud PfM – Cloud Portfolio Management, using PPM techniques to financially manage the migrations to Cloud providers
  • Cloud Service Brokerage – Leveraging SaaS options for service brokerage to plan, cost up and manage the migrations.

Future First target architecture

This guide also provides a high level vision for a single ‘target architecture’, the end goal intended from this business transformation exercise, what they describe as a ‘Future First’ architecture.

This is a new common model that encourages agencies to embrace and drive business benefit through shared service transformations as well as IT ones, including:

Each shared IT service offering must have a business and technology architecture that fits the operating model and supports new Future First Federal EA methods. There are a number of general design principles that apply to Future First architectural designs for shared IT services, including:

  • Multiple consumers for each service, with minimal customization;
  • Process standardization (commercial product/workflow adoption);
  • Web-based solutions with standardized application interfaces;
  • Object reuse, machine-readable data, and XML data formats;
  • Cloud-based application hosting and virtualization of servers;
  • Security controls and continuous monitoring of service operations;
  • Configuration management and version control.

Recommended paper : US Federal Strategy for the Safe and Secure Adoption of Cloud Computing

This 19 page PDF document from Bill Perlowitz of Apptis provides a comprehensive summary of “G-Cloud Computing”, from an American point of view, their ‘FCCI’ – Federal Cloud Computing Initiative.
