Channel: Latest News from The Cloud Computing Newswire

Wasabi Hot Innovations Tour: How "Hot Cloud Storage" Changes Everything!


Digital storage requirements are growing exponentially. Budgets simply can’t keep up, and existing Federal Data Center Consolidation Initiative (FDCCI), “Cloud First” Policy, Federal IT Acquisition Reform Act (FITARA) and Modernizing Government Technology (MGT) Act challenges aren’t going away. On top of all that, e-discovery, data privacy and digital forensics have made rapid data access and immutability absolute must-haves.
  • What are your plans for addressing these issues?
  • How can you manage the generation of even more unstructured, IoT and “Big” data?
  • Will you get through your next IG review?

Hot cloud storage changes everything. Here’s how.
  • 80% Cheaper than the cheapest
  • Faster than the fastest
  • Safer than the safest
  • Unlimited free egress – no additional charges for downloads from the cloud
  • Available as public or private cloud options
  • Immediate access and built-in immutability
  • HIPAA, HITECH, CJIS, SOC 2, ISO 27001, and PCI-DSS certified
  • FedRAMP certification in progress

“Hot cloud storage” is currently being used by Cloud Constellation Corp., Acembly Television Broadcasting, 7 Wonders Cinema and multiple government organizations. Maybe your team should consider it as well.
Featuring:

Kevin L. Jackson, CISSP®, CCSP®

Also quoted by SAP, AT&T, Accenture, Ericsson, Forbes, Dell and others.



Individually scheduled consultations available with on-site experts


( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)





Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)





#DigitalTransformation Means Hybrid IT and Multipath


The cloud is ubiquitous in today’s business world. This operational model is changing both data center operations and application development processes across multiple domains. As the manager of data centers for some of the most significant companies in the world, IBM has observed this profound shift in the strategic outsourcing marketplace. These shifts are being driven by:
  • Financial changes designed to minimize CAPEX and maximize OPEX.
  • Acquisition changes that prioritize as-a-service IT consumption and multisource procurement.
  • Operational changes that shift from data center ownership and physical management toward the virtual management of IT services from data centers owned by AWS, Azure, IBM and Hosted VMware providers.

What’s Driving Innovation?

With these changes, organizations are challenged to update their internal processes in a manner that addresses changing people skills, new programmable IT infrastructures and the multicloud world. Application development best practices are also transitioning, which is upending enterprise software deployments. These new development strategies are driving companies away from:
  • Monolithic designs, toward designs that aggregate the use of microservices.
  • Agile and Waterfall management models, toward fully automated DevOps.
  • Physical infrastructures and virtual servers, toward containerized environments.
While these transformations are seemingly inevitable, companies embark on their individualized journeys from a variety of starting points and with three different goals. The first goal is to support legacy enterprise application enhancements and deployments that tend to start and remain in an on-premise environment. The second is to implement new application designs that are primarily cloud-native for deployment to public cloud environments. In the face of this bifurcation, data center transformation projects represent the third goal. These tend to start with the automation of infrastructure components, with an end target of fully automated and containerized IT service provisioning and consumption within a DevOps operational environment.
Attaining these three operational goals is why cloud computing has driven rapid innovation, causing unprecedented disruption to the IT operating model. This disruption is revolutionizing how resources are consumed, delivered and governed. Cloud platforms, in fact, enable digital transformation acceleration along all three of these digital transformation vectors simultaneously by helping to answer questions like:
  • How to design and build new apps.
  • How to modernize existing apps with cloud services.
  • How to optimize the IT owned and operated by the enterprise.

Pursue Multipath Digital Transformation

Efficient and effective digital transformation depends on hybrid IT adoption. Transition requires building consensus across multiple constituencies and a focused change management strategy across just about every business process. The change management vectors must address:
  • How IT service consumption is changing into a commerce-like transaction similar to buying songs from multiple online streaming music vendors;
  • How IT service delivery is changing from service requests that initiate workflows towards fully automated functions governed by IT corporate policies;
  • How IT service governance is changing from incident keyed governance after-the-fact to the proactive setting and enforcement of policies that control consumption based on individual business unit usage models and cloud adoption strategies.
The target operating model is one that uses a multicloud federated management model that leverages IT service abstraction. The model lets businesses select and consume unique services and capabilities from each cloud service provider. The Gen 2 IT operations model depends on a high level of infrastructure visibility and governance that controls assets and usage patterns, a common DevOps consumption interface, the implementation of an enterprise scale DevOps model and, finally, the management of IT as a stand-alone business unit.
To pursue a multipath digital transformation to hybrid IT operations, you must:
  • Run a strategic portfolio analysis: prioritize based on importance to customer experience, and focus on high-priority apps.
  • Develop your app-by-app migration plan: create a focused SWAT team for each priority application.
  • Create a customer-obsessed cloud team: Think products and not projects (short iterations with quick customer feedback).
  • Pursue relentless automation: This requires standardization.
  • Re-engage with your trusted partners: External advice in the early stages of a cloud migration can kick off the required culture change.
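The portfolio-analysis and migration-planning steps above can be sketched as a simple scoring exercise. This is a hypothetical illustration, not a method from the article: the application names, scores, and weighting are invented for the example.

```python
# Hypothetical portfolio prioritization: rank applications by importance
# to customer experience, lightly penalizing high migration effort.
apps = [
    {"name": "claims-portal", "customer_impact": 9, "migration_effort": 4},
    {"name": "hr-intranet",   "customer_impact": 2, "migration_effort": 2},
    {"name": "quote-engine",  "customer_impact": 8, "migration_effort": 7},
]

def priority(app):
    # Weight customer experience most heavily, as the first step recommends.
    return app["customer_impact"] * 2 - app["migration_effort"]

migration_order = sorted(apps, key=priority, reverse=True)
print([a["name"] for a in migration_order])
# ['claims-portal', 'quote-engine', 'hr-intranet']
```

Each app at the top of the resulting list would then get its own focused migration team, per the app-by-app plan.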
The IT world is changing fast, and the end point of this transition is a hybrid IT platform. This environment has on-premise data centers operating collaboratively across a multivendor, multicloud estate. The key to success in this world is to proactively set and enforce policies that control consumption based on individual business unit usage models and cloud adoption strategies. Get ready now, because it’s go time for multipath digital transformation.

This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.







Taking the Canadian Insurance Industry Digital



“Digital disruption isn’t just for hip start-ups. Incumbents can not only compete but actually lead radical industry change if they pay attention to the way their business model is shifting and act boldly in response.”

Founded in 1871, Economical Insurance operates within the old and relatively static Property and Casualty (P&C) insurance industry. According to the American Insurance Association (AIA), the modern insurance industry developed primarily in England after the Great Fire of London in 1666. Today, this insurance industry segment protects against risk in two primary areas:
  • Protection for physical items, such as houses, personal possessions, cars, commercial buildings, and inventory (property); and
  • Protection against legal liability (casualty).

Property insurance provides for losses related to a policyholder’s person or property while casualty/liability insurance protects a policyholder against the claims of others.

Economical sees itself as the insurance company “For Canadians by Canadians.” They also tout themselves as a company that imagines bigger and better things. Their credo is to focus on the customer first, and this is why Economical is one of this country’s leading and most trusted P&C insurance companies.

With this background and belief, it may come as no surprise that innovative internal thinking led to the conceptualization of a new business model.  One targeted to a market segment that was comfortable with advanced technology. They saw these customers as underserved because traditional service channels were not meeting their unique needs. By feeding on a passion and desire to make a difference in everything they do, Economical business and technology teams partnered to build a new channel built entirely on digital processes.


The vision they saw was Sonnet. Launched in 2016, this new brand brought an innovative new insurance experience to Canadians who prefer to purchase insurance directly online. Using sophisticated technology and real-time analytics, customers are now able to instantly get customized quotes, purchase a policy, and make account updates online at any time.


The intent was to leverage and aggregate data from multiple sources, apply real-time analytics and provide a unique personalized recommendation based on the customer’s profile. The twin challenges were finding an evolutionary and transformative path that wouldn’t crater the existing business and a savvy business technology partner that wouldn’t waste the company’s time and money. That partner was IBM.

The most critical element in this effort was speed. With time to market being so important, a public cloud-based solution was the answer. The IBM public cloud allowed Economical to deploy quickly, take advantage of automation to reduce errors, and gain business flexibility.
Another key component of this partnership was software development that used the Kanban agile methodology. Kanban is based on three principles:
  1. Visualize what you do today (workflow) to see all development items within context.
  2. Limit the amount of work in progress (WIP) to balance the flow-based approach and prevent team over commitment.
  3. Enhance flow by using a priority process that pulls in development backlog items as soon as a previous task is completed.
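The three principles above can be sketched in a few lines of code. This is a minimal illustrative model, not anything from Economical's actual toolchain; the task names and WIP limit are invented.

```python
from collections import deque

class KanbanBoard:
    """Minimal sketch of the three Kanban principles: visualize the
    workflow, limit work in progress (WIP), and pull the next backlog
    item as soon as a task completes."""

    def __init__(self, wip_limit):
        self.wip_limit = wip_limit   # principle 2: cap concurrent work
        self.backlog = deque()       # leftmost item = highest priority
        self.in_progress = []
        self.done = []

    def pull(self):
        # Principle 3: pull from the backlog only while under the WIP limit.
        while self.backlog and len(self.in_progress) < self.wip_limit:
            self.in_progress.append(self.backlog.popleft())

    def complete(self, item):
        self.in_progress.remove(item)
        self.done.append(item)
        self.pull()                  # a finished task frees a WIP slot

    def visualize(self):
        # Principle 1: see every item within its workflow context.
        return {"backlog": list(self.backlog),
                "in_progress": list(self.in_progress),
                "done": list(self.done)}

board = KanbanBoard(wip_limit=2)
for task in ["quote API", "policy UI", "analytics feed"]:
    board.backlog.append(task)
board.pull()                  # only 2 items start; the third waits
board.complete("quote API")   # finishing one pulls the next in automatically
```

After the final call, "quote API" is done, and the WIP slot it freed is immediately filled by "analytics feed" from the backlog.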


When compared to the well-known Scrum model, Kanban promoted continuous collaboration and encouraged active, ongoing learning and improvement by defining the best possible team workflow.


Economical pursued a hybrid cloud strategy to leverage the previous enterprise IT service investment. Significant time was therefore spent in the design phase because the right design required decisions on what went in the cloud and what stayed on premise. Those decisions, in turn, drove data flow, security control locations, and any required infrastructure resiliency improvements. The hybrid strategy also brought with it a need to integrate with legacy systems. IBM was able to meet this and associated requirements to partner with other members of Economical’s business ecosystem.

In the end, the key to an on-time launch of Sonnet was primarily teamwork and a real partnership. Sonnet thoroughly disrupted the P&C Insurance market in Canada. It successfully challenged the status quo and demonstrated that digital transformation was possible with the right technology partner. This success has spotlighted Economical as an industry leader and business innovator. Customer reaction has been extremely positive, and the business has been scaling with a healthy and steady growth trajectory.

One of the many lessons learned by Economical during this process was that disruption is not a one-time event. Organizations must continue to re-invent themselves, or the competition will disrupt them. This new internal operating model has led to new product suites, new offerings, simplified pricing and new internal workflows. The service offerings will be built on top of the infrastructure put in place for Sonnet and will change the way Economical works with its brokers. Although a bi-modal, two-speed IT operations model was initially accepted, this next step will use cloud computing to connect cloud-native components to core legacy assets. This “real IT transformation” will use additional efficiency to fund future transformation initiatives through a long-term partnership with IBM.



This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.







Procurement in a Virtual Business World


Today, companies are undergoing a dramatic change in their environment and processes. Many group these changes together as “Digital Transformation,” but that industry buzzword fails to describe the essential details associated with this change. A critical and often underappreciated area is procurement and supply chain management.

While these areas are not generally under the bright lights of business, these processes grind the gears of business success. In building new service procurement processes and virtual supply chains, the Chief Procurement Officer must:
  • Identify and adopt a framework for electronic procurement that uses virtual supply chains that incorporate existing state-of-the-art public and private sector electronic procurement and supply chain systems;
  • Address financial and acquisition changes designed to minimize capital expenditures while simultaneously maximizing operational expenditures;
  • Build and deploy an electronic procurement system that uses virtual supply chains incorporating multimedia, distributed workflow management, document handling, and electronic contracting procedures;
  • Educate and train users on new processes and systems associated with virtual supply chains and electronic procurement systems; and
  • Build, extend and expand supply chain collaboration and electronic data exchange.

All this must also address the new and sometimes sweeping legal and regulatory requirements around data sovereignty and privacy.

Wedded at the hip with the CPO, the CIO must also find a path through “Digital Transformation.” The operational and deployment challenges faced there include:
  • Prioritizing as-a-service information technology consumption and multi-source procurement
  • Transitioning from data center ownership and physical management toward the virtual management of IT services delivered from third-party data centers;
  • Dismantling of monolithic software application designs into modern solutions that aggregate internal and externally delivered microservices; and
  • Retraining and transitioning staff away from Agile and Waterfall management models toward fully automated DevOps.
Taken together, these activities start to describe the many details associated with acquisition and procurement in a virtual business world. Traditional and legacy management systems comprised of disjointed point solutions cannot provide a holistic picture of the modern virtual supply chain and hybrid IT ecosystem. CIOs looking to blend legacy data centers with multi-vendor cloud solutions and CPOs striving to create higher value for the business by harnessing new and disruptive digital technologies must join together. This new age relationship must focus on delivering to the enterprise:
  • A unified source-to-pay platform that can provide seamless information, process, and workflows; easy integration; improved data visibility and integrity; and increased compliance, utilization, and collaboration;
  • The ability to leverage technologies like artificial intelligence, the blockchain, and robotic process automation (RPA) that can remove humans from repetitive or mundane tasks such as managing contracts, tracking expenditures, and assessing supplier performance; and
  • An open, cloud-based procurement platform that enables rapid innovation while actively supporting the shift from cost control and spend management to value creation and enterprise growth.

When building such a relationship, digital procurement transformation tools provide the pathway to effective and efficient procurement in today’s virtual business world. Unified source-to-pay platforms like SMART by GEP® feature AI-based analytics and automation suitable for complex enterprise supply chains. These advanced systems also work with enterprise resource planning systems such as SAP or Oracle.


Unified Source-to-Pay Platform to Enable and Accelerate Digital Procurement Transformation.



This post is brought to you by GEP and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of GEP.







A Path to Hybrid Cloud



Cloud computing is now an operational reality across every industry. Organizations that fail to leverage this economic, operational and technology consumption model are merely consigning themselves to irrelevance. The rapid acceleration of cloud adoption has now ignited a push for the Hybrid Cloud/Hybrid IT model, in which enterprises simultaneously consume information technology services from private clouds, public clouds, community clouds and traditional data center sources. While most see this as a reasonable evolutionary path, some see staying with a single provider or a slow, gradual transition as a more prudent path. I strongly disagree.

A casual observation of the information technology marketplace reveals that data is continuing to grow at an exponential pace. We have also moved from the management of structured data, through joint analysis of structured and unstructured data, into an environment where real-time analysis and reporting of streaming data is essential. We are also in an environment of stricter data management regulations and national data sovereignty laws that, if violated, introduce the possibility of punishing remedies and fines. This rapid progression has also driven an exponential increase in required (and desired) information technology services. Cloud service providers meet this need through the innovative creation and deployment of API-accessible, immediately consumable data manipulation services. Enterprise IT organizations have shown themselves to be incapable of matching the blistering increase in the number and breadth of these broader marketplace services. It’s not cost-effective, or even desirable, for them to try.

Business owners, on the other hand, see these new services as necessary competitive tools.  They can’t wait for the required internal governance processes or IT investment decisions. This tension has been the cause of internal conflict between IT and business and also the underlying cause of Shadow IT, a tendency to stealthily procure and use cloud services without internal IT knowledge or approval. The organizational business goal must be accomplished and to meet this imperative, enterprise IT must drive a radical shift from legacy ideas and culture towards embracing the Hybrid Cloud/Hybrid IT model.

Enterprise IT management must face reality.  The development and rapid execution of a business supportive IT strategy require a meaningful conversation between IT and business leaders on targeted new business opportunities and any associated differentiating business strategies.  IT leadership must then select the appropriate IT service mix and sources for each necessary business process. This multi-vendor, multi-source selection process should point to the needed Hybrid Cloud/Hybrid IT target end state. The path towards realizing that target should go through at least two pilot processes. One through which success delivers IT operational efficiency and savings and a second that promises new revenue streams for the business. Ideally executed in parallel, this approach will:
  • Train and educate your IT team on the cloud model and required business processes;
  • Build much-needed rapport and collaboration between the business team and IT team;
  • Accelerate attainment of the Hybrid Cloud/Hybrid IT target end state; and
  • Effectively move the organization down the necessary digital transformation path.


Enterprises that have been successful in completing this transformative process include:
  • CarMax – a Fortune® 500 company with more than 175 stores across the US and over 6 million cars sold;
  • IHG - one of the world’s leading hotel companies, with more than 375,000 people working across almost 100 countries to deliver True Hospitality for everyone; and
  • Smithfield Foods - the world’s largest pork processor and hog producer, committed to providing good food in a responsible way.

In completing their path to hybrid cloud, Smithfield Foods realized:
  • Application response time dropping from 600 ms to 70 ms;
  • No unplanned IT outages;
  • Increased visibility into business key performance indicators;
  • A transition from a reactive to a predictive decision making culture;
  • A 60% reduction in required IT resources; and
  • The desired enablement of business innovation.
To learn more about starting your company’s path towards the hybrid cloud, take a look at the Microsoft Office Modern Workplace episode on hybrid cloud. In it, Julia White, Corporate Vice President of Azure Marketing at Microsoft, and Tim Crawford from AVOA address how organizations can build the right cloud strategy for the business and its impact on digital transformation.



This post was sponsored by Microsoft.







Human-Led Collaboration with Machines


When charged with managing large and complex efforts, an overarching project management task is risk assessment. It involves documenting the current situation, comparing it to the past, and understanding the odds of the past repeating itself. Since the past may never repeat itself, however, an insightful project manager also imagines the odds of any possible future outcomes.  Then the odds of past outcomes repeating themselves and the odds of new future outcome are tempered with the PM’s possible actions.  Executing this repetitive and continuous process is just one area where human-machine collaboration can change the future.

Machines do repetitive tasks well. They have perfect recall. Their forte is being able to record and document what has happened and, from that, interpolate what will happen. They correlate the past and calculate the likelihood, and the odds, that those things will happen again in the future.
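The kind of past-correlating calculation described above can be illustrated with a tiny frequency-based odds estimate. This is a hedged sketch only: the outcome history and labels are invented for the example, and real project-risk tooling would be far more sophisticated.

```python
from collections import Counter

def outcome_odds(history):
    """Estimate the odds of each past outcome repeating, using simple
    relative frequency over the recorded history."""
    counts = Counter(history)
    total = len(history)
    return {outcome: n / total for outcome, n in counts.items()}

# Invented outcome history for six past projects.
history = ["on_time", "late", "on_time", "on_time", "late", "on_time"]

odds = outcome_odds(history)
# 'on_time' occurred 4 of 6 times, 'late' 2 of 6 times.
```

A project manager would then temper these machine-computed frequencies with human judgment about futures the history has never seen, which is exactly the division of labor the article describes.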


Humans imagine things really well. While their recollection of the past can be flawed, their creativity can be breathtaking. They intuit and sometimes see things without those things actually being there. Even with these flaws, though, they can apply imagination to the whitespaces of reality and change the future. Those uniquely human capabilities need cause and structure, a skill referred to as common-sense reasoning.


Since machines, so far, have been unable to exhibit an ability to use common-sense reasoning, this observation becomes the heart of human-machine collaboration. Human-machine collaboration not only supports risk-assessment tasks but can also help in:
  • Resource management
  • Prediction
  • Experimentation.

By augmenting human workers with machine intelligence, the project manager can gain access to more and different analysis. More robust analysis enables more informed decisions, the anticipation of dependencies, and better leadership. Improved leadership is also why leading organizations have reshaped the use of rapid analysis, flexible organizations, and team communication tools.

Cisco Webex Teams was developed to support this shift. Focused on bridging the gap between humans and machines, it uses human priorities to plan and schedule tasks. Webex Teams can also be used to document resource levels, record resource use, and alert humans should any previously set limits be breached. Using artificial intelligence and machine learning, this collaborative tool can even provide schedule and planning option predictions.

By enabling human-machine collaboration, Cisco Webex Teams not only sets a rapid pace towards the future but delivers some of that future today by:
  • Bringing team members together more easily through advanced messaging capabilities and content sharing.
  • Enhancing productivity during team-based meetings by allowing anyone in a space to schedule, start, and record meetings that can include up to 75 video users.
  • Providing the capability to share a whiteboard application or use Cisco Webex Board’s all-in-one wireless presentation, digital whiteboarding, and video conferencing functionalities.
  • Calling team members using the app, an IP phone, or a conference-room video device.
  • Reducing meeting setup friction with integrations to streamline workflows and bots to automate additional actions.

Cisco Webex Teams enables human-led machine collaboration, a partnership in which humans set the strategy and machines execute the tactics.

This post is brought to you by Cisco and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Cisco. 










Sensomorphic


A Google search for “cloud computing” returns 240 million results in 1.06 seconds (as of May 28, 2018). With that much information available, and that many conversations active around the globe:
  • Do we really know what cloud is?
  • Are we confident in knowing what cloud can do?
  • Can we explain why the cloud is changing everything?
If 10 people were asked what cloud computing is and why it is important, we would get at least 12 different answers.
  • Where is the disconnect?
We know leaders want it. CFOs support it. Strategists recommend it. Technical teams request it. Users demand it. Isn't cloud easy? Cloud is often associated with acceleration, cost control, added flexibility, increased agility, lower complexity, and rapid innovation. It takes an incredible amount of work and planning to be simple. CIOs are stating that cloud skills are a top hiring priority in 2018.
  • What do we need to stay relevant?
  • How do we keep up with an industry that is changing every day?
Cloud computing is changing strategies and enabling innovation at every turn. Cloud is changing IT economics. Cloud is blurring the lines and breaking down traditional silos. Cloud is blending roles and redefining boundaries. Regardless of which industry we are in, or the position we hold, cloud computing is changing everything; how we work, how we play, and how we communicate.

Cloud computing is a Transformation, not a Migration.

Migration seems easy because it can be described as a series of things that get done. Migrations seem tangible: from this to that, from here to there. Transformations, interestingly, are mental and emotional. Transformations require a change in mindset. Transformations require constant data that can be continuously compared to expose insights and establish perceived value.  Migrations are planned and executed. Transformations are adopted. Without adoption, transformation fails. Adoption requires a change in mindset, often created from a continuous digestion of highly valued relevant data and insight. This means continuously sensing the environment and continuously changing your actions to better align with goals, which are also changing continuously. We, the authors, call this being:

Sensomorphic.


Businesses and people tasked with adapting and driving change must become sensomorphic. Today, many are flooded with data, yet remain uninformed. Many know they are in the wrong place, yet struggle to know where they are. The only sustainable path for positive transformation is to become sensomorphic. In the world of cloud computing, this means being sensomorphic across many domains, simultaneously. The sensomorphic domains are:

Cloud adoption is a core component of digital transformation. Organizations must align modern technology and current economic models to business strategy. Transformation requires a new approach that balances cost and technology choices with company direction and client consumption models.



Architecting Cloud Computing Solutions presents and explains many critical cloud solution design considerations and technology decisions required to successfully consume the right cloud service and deployment models based on strategic, economic, and technology requirements. This book starts with the fundamentals of cloud computing and its architectural concepts. It then navigates through cloud service models (IaaS, PaaS, and SaaS), deployment models (public, private, community, and hybrid), and implementation options (Enterprise, MSP, and CSP). Each section exposes and discusses key considerations and challenges that organizations face during cloud migration. In later chapters, this book dives into how to leverage DevOps, cloud-native, and serverless architectures in your cloud environment. Discussions include industry best practices for scaling your cloud environment as well as details for managing essential cloud technology service components such as data storage, security controls, and disaster recovery. By the end of this book, you will be well versed in all the design considerations and operational trade-offs needed to adopt cloud services no matter which cloud service provider you choose.

About the authors:

Kevin L. Jackson is a globally recognized cloud computing expert, technology thought leader, and CEO/founder of GovCloud Network, LLC. Mr. Jackson’s commercial experience includes serving as a Vice President at J.P. Morgan Chase and a Worldwide Sales Executive at IBM. He has deployed mission applications to the US Intelligence Community cloud computing environment (IC ITE), and he has authored and published several cloud computing courses and books. He is a Certified Information Systems Security Professional (CISSP) and Certified Cloud Security Professional (CCSP).

Scott Goessling is the COO/CTO of Burstorm, and he helped create the world’s first automated cloud solution design platform. He has lived and worked in the Philippines, Japan, India, Mexico, France, and the US. An expert in many technologies, Scott has also been part of several successful start-ups, including a network hardware innovator that was acquired for over $8B. Scott's perspectives combine many real-world experiences.






Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



read more

Artificial Intelligence and the Project Manager


Organizations use teams to create wealth, market share, customer service, competitive advantage, and organizational success. Effective teams accomplish their assigned end goals by engaging in collaboration as a joint learning activity. Enhanced effectiveness is why collaborative tools are so critical to the project manager, and 7 out of 10 IT professionals see collaboration as essential to their organization.

For an information worker operating within a modern team environment, finding information is relatively easy. Any team member can "google" just about anything. High-performance teams, however, know how to work together to brainstorm and collaborate on discovering the right questions. That concept frames the future of collaboration and the project manager's role. The most effective project managers use artificial intelligence (AI) to apply computational approaches to the collaborative social experience. In layman's terms, that means using AI to discover the right questions. Research has shown this approach to be a more robust way of helping humans solve increasingly complex business problems.

As AI and collaboration technologies enhance and spread intelligence equally to every worker, machine learning technologies provide just-in-time custom learning based on team needs and organizational goals. Collaboration technology should, therefore, also help ease the challenge of connecting physically remote teams to each other. This critical function allows more interaction, more collective learning, more collaboration, and more team success. By embracing this new remote collaboration paradigm, project managers can:
  • Identify and engage critical talent independent of their location. This capability improves the manager’s ability to bring complementary skills into a collaborative environment with the broader team;  
  • Encourage and build healthy relationships with remote team members. Strong relationships are the heart of effective collaboration and leadership;
  • Present and communicate a guiding vision to the team. Providing clarity of purpose enhances collaboration;
  • Work with local and remote team members to jointly prepare a clear mission objective and define group rules of engagement;
  • Connect the project with higher level organizational objectives;
  • Create an atmosphere of safety, trust, and respect that, in turn, encourages multiple perspectives, diverse viewpoints, and creativity;
  • Make everyone’s ideas and suggestions visible and tangible by building prototypes, or drawing diagrams;
  • Provide an easy-to-use infrastructure that enables learning, communication, and collaboration;
  • Remove barriers to high performance by nurturing individual brilliance;
  • Coach for improved teamwork, emotional intelligence, and navigating difficult conversations;
  • Jointly celebrate team accomplishments; and
  • Capture best practices and things that should be avoided.


These are all reasons why Cisco launched Webex Teams. This collaborative platform uses machine learning to present an intelligent, human-like conversational interface on any application or device. With this capability, project managers can eliminate the friction usually associated with remote team member communications. The solution also embeds Webex Assistant, an AI-enabled service for managing directory and scheduling information, designed to assist, participate, and take action in support of the project manager. Webex Assistant leverages the powerful Webex graph to access better information faster, essentially injecting artificial intelligence into every team interaction.



This post is brought to you by Cisco and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Cisco. 






Building A Collaborative Team



Recently, Harvard Business Review cited some insightful research into team behavior at 15 multinational companies. It found that although these teams tended to be large, virtual, diverse, and composed of highly educated specialists, those same four characteristics made it hard for them to accomplish their goals. Members of complex teams were, absent other influences, less likely to share knowledge freely, learn from one another, shift workloads to break up bottlenecks, help one another complete jobs on time, or share resources. In other words, less likely to collaborate. The study also looked at teams that exhibited high levels of collaborative behavior; the difference turned out to be the quality of team leadership.

The eight factors that led to such leadership success were:
  1. Making highly visible investments in facilities that demonstrate their commitment to collaboration.
  2. Demonstrating leadership that models collaborative behavior.
  3. Mentoring and coaching, especially informally, in ways that help people build networks across corporate boundaries.
  4. Ensuring that collaboration skills have been taught to the team.
  5. Building and supporting a strong sense of community.
  6. Assigning team leaders that are both task- and relationship-oriented.
  7. Building on heritage relationships by putting at least a few people who know one another on the team.
  8. Sharply defining team roles and individually assigned tasks.  

This observation means project managers must set an environment that nurtures the exploration of open-ended thought and interactive collaboration. To accomplish this, team interactions cannot be just a series of point-in-time activities. The traditional team meeting must be replaced with continuous interaction and relationship building. To directly address this need, Cisco created the Emerge Engineering Team and TeamTV.


The Emerge Team works to create innovative technology that accelerates the future of work. Since collaboration will be so essential to success, they created TeamTV as a means of exploring the future of collaboration. This next-generation enterprise video collaborative platform integrates with and leverages the WebEx Teams digital collaboration suite. By creating a visually immersive and continuously interactive environment, they’ve discovered the immense value of having a space to interact daily with global teammates as if they were all in one office.
In addition to having a webcam filming the participants, TeamTV provides other useful collaboration tools including:
  • The “team mode” version of TeamTV with all members on-screen;
  • A “popcorn mode” where all members can watch an event or something communally across distances;
  • TeamTV channel ticker, where team-relevant information is available across the bottom of the screen; and
  • A virtual assistant bot with facial recognition technology capable of recognizing team members and serving up relevant email and instant messages. 

Building collaboration across an enterprise is not a quick job. It requires a combination of long-term relationship building and trust, a culture in which senior leaders openly exhibit cooperation, and smart near-term decisions on team formation. Legacy practices that may work well with simple, co-located teams are likely to fail as teams grow more complex. Although most factors that impede collaboration today have always been there, the modern teams needed to solve global business challenges require much more diversity, long-distance cooperation, and remote expertise. Project managers would, therefore, do well to update their approach to today's business challenges by addressing the eight factors listed above.


This post is brought to you by Cisco and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Cisco. 




Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)



read more

Welcome the New Project Manager!


According to CIO.com, the six traits of highly effective project managers are:
  1. Be a strategic business partner. Offering higher-level strategic leadership skills, not just technical management skills, provides significant advantages for organizations of all sizes.
  2. Encourage and recognize valuable contributions because a project leader’s effectiveness is strongly impacted by the contributions of others on his or her team.
  3. Respect and motivate stakeholders using an ability to communicate with and influence a variety of stakeholders. You must demonstrate respect for team members, stakeholders, and sponsors at all times if you are to receive their respect in turn.
  4. Be fully vested in success. Believe in the work you are doing and be completely involved in all professional aspects of the project, its activities, and its people.
  5. Stress integrity and accountability. Being accountable for your decisions and actions is vital, and sends a strong message to the rest of the team.
  6. Be able to work in the gray, because this is what truly sets a project manager apart. This is a must-have skill, since the majority of projects, regardless of type, industry, size, or complexity, will have gray areas that need to be navigated at some point.


A vital component of all of these traits is an ability to communicate both up the chain to superiors and down the chain to your team. In short, successful project management is about successful teamwork. Teamwork starts with the project manager recognizing that "Job #1" is knowing the people and blending their styles. This task can be very challenging given broad societal demographics and cultural variations. Looking just at the different generations that may exist on a team, work ethic and values across multiple generations must be addressed from the very beginning of a project.

Figure 1- Workplace Characteristic Comparative

For a project manager, this challenge often manifests itself through inordinate amounts of time spent on administrative tasks and poor or unproductive meetings.  These symptoms may also lead to the perception of failure or professional stagnation within the team.


Graphic Courtesy Instapage
To avoid this trap, managers should focus on team enablement that also respects personal differences and goals. This path values:
  • Creativity through the use of office spaces optimized for focusing, creating, and collaborating;
  • Productivity through secure, reliable access to essential tools and information, regardless of location or device; and
  • Satisfaction through the recognition and celebration of different goals and value frameworks

Good project managers can also discover and create business value by eliminating the need for physical proximity while simultaneously embracing the importance of human connection. Tools like WebEx Teams accomplish this by taking the pain out of both physical and virtual meetings through the use of intuitive voice interaction and collaborative features no matter where your team members may be. These capabilities also make it easier for distributed teams to exchange ideas and collaborate through shared digital whiteboards and chat. This approach addresses modern workers’ ability to work from wherever they can contribute the most value.

By automating mundane meeting components and optimizing the mobile experience of remote team members, the exceptional project manager reinvents project management by taking advantage of today's advanced communication channels. This will, in turn, create unprecedented value for the team and the entire organization.


This post is brought to you by Cisco and IDG. The views and opinions expressed herein are those of the author and do not necessarily represent the views and opinions of Cisco.




Cloud Migration Part One: An Overview




Business is all about efficiency and effectiveness. In today's world, however, those twin goals almost always lead to cloud migration. This anecdotal observation is supported by Gartner, which sees worldwide public cloud service revenue jumping to over $300B by 2021.


Independent research from MarketsandMarkets echoes this expectation in its cloud migration services forecast, which sees this market subset growing from $3.17B in 2017 to $9.47B by 2022, at a Compound Annual Growth Rate (CAGR) of 24.5%. With migration being such a high-priority activity, many organizations are looking for the most efficient and effective cloud migration strategy.
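As a quick arithmetic check, the compound annual growth rate implied by the forecast's two endpoints can be reproduced in a few lines of Python:

```python
# Cloud migration services forecast: $3.17B (2017) growing to $9.47B (2022),
# i.e. over a five-year span.
start, end, years = 3.17, 9.47, 5

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # rounds to the cited 24.5%
```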
In addressing this question from thousands of customers worldwide, IBM Global Technology Services (GTS) has migrated applications in just about every industry. These migrations have targeted global service providers like AWS and Azure, as well as regional and local ones. The best practices GTS has honed through these experiences include:
  • Understanding and classifying business-critical data;
  • Executing an efficient process for screening and selecting applications for cloud migration;
  • Following a methodology for discovering the most effective strategy for each application migration; and
  • Selecting the most cost-effective and industry-aligned cloud service provider(s).
Experience has also shown that businesses are in different stages of their “Journey to the Cloud.”  These initial stages often include:
  • Planning and designing common foundational infrastructure services;
  • Deploying pattern- and template-based automation for public clouds;
  • Migrating workloads to the most appropriate cloud through a standardized, repeatable, tool-driven framework;
  • Monitoring and managing workloads using standardized tools and processes aligned to cloud platforms; and
  • Governing, tracking, managing, and optimizing cloud usage and spend.


These best practices and initial stages are common to the most successful cloud migration projects.

This series, presented in four weekly installments, lays out the details of how leading organizations have transformed themselves through cloud migration and how GTS has embedded industry best practices into its hybrid cloud service delivery model. "Part Two: Classifying Organizational Data" covers the identification of key business processes and their associated data types. That article also outlines the importance of identifying process data owners and the required security controls for each data type. "Part Three: Application Screening" looks at determining the most appropriate target deployment environment, each application's business benefit, key performance indicator options, and target return on investment. That segment also shows how to select the most appropriate migration strategy for each application. "Part Four: Executing the Migration" presents experience-informed guidance on how to effectively and efficiently execute a cloud application migration strategy, including selecting the most appropriate cloud service provider and technology services, reviewing and verifying available data security controls, and suggested steps for SLA negotiations. It also addresses business/mission model alignment, organizational change management, and migration project planning.

The series also presents the three most common cloud adoption paths for business, namely:
  • Innovation: Building cloud-native applications using the DevOps model;
  • Agility: Modernizing and migrating legacy applications and infrastructure to a native cloud model; and
  • Stability: Managing workloads and infrastructure in clouds and on premises




This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.




Skills Gulf Is Cloud’s Biggest Peril



Ian Moyse, Cloud Industry Thought Leader & Cloud Sales Director at Natterbox

Cloud is undoubtedly the driver of the new tech economy. Whether it is SaaS, PaaS, IaaS, public, private or hybrid cloud, e-commerce, IoT (Internet of Things) or Big Data, each of these is at its back supported by cloud technologies. Technology is improving and falling in cost at such a speed that it is no longer the preserve of only the large firms; it can empower any organisation, from small to large, from startup to established, to revolutionise its customer offering and to elect to disrupt or be disrupted.

With this speed of technology change comes a need for those supporting the business to adapt quickly and adopt new methodologies, knowledge, and skills that empower a company to take advantage of these new possibilities: switching from Waterfall to Agile, from networking to virtualisation to Docker, from hosting to IaaS and PaaS, and from C through Java into Swift, Hack, and Dart.

A wide range of firms still rely on traditional IT infrastructure (locally deployed server applications and databases), despite the increasingly rapid rate at which companies are migrating to cloud-based systems. Digital transformation seems to be on the agenda of most enterprise organisations, bandied about as if it were a switch to flick and a fast thing to undertake. The reality is far different: accepting the change required, and having the skills at hand to achieve it, are barriers impeding a growing number of companies.

Change is hard to accept at the best of times, particularly if you have long been the subject expert on a vendor or technology, only to find it being disrupted at pace and your worth diminishing, either in your own firm or in the general market. Being prepared to let go of many years of acquired skills and accept the need to restart and learn a whole range of new skills is hard, and many will resist, defending the status quo and hindering both business change and their own personal progress.

For companies moving applications and services to cloud platforms, migration challenges are one of the top constraints affecting IT: there are no automated switchovers on offer, and customised internal or external migrations vary from mild to heavy development changes. For example, migrating a home-grown or proprietary application requires new application development and testing. However, if taken on with commitment, the move can provide faster, more agile application development through DevOps and utilisation of enhanced cloud features and APIs, leading to improved application lifecycle management.

However, with this comes the need for professionals with the skills and knowledge of the chosen cloud platform to deliver the migration project in a structured and effective manner. Cloud continues to advance quickly, and even those who moved to the cloud a decade ago are finding they must keep learning new skills. Consider the usage surge in containers, for which a recent Robin Systems survey found that 81% of organisations are planning to increase their use.

Big Data has introduced new approaches, tools, and skills, and with expected growth of 60% per annum (IDC) it cannot be ignored. With the increased volume of data and continual crunching demands, databases are going to live in the cloud and demand new platforms and approaches.

With the plethora of changes from newly coded applications and architectures holding vast data stores in the cloud, greater cyber security expertise is an essential requirement. With the human element recognised as the most vulnerable area of security, the introduction of so many new skill areas will increase the risk of new security exposures. Software developers in the cloud must understand, and treat with extreme caution, their increased responsibility for security assurance and compliance. With the heightened awareness of security threats and breaches, and the introduction of the new GDPR (General Data Protection Regulation) in Europe with far heftier and more damaging fines, getting this wrong is now going to be catastrophic. It is estimated that less than 5% of cloud applications are ready for GDPR, pointing to a vast breadth of enhancement needed in a very short period.

The perfect storm circling this comes from the expectation that 30-40% of the corporate workforce will retire in the next decade, combined with a reduction in those studying relevant ICT subjects and in education's capacity to provide effective training in the required areas. We have a rapidly increasing need for new technology skills (both to support new technologies and to support digital transformation from old to new), a large percentage of those with technology backgrounds retiring rather than reskilling, and a declining capacity in education to attract and train to the level required.

Businesses now face pressures like never before: markets that shift more quickly; more fickle and demanding customers; users influenced by, or themselves among, the millennials (who expect faster, quicker, easier, and cheaper from the world they have grown up in); and disruption all around from newborn firms that can attack with the gusto of all the new-world tech and methods, with no legacies to unchain themselves from.

Companies MUST have access to the skills required to employ the full scope of new tech on offer to their business advantage: to move old, creaking applications to newer form factors and to deliver a better quality of service and user experience, meeting the demands of any-device, any-place, any-time working for both their employees and their increasingly new breed of customer.
Unless the issue is addressed quickly, supply and demand for these new skills will simultaneously implode and explode, creating a chasm between need and affordability as those who can do become scarce and valuable commodities, available only to the few who can afford them.


You can follow Ian at www.ianmoyse.cloud

( This content is being syndicated through multiple channels. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners or any other corporation or organization.)






Cloud Migration Best Practice: Classifying Your Data



In my first post of this series, "Cloud Migration Part One: An Overview," I provided a high-level summary of how enterprises should migrate applications to the cloud. In this installment, the focus is on enterprise data and why your organization may need to review and reclassify its data before moving anything to the cloud.

Cloud computing has done more than change the way enterprises consume information technology. It is also changing how organizations need to protect their data. Some may see this as an "unintended consequence," but the headlong rush to save money by migrating applications to the cloud has simultaneously uncovered long-hidden application security issues. This revelation is mostly due to the wide adoption of "lift and shift" as a cloud migration strategy. Using this option typically precludes any modifications of the migrating application. It can also result in the elimination of essential data security controls and lead to grave data breaches.

While there is no doubt about the good intentions of all involved, enterprise applications were traditionally developed for deployment into the organization's own IT infrastructure. This implicit assumption also included the use of infrastructure-based security controls to protect organizational data. These generally accepted industry practices were coupled with a cultural propensity to err on the side of caution by protecting most data at generally high levels. During an implementation, organizations typically used a two-level (sensitive and non-sensitive) or at most a four-level data classification model.

Today, the cloud has quickly become the preferred deployment environment for enterprise applications. This shift to using "other people's infrastructure" has brought with it tremendous variability in the nature and quality of infrastructure-based data security controls. It is also forcing companies to shift away from infrastructure-centric security toward data-centric information security models. Expanding international electronic commerce, ever-tightening national data sovereignty laws, and regional data protection and privacy regulations (e.g., GDPR) have also combined to make many data classification schemas generally untenable. The Cloud Security Alliance and the International Information System Security Certification Consortium, (ISC)², in fact, both suggest that corporate data may need to be classified across at least eight categories, namely:
  • Data type (format, structure)
  • Jurisdiction and other legal constraints
  • Context
  • Ownership
  • Contractual or business constraints
  • Trust levels and source of origin
  • Value, sensitivity, and criticality
  • The obligation for retention and preservation

Moving to classify data at this level means that one of the most important initial steps of any cloud computing migration must be a review and possible reclassification of all organizational data. By bypassing this step, newly migrated applications simply become data breaches in waiting. At a minimum, an enterprise should:
  • Document all key business processes destined for cloud migration;
  • Identify all data types associated with each migrating business process;
  • Explicitly assign the role of “Process Data Owner” to appropriate individuals; and
  • Assign each “Process Data Owner” the task of setting and documenting the minimum required security controls for each data type.
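The four steps above can be sketched as a minimal data-classification registry. This is an illustrative sketch only; the process name, data type, owner role, and control labels below are hypothetical examples, not part of any GTS methodology:

```python
from dataclasses import dataclass, field

@dataclass
class DataType:
    name: str
    owner: str                                  # the assigned "Process Data Owner"
    min_controls: list = field(default_factory=list)

@dataclass
class BusinessProcess:
    name: str
    data_types: list = field(default_factory=list)

# Step 1: document each key business process destined for cloud migration
payroll = BusinessProcess("payroll")

# Steps 2-4: identify its data types, assign a Process Data Owner, and
# document the minimum required security controls for each data type
payroll.data_types.append(DataType(
    name="employee PII",
    owner="hr_director",
    min_controls=["encryption at rest", "role-based access", "EU data residency"],
))

# A simple pre-migration audit: every data type needs an owner and controls
for process in [payroll]:
    for dt in process.data_types:
        assert dt.owner and dt.min_controls, f"{dt.name} is not migration-ready"
print("All data types classified and owned")
```

The point of the structure is that an application cannot be scheduled for migration until every data type it touches has a named owner and a documented control baseline.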

After completing these steps, companies should review and update their IT governance process to reflect any required expansion of their corporate data classification model. These steps are also aligned with the ISO/IEC 27034-1 framework for implementing cloud application security. This standard explicitly takes a process approach to specifying, designing, developing, testing, implementing and maintaining security functions and controls in application systems. It defines application security not as the state of security of an application system (the result of the process) but as "a process an organization can perform for applying controls and measurements to its applications in order to manage the risk of using them."

In Part 3 of this series, I will discuss application screening and related industry best practices and include:
  • Determining the most appropriate target application deployment environment;
  • Determining each application's business value, key performance indicators and target return on investment;
  • Determining each application's migration readiness; and
  • Deciding the appropriate application migration strategy.



This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.




Cloud Migration Best Practice Part 3: Application Portfolio Analysis


In part three of this series on cloud migration best practice, I will focus on migrating the application itself. If you haven’t had the opportunity to read our recommendations from part two, “Classifying Your Data,” check it out — those activities are crucial to the decisions addressed in this installment.
While many organizations are aggressively moving applications to the cloud, they often set the criteria for a cloud service provider (CSP) without the necessary technical and operational due diligence. This widely observed error typically leads to migration delays, failures to attain expected business goals and general disillusionment with cloud computing. However, avoiding this disappointing experience is relatively easy. All it takes is executing an application portfolio screening process that takes a look at:
  • The most appropriate CSP target deployment environment.
  • Each application’s specific business benefits, key performance metrics and target return on investment.
  • Each application’s readiness for cloud migration.

Build a foundation

The first step in the screening process is determining the most appropriate cloud deployment environment. This practice establishes an operational foundation for subsequent service provider selections by using relevant stakeholder goals and organizational constraints to guide service model, deployment model and implementation option strategy decisions. Enterprises transforming their information technology should evaluate all available options by analyzing an app transition across three specific high-level domains and sub-domains, such as:
  • IT implementation model
    • Traditional
    • Managed service provider
    • Cloud service provider
  • Technology service model
    • Infrastructure-as-a-Service
    • Platform-as-a-Service
    • Software-as-a-Service
  • IT infrastructure deployment model
    • Private
    • Hybrid
    • Community
    • Public

Cloud computing domains

These domains and sub-domains outline a structured decision process for placing the right application workload into the most appropriate IT environment. This is not a static decision: as business goals, technology options and economic models change, the relative value of these combinations to your organization may change as well. Moreover, single-point solutions are rarely sufficient to meet all enterprise needs. By the end of the cloud migration journey, an organization may require a mix of two, three or as many as 10 variations. This infrastructure variation is why an organizational hybrid IT adoption strategy is crucial. Figure 1 is an example application decision matrix suitable for this step.
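One way to picture the decision space is to enumerate the cells formed by the three domains listed above. This is a rough sketch, not the matrix from Figure 1, and the filtering constraint shown is a hypothetical example of how a workload's requirements narrow the field:

```python
from itertools import product

# The three high-level domains and their sub-domains from the text
implementation = ["Traditional", "Managed service provider", "Cloud service provider"]
service = ["IaaS", "PaaS", "SaaS"]
deployment = ["Private", "Hybrid", "Community", "Public"]

# Every candidate environment is one cell of the decision matrix
matrix = list(product(implementation, service, deployment))
print(len(matrix), "candidate environments")   # 3 * 3 * 4 = 36

# A workload's shortlist is a filter over the matrix; e.g., a hypothetical
# regulated application constrained to a CSP with no public deployment:
shortlist = [(i, s, d) for i, s, d in matrix
             if i == "Cloud service provider" and d != "Public"]
print(len(shortlist), "options after constraints")  # 1 * 3 * 3 = 9
```

Because different workloads survive different filters, the enterprise naturally ends up with the mix of environments the text describes, which is exactly why a hybrid IT strategy matters.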


With target deployment environments selected, companies should evaluate each candidate application regarding their business benefits and ability to leverage cloud computing’s technical and operational advantages. Using a simple qualitative scale, stakeholders should agree on:
  • Key performance indicators relevant to business or mission owner goals.
  • Expected or target financial return on investment.
  • Each application’s ability to use cloud infrastructure scalability to:
    • Optimize time to deliver products or services.
    • Reduce time from business decision to execution.
    • Optimize cost associated with IT resource capacity.
    • Increase speed of cost reduction.
  • Possible application performance improvements that may include:
    • More predictable deployment and operational costs.
    • Improved resource utilization.
    • Quantifiable service level metrics.
  • Value delivered by improved user availability that may be indicated by:
    • Improved customer experience.
    • Implementation of intelligent automation.
    • Improved revenue margin.
    • Enhanced market disruption.
  • Enhancing application reliability by:
    • Establishing enforceable service level agreements.
    • Increasing revenue efficiencies.
    • Optimizing profit margin.

Determine KPIs

Figure 2 provides a baseline KPI and ROI model that can be easily modified to effectively manage a qualitative assessment across time, cost, quality and revenue margin criteria.
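A qualitative assessment of this kind can be tracked with a simple scoring sheet. The sketch below is an illustrative assumption, not the model from Figure 2; the equal weights, 1-5 scores, and application names are hypothetical:

```python
# Hypothetical 1-5 stakeholder scores per application across the four
# criteria named in the text: time, cost, quality, and revenue margin.
WEIGHTS = {"time": 0.25, "cost": 0.25, "quality": 0.25, "revenue_margin": 0.25}

def weighted_score(scores: dict) -> float:
    """Collapse stakeholder ratings into one comparable number."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

apps = {
    "order-entry":  {"time": 4, "cost": 3, "quality": 5, "revenue_margin": 4},
    "legacy-batch": {"time": 2, "cost": 4, "quality": 2, "revenue_margin": 1},
}

# Rank migration candidates by expected benefit, highest first
for name, scores in sorted(apps.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.2f}")
```

Adjusting the weights is how stakeholders encode whether, say, time to market matters more than cost for a given portfolio.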


The final step of this application screening process is determining each application’s readiness to actually migrate to the cloud. This step should qualitatively assess the alignment of an application’s cloud migration decision to the organization’s:
  • Risk appetite and risk mitigation options.
  • Ability to implement, manage and monitor data security controls.
  • Expected migration timelines.
  • Expected ROI realization timelines.
  • Current culture and necessary organizational change management resources.
Performing an application portfolio screening process can be useful in aligning cloud application migration projects with organizational business, technical, security and operational goals. It can also help avoid application migration delays, missed business goals and team disillusionment by building and monitoring stakeholder consensus.
In the next and final installment of this series, data classification and application screening are linked to cloud service provider selection and application migration execution.


This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.



Cloud Musings
( Thank you. If you enjoyed this article, get free updates by email or RSS - © Copyright Kevin L. Jackson 2016-2018)




Cloud migration best practice Part 4: Executing the migration


This series has stepped through cloud migration best practices. After providing an overview, we discussed data classification, application screening and KPI determination.
With all of that completed, it’s now time to select the right cloud service provider (CSP) and finally execute the migration. Cloud provider selection is an area that many enterprises ignore. Executives looking to take advantage of the real business value that the cloud delivers often view providers simply as commodity technology providers. With this mindset, decision-makers usually pick the most familiar name. But this strategy is little more than throwing the dice.

A Smarter Way to Select a Provider

Cloud service provider selection requires a well-developed hybrid IT strategy, an unbiased application portfolio review and appropriate due diligence in evaluating all credible cloud service providers. When discussing this linkage, I leverage the Digital Transformation Layered Triangle as a visualization tool. After agreeing on an appropriate high-level hybrid IT strategy, a core tenet of digital transformation, candidate CSPs’ capabilities must be compared based on their:
  • Availability of technology services that align with the business/mission model.
  • Availability of data security controls that address legal, regulatory and data sovereignty limitations.
  • Compatibility of CSP sales process with enterprise acquisition process.
  • Cost forecast alignment with budgetary expectations.
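One lightweight way to make this four-criteria comparison auditable is a weighted scorecard. The Python sketch below is illustrative only: the provider names, the weights and the 1-to-5 scores are invented, and a real evaluation would derive the weights from the organization’s own priorities.

```python
# Illustrative weights for the four CSP comparison criteria above.
CRITERIA_WEIGHTS = {
    "service_alignment": 0.35,    # technology services fit the mission model
    "security_controls": 0.30,    # legal, regulatory, sovereignty coverage
    "sales_compatibility": 0.15,  # CSP sales vs. enterprise acquisition process
    "cost_alignment": 0.20,       # cost forecast fits budget expectations
}

def weighted_score(scores: dict[str, int]) -> float:
    """Combine per-criterion scores (1-5) using the agreed weights."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

candidates = {
    "provider_a": {"service_alignment": 4, "security_controls": 5,
                   "sales_compatibility": 3, "cost_alignment": 3},
    "provider_b": {"service_alignment": 5, "security_controls": 3,
                   "sales_compatibility": 4, "cost_alignment": 4},
}
best = max(candidates, key=lambda name: weighted_score(candidates[name]))
print(best)
```

The point of writing the weights down is that the selection debate shifts from brand familiarity to an explicit, reviewable set of priorities.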

Understanding Cloud Service Agreements

Comparing cloud service agreements from the remaining viable service providers is next. These agreements typically have three components:
  • Customer Agreement: Describes the overall relationship between the customer and provider. Because service management spans the processes and procedures used by the cloud provider, it’s crucial to define the roles, responsibilities and execution of those processes, and the customer agreement does this. This document may be called a “master agreement,” “terms of service” or simply “agreement.”
  • Acceptable Use Policy (AUP): Defines activities that the provider considers to be improper or outright illegal. There is considerable consistency across cloud providers in these documents. While specific details may vary, the scope and effect of these policies remain the same, and these provisions typically generate the least concerns or resistance.
  • Service-Level Agreement (SLA): Describes levels of service in terms of availability, serviceability or performance. The SLA specifies thresholds and the financial penalties associated with violations of those thresholds. Well-designed SLAs can avoid conflict and facilitate the resolution of an issue before it escalates into a dispute.
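To make the SLA component concrete, here is a hypothetical sketch of how a measured monthly availability figure might map to a service credit. The tier thresholds and credit percentages are invented for illustration; actual schedules vary widely by provider and must be read from the agreement itself.

```python
# Hypothetical SLA credit schedule: (minimum availability %, credit % of
# the monthly fee). Tiers are ordered from the threshold downward.
CREDIT_TIERS = [
    (99.9, 0),    # threshold met: no penalty
    (99.0, 10),   # below 99.9% but at least 99.0%: 10% credit
    (95.0, 25),   # below 99.0% but at least 95.0%: 25% credit
    (0.0, 100),   # below 95.0%: full credit
]

def service_credit(availability_pct: float, monthly_fee: float) -> float:
    """Return the financial credit owed for the measured availability."""
    for threshold, credit_pct in CREDIT_TIERS:
        if availability_pct >= threshold:
            return monthly_fee * credit_pct / 100
    return monthly_fee

print(service_credit(99.5, 10_000.0))
```

Modeling the schedule this way during evaluation makes it easy to compare what competing SLAs would actually pay out under realistic outage scenarios.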

Designing a CSA Evaluation

The CSA Evaluation must take into account all critical functional and nonfunctional organizational requirements and IT governance policies to ensure:
  • Mutual understanding of roles and responsibilities.
  • Compatibility with all enterprise business level policies.
  • Identifiable metrics for all critical performance objectives.
  • Agreement on a plan for meeting all data security and privacy requirements.
  • Identified service management points of contact for each critical technology service.
  • Agreement on service failure management process.
  • Agreement on disaster recovery plan process.
  • An approved hybrid IT governance process.
  • Agreement on a CSP exit process.
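This checklist is effectively a go/no-go gate: any single unresolved item is grounds to keep negotiating before signing. A minimal Python sketch of tracking it follows; the item names only mirror the list above, and the sample statuses are invented.

```python
# Hypothetical CSA evaluation tracker; True means the item is agreed.
csa_checklist = {
    "roles_and_responsibilities": True,
    "business_policy_compatibility": True,
    "performance_metrics_identified": True,
    "data_security_plan_agreed": False,   # still under negotiation
    "service_management_contacts": True,
    "failure_management_process": True,
    "disaster_recovery_process": True,
    "hybrid_it_governance": True,
    "csp_exit_process": True,
}

def open_items(checklist: dict[str, bool]) -> list[str]:
    """List every requirement not yet agreed with the CSP."""
    return [item for item, agreed in checklist.items() if not agreed]

print(open_items(csa_checklist))
```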
This due diligence process maximizes the success probability of any cloud migration program. With CSP selection complete, the organization can now tackle the hard work of executing the actual migration. This task should include:
  • Planning and executing an organizational change management plan.
  • Verifying and clarifying all key stakeholder roles.
  • Detailed project planning and execution.
  • Establishing internal processes for monitoring and periodically reporting the status of all key performance indicators.
  • Establishing an internal cloud migration status feedback and response process.
The most important lesson learned across all industries is that cloud migration is not a project for the IT team alone. This is an enterprise-wide endeavor that requires executive leadership and focused change management efforts across multiple internal domains.


This post was brought to you by IBM Global Technology Services. For more content like this, visit ITBizAdvisor.








6 Reasons Why Unity is the Best Game Engine Out There!




What makes Unity the most popular game engine? 

Find out below.





The game development space is overflowing with game engines offering capabilities for a diverse range of requirements. There’s the Unreal Engine for high-end sophisticated games, the advanced and feature-packed Godot, the mighty CryEngine, which comes with full engine source code and zero royalties, and the Marmalade SDK, which promises maximum exposure for the games you develop.
However, with over 700 million gamers worldwide, Unity dominates the game engine market. Offering immersive graphics and powerful features, Unity is a platform for artists, designers and developers to collaborate and create stunning 2D and 3D gameplay sequences. Fair enough, I hear you say, but why is Unity so popular? Let’s find out.

1.      Unity is free for all


Unity’s motto is to democratize game development; in fact, its Personal edition is free to use and download. It is also full of features so that independent developers won’t have to miss out on any functionality to develop immersive games. What’s more, you can create a full Unity game for free in the Personal edition and sell it. You don’t have to spend a dime on software unless you are making more than $100,000 per year by selling games made on Unity, in which case you’ll have to upgrade to the Plus edition (for a mere $35/month, as of May 2018).

2.      Unity offers stunning realism


The latest tools of Unity are setting industry standards in realism. Its Physically Based Rendering feature along with global illumination and real-time compositing allow incredible detailing, yielding awesome and realistic graphics. In fact, the realism capabilities are so powerful that Unity can be used for tasks other than game development, such as creating interactive product catalogs and realistic visualization walk-throughs.

3.      Programming in Unity is trouble-free


Unity supports C# and UnityScript (a specialized variation of JavaScript made for Unity), so anyone with a background in either language can easily find their way around. An added bonus is that if programming is not your cup of tea, Unity has a plethora of visual scripting tools that enable you to create your own scripts and apply them to any game object. Plus, you can also check out Unity’s vast library of scripts for a multitude of gameplay mechanics that can make your job easier.

4.      Unity is platform-agnostic


It’s so important for developers to get their game running on multiple platforms. Thankfully, Unity supports a wide range of platforms, including iOS, Android, PC and consoles, and the number of supported platforms is constantly increasing. The best part, however, is that you’ll need to make little to no changes to your workflow, as Unity is quite flexible in this respect. All it takes is clicking a few buttons, and your game is ready to be played across multiple platforms. Plus, Unity supports Oculus Rift, HTC Vive, Microsoft HoloLens and several other VR systems, so you don’t have to limit your creativity.


5.      Unity’s very own Asset Store


Unity’s expansive Asset Store is its biggest asset. AI systems, 3D models, animations, complete projects, shaders, or audio—you name it and Unity has it! Want to get your dream game out of your head and into the hands of millions of gamers out there? No worries; just browse through the store and get access to myriad options for your own project. What’s more, you could also sell the assets you’ve created for a handsome price.

6.      Unity offers an entire suite of services


Unity Services is a new set of features that makes building, sharing and selling games a lot more interesting and fun. Tools like Unity Cloud and Unity Collaborate allow backing up the entire game and building alternate versions without affecting the system. They also give you the freedom to jump back to an earlier version if things don’t seem to be going your way. Unity Services also has tools for analytics, ads, performance, multiplayer and more, so you can get an in-depth look at how your game is performing and where users might encounter issues.




Unity 2017 Game Development Essentials


If you are an animator or a designer who’s taking baby steps in the world of game development, Unity is the most obvious choice of game engine. And when it comes to learning the essentials of game development on Unity, we have just the right solution for you. Unity 2017 Game Development Essentials is an end-to-end exercise in game development, covering environments, physics, sound, particles and other key concepts to get you up and running. You’ll learn scripting games using C#, build your very first 2D and 3D games, create fully functional menus and HUDs, and much more.

The book is written by Tommaso Lintrami, an expert game developer who’s been building games since the age of 9. Tommaso is a man of many talents: he is a designer, developer, composer and writer. He has been working with Unity for over 9 years now, having developed a number of games on different platforms. Tommaso brings this expertise to the book, taking you through game development in the most fun and interactive way imaginable.


So what are you waiting for? Check out Unity 2017 Game Development Essentials and get ready to make a mark in the gaming industry!



( This sponsored post is part of a series designed to highlight recently published Packt books about leading technologies and software applications. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners.)









What Is The Most Important Part of Architecture?


I always find it interesting to hear how people view architecture. A lot of people think it’s just about the design aspect, where you get to put pen to paper and create a solution. Even more people think it’s just about putting together different technical components in a server room. And these people have interesting opinions on the importance of those activities to architecture. But, at the end of the day, the MOST important part of architecture is one thing and one thing only: requirements.
Requirements
Without requirements, you have no idea if you are actually designing a solution that matters. Without requirements, you have no way of knowing if those technical components that you are including on the server rack will actually be used. In short, you are only spending money without knowing if it’s worthwhile.
We all know of solutions that have been put into place and yet no one uses them. Why is that? Well, for one very simple reason: no one bothered to check with the stakeholders what exactly they wanted. What’s the point of spending money on all those components if no one is going to use what you put together? That’s why you gather requirements: so that you don’t waste money and actually end up with a usable solution. Not a solution that merely works, but one that is actually used.
When you gather requirements, you don’t just sit down at a desk and dream up the needs you think the solution should meet. That’s just navel gazing, and it’s no better than designing or building without requirements. Requirement gathering is all about talking to stakeholders to understand what they want and need. You gather those requirements, and only then do you start looking for a design approach.
Now, when you say stakeholders, what do you mean? Well, remember that stakeholders include everyone that has a stake in how a solution works. So, it’s not just the end users that are interfacing with the solution or just the business owner who is providing the money. It’s also the operations folks that are supporting the solution. Remember, if the operations team can’t properly support a solution or would need to spend extra money to support it, then you have a more expensive solution than you may have wanted in the first place. So make sure you talk to the operations people about what they need in a solution as well.
Once you’ve identified the stakeholders you want to talk to, you can start scheduling meetings to gather those requirements. How do you do that? I would highly recommend that you don’t talk to them all in one room at the same time. There is always the proverbial ‘wallflower’ who sits in the back and doesn’t say anything but who will have a very valid point about a requirement. You will have domineering personalities who will want to be the focus of the meeting. And there will be people who lose focus during the meeting and do other things.
Instead, schedule one-on-one sessions with every stakeholder. A good requirement gathering session averages 45 minutes per person, so schedule an hour for each. Trust me; it may seem like you are spending a lot of time on this, but it will save you a lot of money over the longer term if you do things correctly from the start.
Now, you’ve scheduled your session with your stakeholders. How do you conduct the meeting? Well, first off, treat it like you would an audit. You don’t go in with preconceived ideas of what the solution is. What you do is ask your stakeholder very broad, open-ended questions and let them talk. Don’t show any indication on how you feel about a particular requirement that they bring up, just note it down. I would highly recommend that you have a spreadsheet for all the different requirements areas (for example, availability, security, maintenance, usability, etc.) so that you don’t forget to ask about them. And then just let the stakeholder talk and go in whatever direction they want to go in.
Once you’ve interviewed all the stakeholders, consolidate all the requirements and replay them back to the stakeholders as a whole. This is the time you’ll want to have all the stakeholders in one room. You want them to see what the requirements are and agree to them before moving on. And you are bound to have conflicting requirements that will need to be hashed out between the stakeholders until they reach mutual agreement.
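A spreadsheet works fine for this consolidation step, but the same bookkeeping can be sketched in a few lines of code. In this illustrative Python example (the stakeholders, requirement areas and entries are all invented), interview notes are merged into a single register, and any area where stakeholders disagree is flagged for the group replay session:

```python
from collections import defaultdict

# Hypothetical one-on-one interview notes, keyed by stakeholder,
# then by requirement area (availability, usability, etc.).
interviews = {
    "business_owner": {"availability": "99.9% uptime",
                       "usability": "single sign-on"},
    "end_user":       {"usability": "single sign-on",
                       "performance": "sub-second search"},
    "operations":     {"availability": "99.5% uptime",
                       "maintenance": "monthly patch window"},
}

def consolidate(notes: dict[str, dict[str, str]]) -> dict[str, set[str]]:
    """Merge per-stakeholder notes into one register keyed by area."""
    register: dict[str, set[str]] = defaultdict(set)
    for stakeholder_notes in notes.values():
        for area, requirement in stakeholder_notes.items():
            register[area].add(requirement)
    return register

register = consolidate(interviews)
# Areas with more than one distinct requirement must be hashed out
# between the stakeholders before design starts.
conflicts = sorted(area for area, reqs in register.items() if len(reqs) > 1)
print(conflicts)
```

Here the business owner and operations disagree on availability, which is exactly the kind of conflict the replay session exists to resolve.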
Once the stakeholders have agreed to the requirements, you can now start going down the road of designing and building your solution. But ALWAYS refer back to the requirements at every phase. Don’t just gather the requirements and forget about them. Those requirements drive the success of the project, and the closer your end solution is to those requirements the more successful and used the project will be.
Oh, one more thing. There are always requirements that come up AFTER the gathering phase. If that happens, two things have to be kept in mind. First, it means that you didn’t do a good job at collecting the requirements in the first place and you need to figure out a way of improving your requirement gathering process. Second, accepting new requirements at this stage means going back and changing designs or builds, which costs time and money. Often, it’s better to just leave the new requirement for the next phase of the project rather than going back and reworking your design.
Requirements are the flesh and blood of a good solution, regardless of whether you are talking about a security, infrastructure, application or network solution. And if you do it properly, your requirements can help make you a very successful architect moving forward.

If you found this article interesting and want to learn more about architecture and cybersecurity, you can explore Hands-On Cybersecurity for Architects. The book follows a clear, concise, and straightforward approach to explain the underlying concepts and enable you to develop, manage and securely architect solutions for your infrastructure.  
( This sponsored post is part of a series designed to highlight recently published Packt books about leading technologies and software applications. The opinions expressed are solely those of the author and do not represent the views of GovCloud Network, GovCloud Network Partners.)







