Alfresco News

News, views, commentary on Alfresco



Use Cases for Content in Process Applications

This third BPM blog post in our guest series by Sandy Kemsley explores five primary use cases for combining content and process. Covering document-driven processes, case management, document lifecycle processes, exception handling in data-driven processes, and the classification of non-document content, Sandy explains why businesses should consider an integrated approach rather than keeping content and process siloed.

Although I’ve spent the last couple of decades focused on process, my roots are in that most basic overlap of digital content and process: the “imaging and workflow” systems of the 1980s and 1990s, where a paper document captured by a scanner would trigger a process that routed the document image to a person for manual processing. Process automation has become a lot more sophisticated since then, and is often driven purely by data rather than documents, but I still see a lot of applications where unstructured content is an essential part of a process.

Here are some of the main use cases for content in the context of a process:

  • Document-driven processes. This is the closest to those imaging and workflow systems of old, where the creation or capture of a document initiates a process, and the purpose of the process is to perform actions related to that document. In most cases, the document itself is unchangeable once created: think of it as the instructions for the process. An example is a loan application, where a customer fills out a form at the bank (or online), and that triggers an approval process inside the bank. The process may execute steps to capture the application data from paper, request more information, or escalate to a senior loans officer before the final steps of making a decision on the loan and issuing the funds. The original loan application is never changed, and there is typically a regulatory requirement to retain it for a specific period of time.
  • Case management. A cross between document-driven and data-driven, case management focuses on resolving cases rather than executing a single process, and gathers all content related to that resolution in an electronic “case folder”. The case folder and its contents, which include a record of the processes and decisions applied to complete the case, are managed according to corporate content retention policy.
  • Document lifecycle processes. Documents created inside an organization, rather than those submitted by customers, may go through an approval process, and may be continuously revised over their lifetime. The lifecycle process – create, review, approve, publish – exists purely for the purpose of creating the content. An example is a product manual, where a technical writer writes a document describing the product usage, which is then circulated to other writers and graphic artists to gather additional content, then to other parties for branding, legal review and final approval. The final version is published on the customer-facing website, and printed for packaging with the physical product. When a new version of the product comes out, the product manual goes through the same revision and approval cycle, ensuring that the content is accurate and published in a timely manner.
  • Support documentation for exceptions in data-driven processes. Arguably, most of the automated processes today are data-driven rather than document-driven or document lifecycle: straight-through processes such as financial transactions that don’t require human intervention. Although unstructured content is not required for most process instances, additional information may be required for exception handling. An example is a dental insurance claim: normally, the claim for a standard procedure can be submitted online, approved automatically and paid without human intervention, but if the claim exceeds a pre-set limit, it is escalated to a claims adjudicator for review. The adjudicator may request additional documentation from the dentist or patient, which is reviewed manually and attached to the claim process instance in order to justify the claim.
  • Classification and analysis processes for non-document content. The previous use cases primarily address unstructured content that we think of as “documents”, that is, the electronic version of a page of text. Unstructured data can be much more complex than that: blobs of semi-structured data such as IoT machine data or log files, photographs, audio and video. As this content is ingested into content management systems, processes that classify, recognize and perform analysis on the data may run immediately or be triggered at a future point.

Content and processes don’t always live on the same systems, in spite of the many use cases for their interaction. Consider that the goal of most content management is long-term access and retention management that may span years, while the goal of most process management is a more ephemeral set of tasks that last only the lifespan of a process instance. Although most business process management (BPM) systems provide rudimentary capabilities for storing content, most organizations prefer to store content that requires access management and retention management in their enterprise content management (ECM) system, then connect it to a BPM process instance when required. Many ECM systems have document lifecycle processes built in, and although they are not capable of handling the other use cases discussed above, they are sometimes pressed into service as a simple BPM system.

Where I am seeing an interesting hybrid of process and content is in low-code platforms that are being used by “citizen developers” and professional IT staff to create case management-style applications. These platforms contain content and process components – in fact, they are often built on top of existing BPM and ECM systems – plus analytics, decision management, event management, predictions, machine learning and user experience. The platforms provide the ability to rapidly assemble these building blocks into custom web applications, with the BPM, ECM and other services interacting seamlessly.

Over the years, I’ve learned two things about integrating process and content: first, almost every process application has some sort of content associated with it; and second, most process-centric developers underestimate the potential complexity of handling the content in the context of the process application. You can use the scenarios above to identify the style of your process/content application, but you will also need to consider how the content gets into the process, who can view and update it, whether it needs to be accessed from outside the process context, and its eventual retention and disposal.

Share with us the ways you are integrating your business process and content to improve customer experiences. Join the conversation on Twitter: @skemsley, @alfresco

Alfresco in the News – DX building blocks, the future of Open, and news from our customers!

This week’s news coverage brings to light relevant topics such as the evolution of open source technologies; new strategic opportunities for Alfresco; as well as mentions from our customers who continue to use Alfresco to enable their digital transformation:

Information Age: Accelerating digital transformation with effective data infrastructure
Director of Business Solutions George Parapadakis identifies the foundation that companies need to build if they want to accelerate digital transformation.

ZDNet: Internet of Things projects: Why your business needs to build a digital double 
McDermott International CIO Akash Khurana talks about the importance of building a “digital twin” and identifies Alfresco as the technology partner that helps enable it.

Channel Partners: Private Equity Firm to Buy Alfresco Software 
Alfresco CMO Sydney Sloan discusses the business opportunities to be gained from the Alfresco/THL acquisition.

IDG Connect: Alfresco Founder: Commercial Open Source is more than Old Stuff for Free
Building on the success of this year’s DevCon, this feature explores the history and future of open source technologies – and the opportunities they present for modern companies. Alfresco innovators John Newton, Thomas DeMeo and Doug Johnson lend their perspective in this piece.

We encourage you to share your thoughts with us on Twitter or LinkedIn and continue the conversation.

CIO Conversation: How Liberty Mutual is Accelerating Delivery and Saving Millions with Alfresco on AWS

More and more companies are saving time and money by managing content in the cloud. I recently spoke with Mojgan Lefebvre, SVP & CIO of Liberty Mutual’s Global Specialty business, about how they are using Alfresco Content and Governance Services on AWS to go faster and paper-free.

Insurance is such a document-intensive business. What challenges did you face?

When we started Global Specialty, we wanted to create global productivity teams that could collaborate and leverage our scale in a digital manner. At the time, we had some tactical document management solutions, but they were siloed and outdated. There were a lot of performance issues and, frankly, a lot of end user complaints. Our data centers were located in the U.S., which made document access even slower for people thousands of miles away.

What were some of the key requirements for your new content management solution?

Cloud-native capabilities were a critical requirement because our objective was to deliver global document management capabilities that would be effective locally, whether the end user was in Asia Pacific, Europe, Latin America, or North America. We also wanted to minimize delivery time, so the strong integration capabilities of CMIS and REST APIs were important. Having to re-write our core insurance applications was a non-starter.
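
As an aside for the technically curious: integrating an existing application with Alfresco through those APIs can be as simple as a single HTTP call. Below is a minimal, hypothetical sketch (not Liberty Mutual’s code) of pushing a document into an Alfresco folder over the public REST API; the host, credentials and parent folder are placeholders, and endpoint paths can vary by Alfresco version.

```python
import requests

# Minimal sketch: store a document in an Alfresco repository via the public
# REST API v1 and read back the new node ID. Host, credentials and the parent
# folder ID are hypothetical placeholders for illustration only.
ALFRESCO = "https://ecm.example.com/alfresco/api/-default-/public/alfresco/versions/1"
AUTH = ("svc_claims", "change-me")   # hypothetical service account
PARENT_ID = "-root-"                 # or the node ID of a target folder

def upload_document(path, name):
    """Create a content node as a child of PARENT_ID from a local file."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"{ALFRESCO}/nodes/{PARENT_ID}/children",
            auth=AUTH,
            files={"filedata": (name, f)},
            data={"name": name, "nodeType": "cm:content"},
        )
    resp.raise_for_status()
    return resp.json()["entry"]["id"]

if __name__ == "__main__":
    print("Stored as node", upload_document("claim-1234.pdf", "claim-1234.pdf"))
```

The same pattern works from any language that can speak HTTP, which is what makes it possible to connect core insurance applications without rewriting them.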

How did you quantify the financial impact of the solution, and what made Alfresco the right choice?

Alfresco was the only solution that absolutely met our requirement for a cloud-native platform that was compatible with AWS, which was our preferred cloud vendor. Alfresco also had very strong references and a service organization that ensured we were getting the attention we needed as a Fortune 75 company.

For our first use case, we calculated around $21 million in savings over five years from lower paper, printing, and document storage costs. This number was enough for us to say, “Hey, let’s go forward with the Alfresco solution.”

What other benefits has Liberty Mutual experienced with Alfresco on AWS? 

This is the first time any group at Liberty Mutual has automated the build and deployment of an application to the cloud. We’ve taken a very modern approach that includes using Infrastructure as Code automation and continuous integration and continuous delivery (CI/CD) pipelines to help us reduce time to market. We can deploy a fully functional Alfresco environment onto AWS in just 30 minutes.
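
To make the Infrastructure as Code idea concrete, here is a minimal, hypothetical sketch of the kind of step a CI/CD pipeline might run: provisioning an environment from a declarative template using boto3 and AWS CloudFormation. The template file, stack name and parameters are illustrative assumptions, not a description of Liberty Mutual’s actual pipeline.

```python
import boto3

# Minimal sketch of an "Infrastructure as Code" pipeline stage: create a
# CloudFormation stack from a declarative template describing the environment.
# The template file, stack name and parameters are hypothetical placeholders.
cfn = boto3.client("cloudformation", region_name="us-east-1")

def deploy_stack(stack_name="alfresco-dev", template_path="alfresco-stack.yaml"):
    with open(template_path) as f:
        template_body = f.read()
    cfn.create_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Capabilities=["CAPABILITY_NAMED_IAM"],
        Parameters=[{"ParameterKey": "Environment", "ParameterValue": "dev"}],
    )
    # Block until provisioning finishes, as a pipeline stage would before
    # running smoke tests against the new environment.
    cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)
    print(f"Stack {stack_name} is ready")

if __name__ == "__main__":
    deploy_stack()
```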

What advice do you have for companies starting their journey to cloud-based content management?

Don’t underestimate the amount of time required for regulatory discussions. Make sure your data privacy and compliance teams are comfortable with any production solution with real customer data in the cloud. We used this initiative to put our records retention policies in place and ensure compliance around the globe. As regulations like GDPR take effect, this has absolutely been a lifesaver for us.

What’s next for your Alfresco ECM solution?

Our first implementation in oil and gas was absolutely a hit with end users. We’ve started to extend the capability into other parts of our underwriting and claims organizations as well as other business units across Liberty Mutual. We’ve already taken 500,000 documents into the Alfresco on AWS environment, and we expect that number will grow to about 300 million documents by the end of 2018.

Mojgan appeared on our recent webinar on How to Quantify the Strategic Impact of Migrating Your Content to AWS, which provided clear insights into how our customers realize value from deploying Alfresco in the Cloud. Coupled with the Liberty Mutual example, we also invited experts from Forrester Consulting to underscore the value of Alfresco by reviewing results of the recent Total Economic Impact Study of Alfresco Content Services on AWS. Watch the webinar on-demand.

Video Highlights from DevCon 2018

It’s been a few weeks since DevCon ’18, and the enthusiasm, feedback and conversations keep going. We’ve seen the lively discourse that started at DevCon continue in the Alfresco Community and on Twitter. As one conference attendee said, DevCon was a “family reunion” of sorts where you’re never afraid to share knowledge, questions and your candid opinions – just like with family.

If you were unable to join us this year, we invite you to be part of the ongoing conversation through our community. Below, you’ll find videos from Alfrescans and non-Alfrescans covering topics such as governance services; the Share framework and its extensions; document exploration; AI integrations and more! These “lightning round” sessions are a great way to experience the insight and excitement of DevCon, from a variety of speakers on an assortment of topics.

DevCon 2018 – Lightning Talk
Universe & Everything – Boriss Mejias
What is Governance Services? – Roy Weatherall
Governance Services, Git Workflows, and How to Contribute – Ross Gale

DevCon 2018 – Lightning Talk
Search and Analytics Roadmap – Harry Peek
Important Workarounds for Making ASS Multi-Lingual – Axel Faust
APS Extension Project – Dennis Koch

DevCon 2018 – Lightning Talk 
David and Goliath (The Misadventures of an Alfrescan vs. a Big Company) – David Ciamberlano
Alfresco Monitoring Setups with Nagios and ELK Stack using JMX and OOTB Support Tools – Cesar Capillas

DevCon 2018 – Lightning Talk
Are you thinking ahead? Let your AI do it for you – Jason Jolley
Alfresco MVC – A Seamless Integration with Spring MVC – Daniel Gradecak
PDF Template – Mohammed Gazal

For more coverage of our DevCon video content, please visit the Alfresco Community. Please also chime in with your thoughts on the content covered and what you’d like to see more of in the future: @alfresco, #alfrescodevcon, #alfrescodev.

Alfresco in the News – New Journeys, Content Trends and Innovations

This week brought exciting news for Alfresco with the acquisition by Thomas H. Lee Partners. We’re delighted to grow our business together and continue to bring value to customers and employees alike. The news recap below touches on this eventful milestone and also continues to assert our thought leadership and expertise in areas such as content and process, and in industries such as healthcare.

CRN: Alfresco Software, Purchased by Private Equity Firm, Vows Company Growth Will Outpace Market
Alfresco CMO Sydney Sloan shares her perspective on the THL deal and talks about its strategic relevance.

Inc.: 2018 Content Trends You Need to Know Right Now
John Newton shares his thoughts on the content marketing landscape and the role that AI plays in it.

CIO Suite: 7 tips for improving information flow
The case for process and content working together is strong, as showcased by Director of Business Solution Strategy George Parapadakis.

Health Data Buzz: CVS-Aetna Deal Faces Huge IT, Data Challenges
The CVS-Aetna merger was a hot topic in the news last month. Solutions Strategy Architect, Ankur Laroia provides his commentary on what it can mean for the healthcare segment and the industry at large.

We encourage you to share your thoughts with us on Twitter or LinkedIn and continue the conversation.

Acquisition by Thomas H. Lee Partners marks exciting new era for Alfresco Software

I am writing to share some exciting news that we recently announced. Alfresco has entered into an agreement to be acquired by funds affiliated with Thomas H. Lee Partners, L.P., a premier private equity firm. You can find the press release we issued this morning about the news here.

Since our founding in 2005, we have experienced incredible growth and it would not have been possible without the leadership of John Newton, our dedicated employees, our partners, and the relationships that we have formed with you – our valued customers.

I want to assure you that we are dedicated to continuing to deliver innovation across the Alfresco Digital Business Platform, with upcoming releases planned in the Spring and beyond. Our award-winning support and service teams will continue to provide the quality service that you have come to expect from our Company.

Going forward, Alfresco will continue to be the same company that you have come to rely on. Our focus on disruption in the ECM and BPM markets with open, modern, cloud ready technology will not change. This is an exciting time for Alfresco and we thank all our customers for their continued partnership.

Customers with questions, please feel free to reach out to me, your account executive, or any member of our leadership team.

For media enquiries, please contact us at

Doug Dennerline,
Alfresco CEO

BPM Cloud Architectures and Microservices – guest post from Sandy Kemsley

Continuing our guest blog series focusing on all things BPM, independent analyst Sandy Kemsley lends her expertise about the Cloud and microservices. In this post, Sandy explores the power and flexibility that the Cloud and microservices have in the process space and provides her commentary on the advantages companies can leverage when using them.  We couldn’t agree more, as both the Cloud and microservices are important aspects of our process offering. Explore how we’re innovating in this space through our recent Activiti 7 open source project. 

My usual research and writing topics focus on business process management (BPM) and systems (BPMS), content management, case management, social enterprise and a number of other related application-level technologies. Sometimes, however, I like to dig down into the technical underpinnings to explain how to deploy applications that are resilient, scalable and available. Today, that means looking briefly at cloud architectures and microservices, then looping back to what that means for BPM systems and other applications.

Cloud Architectures

In the early days of cloud computing, “cloud” was synonymous with multi-tenanted public cloud infrastructure, such as that provided by Google or Amazon. Today, however, it has come to mean other types of virtualized platforms that use on-premise hardware, managed private servers or a combination of public and private. What’s cloud-like about these non-public installations is that they can aggregate physical servers into a pool for use by virtual environments – either full virtual machine (VM) images, or containers that share some of the underlying kernel resources – and manage the dynamic expansion and contraction of resource requirements. They can even manage environments that span public and private hardware as a single environment, either by specifying which VMs/containers should run on private or public hardware based on data sovereignty or external access requirements, or by allowing “cloudbursting” from private to public hosting to manage demand spikes. There’s a huge savings in training technical administrators and DevOps teams by using the same tools for public, hybrid or private cloud deployments, e.g., deploying applications in Docker containers and using Kubernetes for managing those containers, regardless of whether you’re using OpenStack to create your private cloud infrastructure or Amazon’s AWS for public cloud.

To sum up, you need to move beyond full VM virtualization to containerization in order to reap savings in computing resources, then allow those containers to float on a sea of virtualized infrastructure that can scale on demand, and span private and public hosts.
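
As a concrete illustration of that on-demand elasticity, here is a minimal sketch using the Kubernetes Python client to scale a containerized service; the deployment name and namespace are hypothetical, and the same call works whether the cluster runs on private hardware, a public cloud, or spans both.

```python
from kubernetes import client, config

# Minimal sketch: scale a containerized service up or down on demand through
# the Kubernetes API. The deployment name and namespace are hypothetical.
def scale_deployment(name="process-engine", namespace="bpm", replicas=5):
    config.load_kube_config()   # use load_incluster_config() when running inside the cluster
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )
    print(f"{namespace}/{name} scaled to {replicas} replicas")

if __name__ == "__main__":
    # e.g. add capacity during a demand spike, then scale back down afterwards
    scale_deployment(replicas=8)
```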

Microservices

Service-oriented architecture – where services of one system/application are offered to another using a standard protocol – is not new. The consumption of services has shifted from heavier SOAP interfaces to lighter-weight RESTful web services, but the basic idea is the same: your application wants to perform a function that is offered by another application, so you make a call to that application, pass it some data, and it performs an action or passes back a result.

Microservices are (arguably) just a modern interpretation of service-oriented architecture, but are much more about how a system that offers services is designed: instead of a single monolithic system with multiple service endpoints, it’s a set of independently-deployable services that communicate with each other using technology-agnostic protocols. This allows individual microservices to be scaled as required, and updated to meet new business requirements without redeploying the entire system. Considering how this works with cloud infrastructure described above, individual microservices can be virtualized in their own container(s), and different microservices from the same logical system can be deployed on private or public cloud infrastructure.

The code used to call microservices isn’t much different from calling a monolithic system via its service endpoints; in production, however, the microservices offer greater resilience and scalability, and can be upgraded to newer versions more quickly with less regression testing.
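
For example, from the caller’s point of view, invoking a function over a technology-agnostic protocol such as HTTP/JSON looks the same whether the endpoint belongs to a monolith or to an independently deployed microservice. The sketch below is hypothetical; the URL, payload and field names are illustrative only.

```python
import requests

# Minimal sketch of a caller's view of a service: an ordinary HTTP/JSON call.
# Whether the endpoint is backed by a monolith or a microservice is invisible
# to this code. The URL and payload are hypothetical.
def score_claim(claim):
    resp = requests.post(
        "https://services.example.com/risk-scoring/v1/scores",
        json=claim,
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()["score"]

if __name__ == "__main__":
    print(score_claim({"claimId": "C-1001", "amount": 850.0, "procedure": "cleaning"}))
```

The difference shows up operationally: the microservice behind that URL can be scaled, redeployed or replaced without touching the caller.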

What Does This Mean For BPM?

Getting back to BPM, think about how cloud infrastructure impacts your BPMS deployment:

  • Cloud architectures require a different BPM architecture. Not all BPM solutions are built for cloud-native architectures: a monolithic BPMS stuffed into a Docker container will not be able to leverage the advantages of modern cloud infrastructures, but must be deployed to the public cloud in an “all or nothing” fashion. Consider your current and future needs carefully as you evaluate investments.
  • Application portability drives agility. Deploying your BPMS on a cloud infrastructure allows your deployment teams to leverage private, hybrid or public cloud infrastructure to support your business strategies: for example, new requirements for mobile process participants coupled with an ability to quickly take advantage of public cloud infrastructures outside your firewall may give you a competitive advantage. Administration is also streamlined, with BPMS administration separated from the “bare metal” administration issues, and deployment teams using unified tooling for cloud administration, monitoring, and scaling.
  • Old problems, better solutions. BPM deployments are often at the leading edge of an organization’s use of technology, creating complexity for BPMS vendors in scalability, user interactions, distributed automation and security. Cloud services provide options for faster, simpler and more cost-effective solutions, such as replacing traditional multi-tenancy with newer serverless approaches. Many BPM deployments can also benefit from cloud-based intelligent services – machine learning, blockchain, IoT and more – to solve old BPM problems with better solutions.
  • Microservices drive real value. Rather than old-school monolithic BPM, a microservice- and container-enabled BPM architecture reduces the headache of creating infrastructures to address new use cases. DevOps teams can roll out new services quickly with less risk due to service decoupling, and the distributed nature of such architectures seamlessly facilitates public-facing and behind-the-firewall applications.

Use cases for external participants and mobile devices have driven the rise in the number of (public) cloud BPMS offerings, and systems that have traditionally been deployed on premise are scrambling to find ways to offer cloud capabilities without compromising security or data sovereignty.

That’s where microservices come into play. If your BPMS is designed using a microservice architecture, each of the service containers can be deployed independently of the others, either on private or public infrastructure, while still sharing data between services. Furthermore, scaling individual services – such as your process engine or audit logging, two notorious performance troublemakers – can be done without having to scale up all of the other services in step. You may even be able to skip some of the services offered as part of the BPMS, such as single sign-on, in favor of comparable services that you’re already using for other applications.

These benefits of a microservices design exist for any type of enterprise system, and that should be a feature for systems that you’re building or buying. If you’re building applications on a platform such as a microservices-based BPMS, it’s a lot easier to gain these benefits, and to optimize a private, hybrid or public cloud deployment.

Are you currently using or considering a cloud and/or a microservices based infrastructure? Let us know the ways you’re using this technology to innovate. Join the conversation on Twitter, @skemsley, @alfresco

Customer Insights: What we learned from Liberty Mutual about Migrating to Alfresco on AWS

Today’s webinar on How to Quantify the Strategic Impact of Migrating Your Content to AWS provided clear insights into how our customers realize value from deploying Alfresco in the Cloud. I was inspired by Mojgan Lefebvre, CIO & SVP of Global Specialty at Liberty Mutual, as she shared the “why and how” of implementing Alfresco on AWS to improve speed and efficiency for staff within their Global Specialty unit. She walked through the challenges with their previous ECM systems, the requirements they defined for a new solution, and how they deployed it, and it was great to learn that Alfresco was the only technology that met their needs. But what really brought this powerful customer story to life were the qualitative and quantitative benefits Liberty Mutual shared as a result of using Alfresco on AWS. To give you a quick preview of those benefits:

• $21MM in savings over a 5-year period
• Delivery of a fully functional environment in 30 minutes
• 500,000 documents stored – and growing!

Coupled with the Liberty Mutual example, we also invited experts from Forrester Consulting to underscore the value of Alfresco by reviewing results of the recent Total Economic Impact Study. Dean Davison, Principal Consultant at Forrester cited additional customer examples and scenarios where Alfresco proved invaluable in terms of providing strong financial savings, flexibility of deployment and an accelerated time to value.

I invite you to explore this webinar as well as the Forrester Total Economic Impact study.
Watch webinar

Alfresco DevCon 2018 – Bee One with the Community

Building off the strong momentum from last week’s DevCon 2018 in Lisbon, we’re feeling invigorated by the positive energy, ideas and enthusiasm we witnessed from our community. Leading with the product roadmap from Thomas DeMeo and Brian Remmington, the developer community gained insight into the future of Alfresco and was able to choose from a vast array of technical sessions, honing their skills in content, process, application development, analytics and much more. John Newton’s visionary keynote may have been upstaged only by the beautiful setting of Lisbon, Portugal, which inspired our speakers and attendees to create meaningful discoveries, collaborations and conversations. One thing we’d like to change for next year: filling the room with more talented women developers!

Here are some highlights that underscore the experience of DevCon18:

From the 70s to 2030s – 60 years in 60minutes, all being covered in one session! It can only be John Newton #johnnewton #alfrescodevcon

— ecmcoreaveri (@ecmcoreaveri) January 18, 2018

My lightning lightning talk #alfrescodevcon

— Charles Le Seac’h (@cleseach) January 18, 2018

#alfrescodevcon collaborative working #hackathon

— BlueXML (@BlueXML_Corp) January 16, 2018

It’s a beautiful day in the neighborhood for #alfrescodevcon especially with male model @HarryAPeek in the frame.

— Chris Wiborg (@cwiborg) January 17, 2018

#alfrescodevcon great way to end the con

— ross gale (@w0ssabi) January 19, 2018

If you missed DevCon this year or would like to recap what we discussed, connect with us on the Alfresco Community, where recordings of last week’s sessions will be available soon. And please continue to join the conversation on Twitter: #alfrescodevcon.

Transforming Insurance with Cloud BPM – guest post from Sandy Kemsley

We’re delighted to introduce a blog series, featuring Sandy Kemsley, independent analyst and systems architect, specializing in Digital Process Automation (DPA), Business Process Management (BPM), the social enterprise, enterprise architecture and business intelligence.

Sandy will lend her expertise and commentary on topics such as digital transformation and the cloud, the convergence of process and content, the role of microservices in cloud-based BPM software and more. We hope you enjoy her insights and share your comments about these topics on Twitter: @skemsley, @alfresco

Transforming Insurance with Cloud BPM

Understand the customer demands and business models for today’s insurance business, and plot a path along the maturity model to technology modernization.

In this blog post, I plan to explore the role BPMS plays in integrating packaged software, custom-built systems, and external services into a seamless process that includes both internal and external participants. What if you need to include customers in your process without having to resort to email or manual reconciliation with an otherwise automated process? What if you need employees and partners to participate in processes regardless of their location, and from any device? What if some of the functions that you want to use, such as machine learning for auto-adjudication, industry comparative analytics on claims, or integration with partner portals, are available primarily in the public cloud?

Let’s consider the case of an established property insurance company that wants to expand into online insurance in order to grab a piece of the market of younger people who are just getting their first apartment. Although there are older consumers in this market, younger consumers are more likely to want to make financial services purchases online, particularly on their mobile devices, instead of dealing with an agent or going to the local branch of their financial institution. The problem is that the insurance company has a lot of non-automated processes, and the automation that does exist is buried in on-premises systems of record (SOR) such as legacy policy administration and claims systems, as well as monolithic on-premises BPMS. They also have a conservative culture, averse to change and mistrustful of new intelligent automation technologies. All interactions with their clients are on paper, usually by postal mail. In other words: slow, inefficient, unwilling and unable to change quickly to respond to the new market demands, and ripe for disruption.

This results in some very different business models and needs for the old and new world of insurance:

  • Customer service model – Old World: high-touch agent interaction; New World: customer self-service
  • Use case – Old World: all insurance needs; New World: straightforward property insurance
  • Process initiator – Old World: paper documents; New World: data entered by customer on web or mobile app
  • Transactional processes – Old World: manual procedures, with some automation embodied within systems of record; New World: automated processes and decisions, with human intervention only for exceptions
  • Customer interactions mid-process – Old World: calls and paper mail; New World: web or mobile app


It’s not that easy to just start a new online insurance company from scratch; instead, the company needs to marry their existing reputation and stability with a technology architecture that can better serve the needs of the new market. Changing out policy administration and claims management systems is not an option for the short term, and the company doesn’t want to move those off premise due to perceived problems with data security and sovereignty. How, then, does an existing company modernize their architecture to address the new world needs?

Here’s a maturity model for insurance company technology modernization:

Level 1: Prepare systems of record for automation. Enable/expose API service interfaces to the SORs used for underwriting, policy administration and claims management, so that data can be read and written, and functions invoked within those systems. The ability to do this is dependent on the SOR platform, and may require significant customization or the creation of an integration layer, but is essential to create automated customer-facing processes that interact with the SOR.
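
As a sketch of what such an integration layer might look like, here is a minimal, hypothetical REST facade over a legacy policy administration system; the routes, field names and the stubbed legacy call are illustrative assumptions rather than a reference implementation.

```python
from flask import Flask, jsonify, request

# Minimal sketch of a thin integration layer exposing a legacy policy
# administration system over REST so that automated, customer-facing processes
# can read policy data and open claims. Routes, fields and the stubbed legacy
# call are hypothetical.
app = Flask(__name__)

def legacy_policy_lookup(policy_id):
    # Placeholder for the real call into the SOR (stored procedure, MQ request,
    # terminal emulation, etc.); returns a plain dict for illustration.
    return {"policyId": policy_id, "status": "ACTIVE", "holder": "J. Smith"}

@app.route("/policies/<policy_id>", methods=["GET"])
def get_policy(policy_id):
    return jsonify(legacy_policy_lookup(policy_id))

@app.route("/policies/<policy_id>/claims", methods=["POST"])
def open_claim(policy_id):
    claim = request.get_json()
    # A real layer would hand this to the claims SOR; here it is simply echoed.
    return jsonify({"policyId": policy_id, "claim": claim, "state": "OPENED"}), 201

if __name__ == "__main__":
    app.run(port=8080)
```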

Level 2: Prepare SOR for integration and scalability. Move the systems of record into a containerized on-premises/hybrid cloud infrastructure for easier integration with other cloud infrastructure, and simplified administration. Due to the company’s conservative data security and sovereignty policies, these systems will remain on premise but will more easily be able to interact with public cloud infrastructure.

Level 3: Develop customer-facing self-service processes using a cloud-native BPM. Create processes that follow the customer journey, allowing customers to initiate and participate in flows that create policies, update their policy information, and manage claims via the SOR service APIs. Internal tasks that cannot be fully automated will be routed to an internal worker as required. The customer-facing portions of the processes will execute in the public cloud – in a highly secure manner – but very little data needs to be persisted outside the company’s SORs, minimizing data security concerns. I believe in the advantage of cloud-native BPMS deployments, and the benefits they bring in situations where you’re bridging between on-premises legacy systems and public cloud customer interactions.

Level 4: Make processes more automated and intelligent. Incorporate intelligent automation technologies, including machine learning, artificial intelligence and third-party industry analytics, to fully automate the customer-facing processes where possible. For example, most rental or homeowner policies can be issued without employee intervention, and many types of claims can be adjudicated automatically and paid out immediately.

By implementing integration and automation modernizations with a cloud-native BPM, the insurance company can create and deploy customer self-service processes and specialized insurance products faster, providing a significant competitive differentiation. A cloud-native infrastructure also provides elastic scalability for dynamic scaling: for example, in the case of a regional flood or wildfire, the claims process can scale dynamically to handle the increased load, reducing the risk that an anxious customer sees a website or application that won’t load due to traffic.

Although I’ve put the recommendations in this post in the context of an insurance company, there are potential benefits for any organization looking to modernize to a containerized cloud architecture and improve customer-facing processes:

  1. Applications, both internal and customer-facing, can be assembled from independent services including legacy APIs, BPM microservices and third-party cloud services. This allows them to undergo rapid iteration to match the speed of your business.
  2. Containerized hybrid cloud applications can be quickly ported from development to test to production environments, whether on premise or in a private or public cloud, with minimal risk of incompatibility.
  3. Cloud applications can be scaled up (or down) automatically to meet demand, providing better cost efficiency.

Let us know about the ways you’re modernizing your BPMS – leave us a comment on Twitter @skemsley, @alfresco

Alfresco in the News – 2018 Technology Predictions and Insights

With the new year ahead of us, we’re pleased to share with you some recent news coverage that features Alfresco’s continued product innovations and business insights from our experts. We’ll be bringing you news coverage several times a month so you can continually see the new, exciting things we’re working on – and what others are finding helpful about our solutions.  Thanks for reading and sharing your thoughts with us on LinkedIn or Twitter @alfresco:

ZDNet: How to execute on strategy: Getting great ideas from the whiteboard to the boardroom – Top CIO leaders lend their perspective on how to bridge strong strategies with execution. McDermott International CIO Akash Khurana highlights how Alfresco is key for delivering open and connected systems.

Website Magazine: 2018 Digital Forecast – Global business leaders shed light on this year’s marketing and technology predictions. Alfresco Strategic Solutions Leader, Ankur Laroia lends his commentary on data security.

CRN: 10 Security Predictions For 2018 – This predictive piece also features Ankur’s insights about the growing importance of the Chief Security Officer and the Chief Information Security Officer in today’s business organizations.

DevOps: The Value of Commercial Open Source – Alfresco VP of Product Marketing Chris Wiborg shares his thoughts on the value of commercial open source platforms to innovate and meet unique customer needs.

We look forward to engaging with you about these and other future topics on our social channels. Drop us a comment on LinkedIn or Twitter @alfresco.