The real-time web

This post was originally published on AIIM's Expert Blogs by Serge Huber, CTO at Jahia Solutions

********

In a world where everyone wonders what will be next in social collaboration, process automation, or electronic content management, this article presents a new way of looking at the state of the web today.

Today I'd like to talk about "the real-time web". In computing, real-time can mean many different things, but it usually means that a given task will complete in a consistent amount of time, and that this time will be imperceptible to the end user. For example, in a music application it means that any sound processing must minimize the delay between input and output; in a video editing application it means that any processing must not disrupt the delivery of images to the screen.
 
So what does this mean for the web? I think we can already consider Facebook and Twitter good illustrations of what the real-time web can look like, but they also illustrate the heavy requirements involved, as both rely on massive amounts of back-end infrastructure.
 
For the enterprise, the real-time web is already a reality. Internet marketers already use real-time analytics and social tools to instantly track the effectiveness of their latest campaign, and users are quite used to dealing with Twitter's quirks. The benefits of a real-time web site or intranet are many: people always see the most up-to-date information, they can communicate through collaborative tools such as forums and instant messengers, work on documents together simultaneously, or interact with clients effectively, whether to promote ideas and products or to react quickly to a customer query or comment. Beyond improved communication and productivity, this can also prevent duplicated work, shorten feedback cycles, allow better interaction on ideas, and improve overall customer satisfaction.
 
An example from standards committee work: although face-to-face meetings happen, they are costly in travel and time, whereas working remotely with technologies such as Google Documents and online virtual meetings (with screen sharing, video, voice, etc.) makes it possible to interact in real time with everyone involved. Of course this will never replace real face-to-face meetings, but it illustrates the power of real-time communication tools.
 
Content governance is also an issue on the real-time web. To get close to real time, content creation and publication must be as fast as possible, but with that speed comes a higher risk of improper disclosure or, more simply, of mistakes. It is very difficult for companies to vet every single communication, internal or external. Externally you could argue that real time is not (yet) a necessity, but internally it is quite the opposite: people need information immediately, especially in today's very competitive world. Some governance may start to happen after the fact, meaning that if something goes wrong the content might be removed or corrected, or a damage-control procedure might be applied to hold users accountable and prevent it from happening again. Of course there are scenarios where governance must be much stricter, but that comes at the cost of slower communication, and fewer and fewer companies will opt for it unless they are forced to.
 
Another difficulty for real-time web social technologies is the constant nuisance of spam. As dubious companies promote their products by spamming site forums and comments to improve their Google rankings, it is becoming a real necessity to react to spam in real time: to prevent it, or at least to remove it as quickly as possible. One option is to use real-time heuristics to detect the presence of a spammer, for example by analyzing the rate of posting, since spamming is usually automated to generate lots of content.
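The posting-rate heuristic mentioned above can be sketched in a few lines. This is a minimal illustration, not a production anti-spam system; the class and threshold names are hypothetical.

```python
from collections import deque

# Sliding-window rate check: flag a poster as a likely spammer when they
# exceed `max_posts` within `window_seconds`. Thresholds are illustrative.
class PostRateMonitor:
    def __init__(self, max_posts=5, window_seconds=60.0):
        self.max_posts = max_posts
        self.window = window_seconds
        self.timestamps = {}  # user id -> deque of recent post times

    def record_post(self, user, now):
        """Record a post at time `now`; return True if the user now looks automated."""
        posts = self.timestamps.setdefault(user, deque())
        posts.append(now)
        # Drop posts that have fallen out of the sliding window.
        while posts and now - posts[0] > self.window:
            posts.popleft()
        return len(posts) > self.max_posts
```

A human posting a few times a minute stays under the threshold, while an automated script tripping it can be throttled or queued for moderation immediately rather than cleaned up later.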
 
Process automation and management are probably the areas where the most work is still needed to improve processing speed. To become real-time, or as close to it as possible, automation must be designed from this new point of view and rely as little as possible on human availability. Processes often get "stuck" because resources are not available, and may even be "forgotten" if the automation does not include proper reminders or notifications. Workflow technologies can now push notifications directly to instant messengers or even mobile devices to speed things up further, and this is clearly starting to happen on a larger scale. Processes must also be kept minimal, but that is not a new requirement. As usual in the world of process automation, there is a strong need for interoperability, and new standards such as CMIS and WEMI should help, although they are either quite young or not yet specified :) The more interoperable systems become, the easier it will be to automate them, exchange content, and work through the processes to get information out as quickly as possible.
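The reminder idea above — making sure a stuck task pushes a notification instead of silently waiting — can be sketched as follows. All names are hypothetical, and `notify` stands in for a real push channel (instant messenger, mobile push, email).

```python
# Illustrative sketch: send a reminder for every workflow task that has sat
# unfinished past its deadline, so processes do not get "forgotten".
class Task:
    def __init__(self, name, assignee, deadline_seconds, created_at):
        self.name = name
        self.assignee = assignee
        self.deadline = deadline_seconds
        self.created_at = created_at
        self.done = False

def overdue_reminders(tasks, notify, now):
    """Call `notify(assignee, message)` for each unfinished task past its deadline."""
    reminded = []
    for task in tasks:
        if not task.done and now - task.created_at > task.deadline:
            notify(task.assignee, "Task '%s' is waiting for you" % task.name)
            reminded.append(task.name)
    return reminded
```

Run periodically (or on a timer per task), this turns "waiting on a human" from a silent dead end into an active ping, which is exactly what keeps a process close to real time.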
 
From a technology standpoint, many existing or upcoming technologies can help build high-performance web sites or internal systems, but some of them are more or less compatible with "real-time".
 
Caching is an essential part of high-performance web sites, but it is a double-edged sword. On one side it helps deliver content very efficiently, but it adds complexity (especially if cache entries can embed each other, as is the case in some WCM systems) when invalidation must happen as soon as content changes. In highly personalized systems, caching might be reduced to a minimum and replaced by more powerful business-specific algorithms, something we are starting to see in the "Big Data" world. Another interesting example is Twitter's implementation of its latest message feed, which uses users' social connections to rebuild the list and avoid re-running all queries from scratch.
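The embedded-entries problem is worth making concrete: when a cached page fragment embeds another fragment, invalidating the inner one must also evict everything that embeds it. Here is a minimal sketch under that assumption; the class and method names are illustrative, not from any specific WCM product.

```python
# Dependency-aware cache invalidation for rendered fragments that may embed
# other fragments. Assumes embedding is acyclic, as page composition usually is.
class FragmentCache:
    def __init__(self):
        self.entries = {}      # fragment id -> rendered output
        self.embedded_in = {}  # fragment id -> set of fragments that embed it

    def put(self, fragment_id, html, embeds=()):
        self.entries[fragment_id] = html
        for child in embeds:
            self.embedded_in.setdefault(child, set()).add(fragment_id)

    def get(self, fragment_id):
        return self.entries.get(fragment_id)

    def invalidate(self, fragment_id):
        """Evict a fragment and, recursively, every fragment that embeds it."""
        self.entries.pop(fragment_id, None)
        for parent in self.embedded_in.get(fragment_id, set()):
            self.invalidate(parent)
```

The recursive walk is what makes near-real-time invalidation expensive: one small content change can cascade through every page that displays it, which is exactly the complexity the paragraph above refers to.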
 
More interestingly, recent standards such as PubSubHubbub and WebHooks make it possible to push notifications back to a user on the web, and new implementations of existing protocols such as the recently announced HTTP 2.0 draft are clearly going in the direction of reducing technological overhead to get content to and from users more efficiently.
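The shared idea behind PubSubHubbub and WebHooks is the push model: subscribers register a callback, and a hub delivers each update to them immediately instead of waiting to be polled. The sketch below shows that model in-process with plain functions; in a real deployment the callbacks would be HTTP POSTs to subscriber URLs, and all names here are hypothetical.

```python
# Minimal publish/subscribe hub illustrating the push model behind
# WebHooks and PubSubHubbub: content is pushed out the moment it changes.
class Hub:
    def __init__(self):
        self.subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # Deliver the new content to every subscriber immediately.
        for callback in self.subscribers.get(topic, []):
            callback(payload)
```

Compared with feed polling, the latency drops from "whenever the next poll happens" to the cost of one delivery, which is what makes these standards a natural fit for the real-time web.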
 
Web Content Management software is also evolving to make interacting with content easier, using AJAX techniques to send and retrieve small amounts of data and make the user experience feel real-time. Many other software products are moving in the same direction as they embrace both the web and mobile platforms, drawing inspiration from what was done previously on the web. For example, it is no longer acceptable for a user to wait for a request to be processed without any feedback. If the feedback is quick and sufficiently detailed, and everything happens in under one second, the user will have the impression of communicating in near "real time".
 
Another important part of delivering high-performance real-time experiences is the new generation of cloud offerings. Using Amazon AWS or Google AppEngine makes it possible to scale infrastructure to demand, and therefore to ensure that the user experience remains consistent even as the number of users grows over time. Before these offerings, it was not easy to react quickly to growing demand as a service became popular, and the service might become unavailable, which is unacceptable when the aim is a real-time experience.
 
Finally, to fully reap the benefits of real time, content must be accessible and producible whenever and wherever users are, and this puts the last piece of the puzzle in place: mobile platforms. Since the new generation of smartphones arrived, users have been accessing and even producing content on the road, and they are starting to expect much faster responses than before. Whether this is a good thing for stress levels is something I will let others answer, but it is clearly becoming a new expectation that content be accessible as quickly as possible. For example, if a user posts an image on Facebook, they expect to see it on their profile almost immediately. This is (still) a technical challenge for the engineers designing these systems, but it is a user expectation that must now be fulfilled, whereas just a few years ago it was still acceptable to make the user wait a few minutes for image processing. So the mobile platform must be fully integrated with the other components of a "real-time" system, usually through technologies such as push notifications, instant messaging, or background downloads, to keep the user engaged and provide an experience that encourages use of the system.
 
As you can see, a lot has happened to build the real-time web in a very short amount of time. New services are defining new expectations, and we are getting closer and closer to a fully real-time web. How long will it be until we can edit video directly on the web or hold full HD video meetings? Actually, this is already possible. The real challenge now is for companies to embrace these technologies in new ways to increase their productivity and communication, while software offerings adapt to answer these new demands.

Serge Huber


Serge Huber is co-founder and CTO of Jahia. He is a member of the Apache Software Foundation and Chair of the Apache Unomi project management committee as well as a co-chair of the OASIS Customer Data Platform Specification.
