Supervisory Control and Data Acquisition (SCADA) is widely used by railways, highways management, power utilities and the oil & gas industry, among others. It provides an end-to-end supervisory system that acquires data from the field through Remote Terminal Units (RTUs) or Intelligent Electronic Devices (IEDs), which in turn connect to sensors through a communications network.
The oil industry employs SCADA technology to monitor offshore and onshore extraction, for instance.
Some pundits are predicting the end of SCADA in the near future. However, a recent TrackTalk article by Thierry Sens, Marketing Director, Transportation and Oil & Gas Segments at Alcatel-Lucent, entitled Is M2M killing SCADA?, arrives at a different answer. Sens argues that SCADA will instead adapt to include M2M, which is closely related to the Internet of Things (IoT) megatrend currently sweeping the consumer world.
After three generations of SCADA (standalone, distributed and networked), industries such as the railways are now using M2M for part of their SCADA needs.
“M2M is revolutionizing SCADA by offering standardization and openness,” noted Sens. “Indeed several communication protocols between a backend and a machine have been standardised by the Open Mobile Alliance and the Broadband Forum. M2M is also providing scalability, interoperability, and enhanced security by introducing the concept of middleware.”
With middleware, fragmented SCADA solutions, in which individual sensors talk only to their respective backend applications, can be eliminated.
“Middleware collects, syndicates and manages all flows using open communication standards and exposes the data through standard APIs and Web Services,” noted Sens. “This has enabled the development of business applications and business analytics software on top of this middleware which can compute the information collected from millions of devices.”
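As a rough sketch of the pattern Sens describes, the toy middleware below (all names hypothetical) collects readings from many field devices and exposes them through a single query interface, so a business application no longer needs a vendor-specific link to each sensor:

```python
from collections import defaultdict

class M2MMiddleware:
    """Illustrative middleware: collects readings from many devices and
    exposes them through one query interface instead of per-vendor links."""

    def __init__(self):
        self._readings = defaultdict(list)  # device_id -> list of (metric, value)

    def ingest(self, device_id, metric, value):
        # In a real deployment this would arrive over a standardized
        # protocol rather than a direct method call.
        self._readings[device_id].append((metric, value))

    def query(self, metric):
        # Any business application can consume this view through one
        # standard API, regardless of which vendor's sensor produced it.
        return {dev: [v for m, v in vals if m == metric]
                for dev, vals in self._readings.items()
                if any(m == metric for m, v in vals)}

mw = M2MMiddleware()
mw.ingest("rtu-17", "voltage", 229.8)
mw.ingest("ied-03", "voltage", 231.2)
mw.ingest("rtu-17", "temp", 41.5)
print(mw.query("voltage"))  # both devices' voltage readings in one answer
```

The point of the sketch is the decoupling: applications ask the middleware for a metric, not a particular sensor for its proprietary feed.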
One example of this is the Swiss Federal Railways (SBB), which has been pioneering the use of M2M to improve the efficiency of applications in use across 3,039 km of lines on its network, with the ultimate goal of reducing costs by up to 15 percent by 2018. Its step-by-step rollout is looking to deliver efficiency savings and an improvement in operations, and does not emphasize a single technology or specific area; it covers telecoms, operations in areas such as point maintenance, fibre optic systems and rolling stock fleet monitoring, as well as other areas.
“M2M is considered the next phase in the evolution of SCADA and logical platform for an upgrade when the time is right,” concludes the Alcatel-Lucent blog post on the topic. “It finally offers a standardized, scalable, inter-operable and future-proof solution that does not tie a customer to a single supplier but still delivers the improved efficiency and reduced costs associated with SCADA applications over the past 40 years.”
And that’s a good thing.
In preparing for what many have called “the circus,” aka the annual Mobile World Congress (MWC), which is arguably now the most important industry trade event as the world goes mobile and which takes place in Barcelona March 2-5, the excitement is already palpable. From all of the cool new devices of all shapes and sizes to interesting advances in areas like antenna technology, Network Functions Virtualization and carrier aggregation, the eye candy alone is almost overwhelming in terms of imagining the possibilities. However, MWC is always tantalizing not only because it answers what we will see in terms of near-term capabilities, but also because of the questions it raises about the longer term.
In this regard I was struck by a recent blog by Michael Peeters, CTO, Alcatel-Lucent Wireless, entitled, appropriately I think, The Circus is in Town. Peeters’ main point, in characterizing his view of the next big thing that will be the buzz of the show, is summed up nicely: “...one thing is certain: its story will be about removing place and time constraints we took for granted.”
While I agree, to a certain extent, with where we are and where we are going in terms of removing place and time constraints, I happen to believe that the next big thing will be around what it has been and should be, i.e., utility. After all, as we move into an always on/all ways connected, broadband-enabled world, if all of the things Peeters points to, the Internet of Things (IoT), drones, wearables, more immersive customer experiences and the like, are not easy to use, trustworthy and useful, their monetization potential will not be maximized. And, let’s face it, the bottom line is the bottom line, which is all about utility. This means utility as pervasively accessible, and hopefully affordable and safe, as well as the perceived value we are willing to pay for personally and professionally, which enables service providers to continue to accelerate the speed at which the future comes at us.
Talk of the town
If you think of MWC as more of a town meeting of all the stakeholders, along with being a circus (which it is), Peeters’ observations about what will be discussed in the short term are spot on. In fact, his list of things that will be highlighted and discussed is worth reviewing whether or not you are lucky enough to go to Barcelona. His forecasts for the show buzz include:
Applicability: LTE in new markets such as Public Safety. The ongoing deployment of VoLTE and potential further improvements. What needs to be added or changed to the existing technology?
Capacity: carrier aggregation in licensed bands, be it FDD or TDD, but also the combination of, and the tension between, licensed and unlicensed spectrum and technologies. Do you go for LTE-U or Wi-Fi/LTE aggregation?
Performance: specific deployment scenarios such as small cells for indoor and especially enterprise needs. Virtualization of the RAN. How and when do they make sense?
And, because the industry loves to discuss what comes next, 5G will be top of mind and a topic of many interactions, despite 4G only now being rolled out around the world (“mature” might be a stretch given how far we have to go). Don’t get me wrong, this is a great thing. Who doesn’t like talking about the future? However, with things like VoLTE, Voice-over-Wi-Fi, Hotspot 2.0, IoT and M2M all really in their nascent stages, my hope is that the industry is not getting ahead of itself. Indeed, the use cases that will determine what is valuable as the mobile world moves toward being all IP and broadband are in most instances yet to be written.
This is a great thing as well. It is so because it will be us figuring out the utility of new high-performance wireless networks and how to extract value from them, and all of the new devices, business models (mobile payments, for example) and competitive options will determine what is successful, along with the who, what, when, why and how.
So as Peeters implies, by all means enjoy the show. Be entertained and enthralled. Like the circus, MWC is dazzling and if for no other reason can and should be appreciated for that alone. Indeed, take him up on the invitation to stop by the Alcatel-Lucent booth (Hall 3 – Booth 3K10) to learn more about the realities and possibilities.
In many ways we stand at the bottom of the on-ramp to the possibilities of combining pervasive computing and ubiquitous communications. The coming ability of networks to deliver more immersive experiences and better insights into how we can be more productive at work and enjoy more of the things that delight us in our personal lives makes this a unique time and a great time to be part of the buzz.
However, amid all the technology it is important not to lose sight of not just what it does but what it can and should deliver. In my mind the deliverables are utility and trust. It is my hope that, at a high level, these are the two things buzzing at MWC, as they are what the industry should and can deliver as to what comes next.
Peeters has it right about the inexorable march of the industry toward providing us with the broadband infrastructure and agility the future is mandating in terms of breaking down the barriers of time and place. However, what this means in terms of deliverables creates fascinating open questions and opportunities, and that is why the show is so engaging on so many levels. This may not be a “new story,” but it certainly is an all-important one.
In business as well as our personal lives there are finite resources that gate our activities. The big one that covers both is time, which we cannot create more of and therefore hope to optimize, for obvious reasons. In mobile communications the issue is getting the most out of the finite, and indeed scarce, radio frequency (RF) spectrum allocated for service provider networks.
The reality is that in most parts of the world mobile service providers have access to different frequency bands as a result of things like auctions and mergers. Thus, they need to mesh their various spectrum assets (i.e., bands and associated carriers) in general. They also must optimize them to meet the insatiable customer appetite for bandwidth-hungry services such as real-time and streamed video, where Quality of Experience (QoE) matters most. Indeed, QoE, and its extension to wherever a customer happens to be, is now foundational for attracting and keeping customers.
The challenges of creating fatter pipes that can deliver the bandwidth demanded by the tsunami of traffic headed operators’ way are daunting to say the least. It is one of the reasons why records continue to be broken at auctions for the spectrum being freed up by policy makers. Getting more bandwidth, and extending it closer to customers, has critical competitive implications, and this has become a paramount concern in the now hot race to deploy 4G LTE and 4G LTE-A (Advanced) services. There is a need for speed by customers, and a corresponding need to deploy mobile broadband services for consumers and enterprises at express speeds ASAP.
A recent TechZine posting, LTE carrier aggregation and the massive capacity challenge, by Hector Menendez, Senior Marketing Manager, Wireless Solutions Marketing, Alcatel-Lucent, highlights, as the title says, how LTE carrier aggregation (CA) can help mobile service providers optimize the bandwidth they have to meet growing traffic demands and provide the QoE required to be competitive.
As Menendez explains, “CA lets operators aggregate these disparate chunks of spectrum spanning different bands by supporting inter-band CA. This is the most common use for CA, as most spectral assets that operators own have been acquired piecemeal over time.” He adds, “In some markets, many operators are also turning to LTE-TDD as a way of further augmenting capacity of existing LTE-FDD networks…and can use intra-band CA to combine several carriers to achieve higher speeds as a way of differentiating their services.”
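A back-of-the-envelope illustration of why aggregating piecemeal spectrum matters: assuming roughly 150 Mb/s of peak downlink per 20 MHz LTE carrier (a Category 4, 2x2 MIMO figure; the linear scaling is a simplification), combining three disparate chunks yields a pipe that none of them could offer alone:

```python
# Rough, illustrative arithmetic only: assume ~150 Mb/s peak per 20 MHz
# LTE carrier and scale linearly with aggregated bandwidth.
PEAK_PER_MHZ = 150 / 20  # Mb/s per MHz under the assumption above

def aggregated_peak(carriers_mhz):
    """Peak downlink rate for a set of aggregated component carriers."""
    return sum(carriers_mhz) * PEAK_PER_MHZ

# An operator meshing piecemeal spectrum: a 20 MHz carrier in one band
# plus 10 MHz and 5 MHz chunks acquired elsewhere (inter-band CA).
print(aggregated_peak([20, 10, 5]))  # -> 262.5 Mb/s
print(aggregated_peak([20]))         # -> 150.0 Mb/s from the largest chunk alone
```

Real peak rates depend on band combinations, MIMO configuration and UE category, but the basic appeal of CA is exactly this additive effect.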
Additional Carrier Aggregation Benefits
You might think that creating more bandwidth alone would make CA attractive, but there is more. CA also enables operators to make better use of network resources through load balancing and, as pointed out in the posting, can reduce interference and improve network performance via intelligent allocation of resources.
In fact, in many ways CA is like a Swiss Army knife, as the graphic below shows.
Source: Alcatel-Lucent
Menendez concludes: “Carrier Aggregation represents one of the most cost-effective and efficient ways of addressing the capacity challenge and could be the biggest success of all LTE-A features. All indications are that we will see CA go mainstream in the not too distant future, giving operators a truly valuable tool.”
That might sound like hyperbole, but given the scarcity of spectrum, the unusual mix of radio assets in most operators’ footprints, the need for speed to satisfy traffic demands in general and customer expectations, and the need to be competitive, Menendez is more than directionally correct. Mobile service providers have a growing sense of urgency to be fast-to-market, fast-in-the-market and best in market and certainly when it comes to getting LTE-A, and other advanced forms of wireless technology into the hands of customers ASAP, CA is going to be a critical part of the equation.
As populations increasingly migrate from rural to urban areas, power utilities face new economic challenges and opportunities around creating and maintaining adaptive grid communications network infrastructure.
The dynamics of this global change are fairly well known, although how to address the challenges isn’t so obvious. For example, cities consume three quarters of energy and contribute 80 percent of CO2 emissions globally, according to a recent report in The Guardian. How can that energy be most efficiently delivered, with minimal environmental impact?
Consensus is emerging that what’s needed are smarter, safer, greener cities. Governments and municipalities are under pressure to invest in sustainable infrastructure capable of efficiently delivering services to citizens and workers.
There’s a pretty compelling smart grid transformation opportunity for public-private partnerships embedded in this evolution. Together, telecom service providers and information and communication technology (ICT) providers can bring in their assets, expertise and experience to help power utilities meet goals for smart grid applications.
Smarter energy management for power utilities is an imperative, but that doesn’t mean it’s easy to achieve.
ICT is an important driver of the economic competitiveness, livability and environmental sustainability associated with smart grid transformation for smart cities, noted Marc Jadoul and Jacques Vermeulen of Alcatel-Lucent in a recent TechZine article, “Smart practices for building smart cities.”
“The right ICT infrastructure will affect the way each city will be created and evolve,” Jadoul and Vermeulen noted. “It will enable smart cities to include vastly enhanced sustainable areas, such as smart buildings, smart infrastructures (water, energy, heat, and transportation) and smart services (e-substitutes and e-services for travel, health, education, and entertainment), which drastically change the urban experience for city dwellers and travellers.”
By using broadband networks to provide access to high-capacity communications infrastructure, the city network becomes the backbone of a smart city. Creating that backbone requires investment in an open data approach flexible enough to support a variety of applications that benefit both the city and its population.
This infrastructure foundation opens up opportunities to optimize a city's public infrastructure, including a smart grid to reduce CO2 footprint and lower energy bills. For example, wireless sensors can continuously monitor and control lighting.
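As a hedged illustration of that kind of sensor-driven control, the toy rule below (thresholds and scaling entirely invented) dims a networked street lamp as daylight increases and drops it to standby when no motion is detected:

```python
def lamp_power(ambient_lux, motion, full_w=100):
    """Hypothetical control rule for a networked street lamp:
    dim with daylight, drop to standby when nothing is moving."""
    if ambient_lux >= 50:           # bright enough: lamp off entirely
        return 0
    level = 1 - ambient_lux / 50    # darker surroundings -> brighter lamp
    if not motion:
        level *= 0.3                # standby dimming saves energy
    return round(full_w * level, 1)

print(lamp_power(60, motion=False))  # 0 (daylight)
print(lamp_power(10, motion=True))   # 80.0 W (dark, street in use)
print(lamp_power(10, motion=False))  # 24.0 W (dark, empty street)
```

A real deployment would feed readings like these from wireless sensors over the city network, but the energy-saving logic is of this simple shape.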
Other important aspects of an effective smart grid include real-time remote grid monitoring, substation automation, smart metering, and green energy devices.
The world of M2M is changing as solutions move from single-purpose devices, which transmit data to and receive commands from an application in the network, to an Internet of Things in which devices are multi-purpose and applications are collaborative.
The Internet of Things can benefit from global standardization efforts.
In today’s world, M2M solutions abound, yet architecturally not much has changed since the 1970s. The Fraunhofer Institute for Open Communication Systems has defined M2M as a communication terminal, independent of human interaction, communicating with a core network or another terminal for the purpose of automating services.
Granted, while the network that facilitates M2M communication has changed dramatically since the 1970s and provides quite advanced capabilities (e.g., 3GPP Machine Type Communication), the architecture of the M2M solution has remained fairly static: a device in the field that communicates with an application in the core network for a specific purpose.
However, we are beginning to see a paradigm shift for M2M, called the Internet of Things (IoT). Powered by the infrastructure of M2M, the IoT fundamentally changes the way devices and applications interact. We can look at this progression of how devices and applications collaborate, using technologies enabled by the M2M infrastructure, in much the same way that people collaborate using the social Web, or that commerce has been enabled by Web 2.0 technologies.
In the world of the IoT, a device that once had a single purpose can now provide data, or be controlled, for varying purposes across industry domains. A pedometer, for example, can serve several application domains at once: it is the same pedometer, but its data is used by different applications.
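The one-device, many-domains idea can be sketched as a simple publish/subscribe arrangement; the subscriber domains and their formulas below are invented purely for illustration:

```python
class Pedometer:
    """One physical 'thing' whose readings serve several application domains."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, domain, callback):
        # Each application domain registers its own interpretation of the data.
        self._subscribers.append((domain, callback))

    def publish(self, steps):
        # The same reading is delivered to every domain at once.
        return {domain: cb(steps) for domain, cb in self._subscribers}

ped = Pedometer()
# Hypothetical consumers of the same step count:
ped.subscribe("activity", lambda s: "active" if s > 8000 else "sedentary")
ped.subscribe("score", lambda s: round(100 - min(s, 10000) / 200, 1))
ped.subscribe("distance_km", lambda s: round(s * 0.0008, 2))

print(ped.publish(9000))
```

One sensor, one reading, three interpretations: exactly the multi-purpose collaboration the IoT adds on top of the single-purpose M2M pattern.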
Since the IoT is enabled by the capabilities of the M2M Service Enablement Layer, the IoT domain draws upon many of the benefits provided by global standardization efforts like oneM2M that help solve the challenges faced by the M2M industry today.
However, because the basis of the IoT is the multi-purpose collaboration of “things” (e.g., pedometers, storage containers, and energy meters), certain challenges are accentuated in the IoT domain.
In this context, standardization provides benefits that enable this type of collaboration.
In fact, there are global standards bodies working on these challenges today. They are defining aspects of collaboration across application frameworks that enable an application development and execution ecosystem, and they provide a clear definition of interfaces for application providers and device (“thing”) manufacturers. Examples include the work of the Home Gateway Initiative (HGI), the W3C, the Open Geospatial Consortium (OGC) and oneM2M in the area of IoT semantics: they are standardizing a common vocabulary and associated templates so that “things” can be described in a context that suits the varying purposes of the “thing”.
One of the key issues in the exchange of semantic information is how the privacy and confidentiality of the information source can be maintained while still providing the needed semantic context. The abilities to grant rights to the information source and to anonymize semantic information are just a few of the security capabilities that standards bodies like the W3C, IETF, ITU and IEEE are actively pursuing. The industry realizes that if privacy and confidentiality aren’t designed in up front, on top of the security capabilities (e.g., authentication, access control, data protection) provided by the enabling M2M infrastructure, the benefits of the IoT cannot be fully realized.
Realizing this, oneM2M is pulling these semantic vocabularies together in a framework that enables applications to efficiently discover, exchange and analyze semantic information across industry domains while providing the capabilities to ensure the privacy and confidentiality of the semantic information sources.
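A minimal sketch of the idea, not the oneM2M framework itself: readings are annotated with a shared-vocabulary concept so any domain can interpret them, while the source identity is hashed before the record is shared. The vocabulary and field names here are hypothetical:

```python
import hashlib

def annotate(source_id, metric, value, vocabulary):
    """Attach a shared-vocabulary concept so any domain can interpret the data."""
    return {"source": source_id,
            "concept": vocabulary[metric],  # local metric name -> common concept
            "value": value}

def anonymize(record, salt):
    """Share the semantic context while hiding the identity of the source."""
    rec = dict(record)
    rec["source"] = hashlib.sha256((salt + rec["source"]).encode()).hexdigest()[:12]
    return rec

# Hypothetical common vocabulary mapping device-local metric names to concepts.
VOCAB = {"temp": "AirTemperature", "steps": "StepCount"}

raw = annotate("meter-42", "temp", 21.5, VOCAB)
shared = anonymize(raw, salt="site-secret")
print(shared["concept"], shared["value"], shared["source"] != "meter-42")
```

Consumers can still discover and analyze the reading by its concept ("AirTemperature"), but cannot recover which meter produced it without the salt.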
Standardization of the Internet of Things may seem like a big hurdle to leap, but if these organizations are successful, the IoT is going to be a much friendlier place to work and live.
About Tim Carey
Tim Carey is the Industry Standards Manager of Alcatel-Lucent’s Customer Experience Division. Tim was recently inducted into the Broadband Forum Circle of Excellence to recognize his leadership in advancing the Forum's mission of driving broadband wireline solutions and empowering converged packet networks worldwide to better meet the needs of vendors, service providers and their customers.
Tim has over 18 years’ experience in the communications industry, working in solution deployment, system engineering and system architecture across a wide variety of technologies, including optical, ATM and IP transport, switching and routing products, as well as the development of home networking devices and network and device management systems. In his current role as Industry Standards Manager, he is actively involved in a number of standards bodies, including oneM2M, ETSI, IEEE, the Broadband Forum, the Open Mobile Alliance, HGI, DLNA and the UPnP Forum, providing expertise in network management, device management, home networking and machine-to-machine technologies.
What if you didn’t need to have your phone beside you at all times? What if, instead, you could use your own car to connect with you, direct you and protect you wherever you go?
Well, by 2022, a Telefónica Industry Report (PDF) predicts that there will be 1.8 billion automotive Machine-to-Machine (M2M) connections that can do just that. This will comprise 700 million Connected Cars and 1.1 billion Internet of Things (IoT) devices for services such as navigation, insurance, stolen vehicle recovery (SVR) and infotainment. In fact, Machina Research predicts that by 2020, 90% of new cars will feature built-in connectivity platforms, growing from less than 10% today.
Connected Cars will not replace smartphones; rather, they are a way to extend IoT connectivity and bring our everyday lifestyle right into the car. Ellis Lindsay’s blog on Connected Cars as an everyday lifestyle does a great job of explaining this concept. He goes into detail about connected cars giving us the ability to link our life experiences, whether it’s our deadlines, travel plans, monthly payments or Facebook notifications, to wherever we are and wherever we go.
I really think the Henry Ford quote, “If I had asked people what they wanted, they would have said ‘faster horses,’” explains the evolution of Connected Cars perfectly. Connected Cars have changed the way we approach the future of communications, and if I had the chance to ask Henry about the future of Connected Cars, he probably would have repeated the same quote. Another Ford leader, Steven Odell, an EVP at Ford Motor Company, believes “Cars are the smartphones of the future,” stating that 79% of industry experts believe IoT connectivity will soon be the primary decision factor in car purchases and that 80% of cars will be connected by 2020.
Today’s primary decision factors in car purchases are, without a doubt, pricing and gas mileage. However, when M2M players and service providers join forces to bring connected cars to end users, they will have to confront the challenges of bringing M2M to the masses: simplifying the use of the technology and creating an experience where users consistently feel connected to their everyday lives. Oh, and let’s not forget millennials’ preference for self-service. There will be more to purchasing decisions than pricing and gas mileage; features such as voice over LTE (VoLTE), push notifications, customer self-support and mobile data will be major factors in the decision.
All of the Connected Car services I mentioned have two important elements: they further connect us to our everyday lives and they enhance the customer experience. This enables our cars to become our smartphones. Just imagine making payments right from your car, having your car find the best available parking spots, and not having to worry about maintenance thanks to automatic vehicle system checks, firmware updates and data management services. What else could you ask for?
To learn more about Connected Cars, take a look at the ng Connect program. Is there an application or service you think every Connected Car should have? How big of a role will the IoT play in that service? Please share your thoughts in the comments below. I’d like to hear from you.
When he’s not blogging or tweeting, Anthony Trinh (@Trinh_Anthony) is a third-year marketing and information systems student from Carleton University in Ottawa. He is currently completing a co-op term as the Integrated Marketing Assistant for the Motive marketing group at Alcatel-Lucent.
Talk on the topic of Connected Cars has been escalating over the years. Connected Cars are a recent Machine-to-Machine (M2M) technology that extends our connectivity to the world around us, bringing your vehicle into your lifestyle. Consumers are looking at Connected Cars as a technology that can help protect you, direct you and connect you. A recent Accenture survey on Connected Vehicles reveals that 50% of consumers in 2014 are using Connected Cars for traffic information, and 34% of consumers indicated an interest in getting information about available parking spots.
Connected Cars will give us the ability to bring the home experience right to our cars. By bringing M2M to the masses, the technology will help us manage our life better – our cars can remind us of calendar deadlines, help us to schedule our meetings, and then plan our trip with the least amount of traffic – all while finding the perfect parking space.
One of the many things I realized from Telematics Detroit 2014 was the concept of the digital lifestyle. And I couldn’t explain it any better than how Tom Gebhardt, president at Panasonic Automotive Systems of America, explained it at the event. “Without the car, you have a huge gap in being able to link your life experience. Consumers don’t want to have to change their technology just because they’re in the car,” said Tom.
What Tom said really won me over. It’s not so much about the Connected Car itself, it’s really about the lifestyle and customer experience. The car is only a technology which enables us to improve the two. Every day we are finding new opportunities to stay connected, whether it’s through downloaded apps, navigation services or new technologies. The Connected Car presents all of these features, along with the most important feature: location.
As I mentioned earlier, our network is always on. That means news feeds, notifications, changes of schedule and many other things that you would have known about if you were at home. But as soon as you step outside of those doors, you leave behind a suitcase filled with information. It is about being everywhere, while accessing everything. That’s what the world of Connected Cars brings to life.
To learn more about Connected Cars, take a look at the ngConnect program. What do you think is the most important feature of Connected Cars? Share your thoughts in the comments below. I’d like to know what you think.
In short, the networking piece of smart grid deployments is critical, as utilities migrate their infrastructure to meet the need to remotely monitor and manage grids of growing complexity. “The new IP/MPLS technologies offer a great deal of benefits within the utility in cost savings and operational efficiency, and they also mandate a new way to operate, bridging those traditional organizational silos,” noted Mark Burke, VP of Intelligent Networks and Communications for DNV GL, in a recent GridTalk posting.
In the first of what will be a three-part series on smart grids and the value they can create, the economic perspective provided by Burke, who outlined a number of ways smart grid projects can maximize their technology investment, is instructive.
Source: GridTalk
Burke provided a number of recommendations that will resonate with utilities as they evaluate how to move forward, and here are five of particular significance.
First, embrace standards.
“Standards drive innovation while maintaining security and reliability,” noted Burke. “They encourage mass production, driving the cost of equipment down, while also enabling the development of an ecosystem of value-added products and solutions. They reduce the total cost of ownership for the systems they procure, but also encourage the development of devices that may leverage communications protocols to provide additional value to customers and the society at large.”
Second, monetize excess capacity.
Adding high-bandwidth fiber is a big investment, but it also can create new profit centers that can help projects recoup the investment. One way to monetize excess capacity is through establishing an unregulated subsidiary tasked with taking advantage of the newly-created infrastructure.
Third, prepare for distributed power generation.
One side benefit of deploying a converged IP/MPLS system is readiness for distributed generation and renewable energy sources such as wind power and photovoltaics.
“With IP/MPLS you can accommodate these in a common architecture and in a standards-based environment so that the unit costs are low over time,” noted Burke. “As the smart grid environment matures and gets more diverse with renewables and other new challenges, the total cost of implementation will go down as well.”
Fourth, consider public-private partnerships, especially where investment in energy infrastructure is traditionally lacking. In some markets, demand response can’t be exploited by governmental entities or utilities but third-party, commercial concerns can capture that value instead of leaving it on the table.
Finally, it is worth the effort to help shape public regulation.
“Power providers should really get involved with educating both consumers and regulators, which may have very small staffs and are overburdened,” Burke suggested. “Only that level of engagement will help maximize the value of moving to a more sophisticated energy system, including an IP/MPLS-based smart grid. The regulators are quite sensitive to the needs of the customers.”
The Alcatel-Lucent blog post covers other ways to maximize investment, but these are the ones that clearly should be top of mind.
As leaders in Europe debate whether the EU is “back” during the World Economic Forum, the region is increasingly falling behind when it comes to telecommunications, according to Alcatel-Lucent CEO Michel Combes.
“There is a real danger,” noted Combes in a recent blog post on Europe’s digital divide (published in the Wall Street Journal), “that Europe is losing ground in the information era.”
That’s because there is an increasing gap between what the latest smartphones can deliver and what Europe’s telecommunications companies can support due to a price war that inhibits infrastructure upgrades.
“Europe is locked in a vicious circle of competition focused exclusively on price, one that forces operators to reduce their investments and destroys their innovation capacity,” noted Combes. “This type of competition is bad news for a digital Europe and its consumers.”
The digital agenda in Europe needs to be reset by the likes of the European Telecommunications Network Operators’ Association (ETNO) and others. Telecommunications investment on the order of €110 billion to €170 billion will be needed by 2020 if the region is to keep pace with the rest of the world in terms of cellular infrastructure and innovation. Failure could cost Europe €750 billion in lost GDP growth and as many as 5.5 million highly skilled jobs for young, qualified European graduates.
“That’s a high price to pay for accepting life in a slow-motion telecoms world,” he noted.
What Europe must do, first and foremost, is move to an all-IP network infrastructure, supported by virtualized infrastructure based on cloud technology.
Combes also suggested that Europe must invest more in applications and analytics and capabilities such as SDN and NFV.
“Today eight out of the top 10 global Internet platforms are American,” he wrote, and the two others are Chinese. “A new model of international work distribution seems to be taking shape in which the profitable operators are in the U.S. and the American Internet platforms are taking most of the residual value in Europe, while the application development centers are in India and the manufacturing is in China.”
Europe led the way when it came to 3G deployment, but now it is being left behind.
To fix the problem, operators need to end a competition model that is only based on reducing prices in the short term. Spectrum allocation also needs to be reviewed, and shared and efficient policies on net neutrality must be crafted to allow operators to differentiate themselves and revive investment.
“We also need to come back down to earth and stop thinking that the telecom sector can continue with 120 operators in Europe, subject to rules and procedures that change from one country to another,” Combes boldly wrote.
If Europe is not to fall too far behind, its digital agenda must tackle the recent decline in telecommunications. Importantly, as Combes stated, it must do so with a sense of urgency and purpose.
But only a little of its promise.
The most futuristic M2M scenarios remain largely limited to intranets of things, ranging from the home to the intelligent city, to production systems such as the electricity grid, or simply to stand-alone intelligent objects that provide dedicated services.
“Such cases are still relatively simple, with a limited range of objects and behaviors which are generally designed and calibrated in advance,” noted Mathieu Boussard of Bell Labs recently in an interesting posting, The Internet of Things, a natural (r)evolution.
The real game-changing M2M applications won’t start appearing until internet-enabled objects can be brought together in scenarios not yet planned by their manufacturers—or “even to take it upon themselves to collaborate for a shared purpose,” according to Boussard.
Getting there will take advances in both software and hardware, along with innovation from network operators.
The M2M devices of the future that will drive enterprise solutions need to be self-powered; have embedded control software, processing and communication capabilities; and, ideally, provide for human interaction, according to Boussard.
On the network side, M2M devices need to be identified and given an address; data routing must be provided in a changing context (mobility, temporary unavailability); devices must be made interoperable with each other and with technologies serving a wide range of purposes (such as energy efficiency or performance); and a range of service-quality requirements must be handled, among other needs.
“And from a software/system point of view, the resulting ecosystem of ecosystems needs to be managed – including managing resources, the data produced (and its distribution) and the network formed by these objects, where possible using all the self-organization that can be harnessed in a highly dynamic and heterogeneous context,” concluded Boussard.
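To make the network-side requirements above more concrete, here is a minimal, hypothetical sketch of a device registry that handles two of them: giving each device a stable identity and address, and tracking availability so that routing can skip devices that have gone quiet. All names and the `m2m://` addressing scheme are illustrative assumptions, not drawn from any real M2M platform.

```python
import time
import uuid


class DeviceRegistry:
    """Toy registry: identifies M2M devices, assigns them addresses,
    and tracks availability so routing can skip offline nodes."""

    def __init__(self, offline_after=60.0):
        self.offline_after = offline_after  # seconds without a heartbeat
        self.devices = {}                   # device_id -> record

    def register(self, metadata):
        device_id = str(uuid.uuid4())       # stable identity for the device
        self.devices[device_id] = {
            "metadata": metadata,
            "address": f"m2m://{device_id}",  # hypothetical addressing scheme
            "last_seen": time.monotonic(),
        }
        return device_id

    def heartbeat(self, device_id):
        """Devices check in periodically; mobility or sleep cycles mean gaps."""
        self.devices[device_id]["last_seen"] = time.monotonic()

    def routable(self, device_id):
        """Only route messages to devices seen recently."""
        record = self.devices.get(device_id)
        if record is None:
            return False
        return time.monotonic() - record["last_seen"] < self.offline_after


registry = DeviceRegistry()
sensor = registry.register({"type": "temperature", "site": "substation-7"})
print(registry.routable(sensor))  # True: the device just checked in
```

A real platform would add authentication, interoperability layers for heterogeneous protocols, and per-flow quality-of-service handling; the sketch only shows why identity and availability tracking sit at the base of the stack.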
Making the M2M revolution meet its promise will also take more than a single vendor. Bell Labs and Alcatel-Lucent will play their part, offering innovative M2M solutions such as Alcatel-Lucent’s Motive Machine-to-Machine Platform.
“But the test will be our ability to grasp all the complexity of this evolutionary change – just as no one ‘owns’ the Internet ecosystem now, so no single player will be able to build the Internet of Things alone,” wrote Boussard.
Ultimately, the challenge for M2M to reach its promise is not technological; it is collaborating effectively.
The burgeoning of machine-to-machine (M2M) applications in our increasingly connected world — partly characterized as an “Internet of Things” — has made telecommunications companies look to diversify their M2M offerings beyond ones based primarily on commoditized connectivity.
A recent Alcatel-Lucent Enriching Communications article, “The 5-Ps of M2M Key to Service Provider Success,” describes five “P”s for service providers: prioritizing opportunities, proper placement, participation, partnerships and persona.
They are based on the findings of research firm Analysys Mason’s recently published “M2M Communication Service Provider Scorecard: 2011.”
The potential for M2M success for service providers is also evidenced in work done by Frost & Sullivan, who say that M2M connectivity revenue in Europe, which accounted for 3 percent of M2M revenues in 2010 and 4.2 percent in 2011, will grow to more than 20 percent by 2017 as the monetization of M2M data drives aggressive growth.
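Taking the Frost & Sullivan share figures at face value, a quick back-of-envelope check shows how aggressive that forecast is. The helper below is mine, and the flat six-year window from 2011 to 2017 is an assumption:

```python
# Check of the quoted M2M connectivity-revenue shares in Europe
# (3% of M2M revenues in 2010, 4.2% in 2011, >20% by 2017).
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1


share_2011, share_2017 = 0.042, 0.20
growth = cagr(share_2011, share_2017, years=6)
print(f"Implied annual growth of the revenue share: {growth:.1%}")
```

In other words, the connectivity share of M2M revenues would need to compound at roughly 30 percent a year to reach the 2017 figure.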
Prioritize Opportunities
Careful prioritization of these opportunities will yield healthy, profitable businesses, according to Analysys Mason. The emphasis is on the word careful because some of the highest revenue-generating M2M applications can generate low profitability for service providers. Hence, prioritizing which sectors to move into based on profitability is critical. This is particularly true given that by 2020 there will be 2.1 billion network-aware devices with 90 percent connected via wireless networks. However, prioritizing which sectors to get involved in and measuring profitability will not be an easy task.
Proper Placement
The most successful service providers have overall M2M organizations of 50-100 employees with centralized staff for R&D, partnership management and product marketing, according to Analysys Mason. Certain resources, including R&D, partnership management and OSS/BSS support, will need to be centralized, but variance among organizations means each must consider, for example, where technical pre- and post-sales resources should be placed and what the headcount of each M2M functional area should be.
Participation
Based on the Scorecard, there are three ways for service providers to participate in the M2M supply chain: co-selling a partner’s solutions; selling or reselling their own solutions; and acquiring solutions.
Alcatel-Lucent believes service providers should partner to provide M2M hardware (modems/modules and equipment) and should sell or resell their own connectivity, platform and integration services.
Partnerships
Given the enormous number of opportunities in the market, no single service provider has all of the resources and tools to offer an end-to-end M2M service. Partners are therefore necessary as service providers develop their M2M market approach, according to Alcatel-Lucent.
Carefully selecting the right partners – rather than amassing the largest number of partners – is critical to developing a profitable M2M business. To select the ideal partners, service providers need to weigh options such as geography, market sector and M2M application.
Persona
The fifth “P,” persona, may be the most important consideration for service providers looking to build a successful M2M business. Establishing your company’s identity as an M2M provider is critical to market success.
Analysys Mason recommends a two-pronged approach to developing a service provider’s persona. Service providers first need to develop a strategy to determine where they will participate in the M2M value chain, which will help them create their M2M persona and build brand awareness with potential buyers and partners.
Knowing who you are – understanding perceptions and realities
In addition, service providers need to engage in market research to understand existing market perceptions of their M2M persona. As Alcatel-Lucent points out, “service providers that do not create a strong M2M persona may be overlooked in favor of a systems integrator (SI), other IT channel partner or an IT vendor.”