Operators at Inpac were struggling to simultaneously fill, quality-check and pack bottles during production. Each of the two production lines fills, caps, and labels 125 bottles per minute. With several thousand bottles passing down the line every hour at breakneck speed, small faults were going undetected. Data Ductus was contacted to develop an autonomous detection system to solve this.

In brief

Challenge 

Help Inpac ensure they only ship bottles that are filled to the right level with the caps screwed on properly and with correctly attached labels that have a clearly visible date stamp.

Solution

Cameras inspect each bottle in its entirety, and lighting enhances the appearance of any defects present. Custom software analyzes the images and signals an ejector when a faulty bottle is detected.

How we did it

Together with Inpac, we designed, developed and deployed a unique solution to fit the bottling lines. This included designing image analysis algorithms to detect and reject faults and training AI to recognize the different parameters.

Benefits

Thanks to the autonomous detection system, Inpac has drastically improved the output quality of the production lines. A Human-Machine Interface, HMI, alerts operators when something is amiss in production, effectively reducing wastage and increasing efficiency.

About the client

Inpac provides packaging solutions for pharmaceutics, probiotics, dietary health, and nutritional supplements. The company is also a manufacturer and supplier of health and nutritional supplements. It is privately owned and headquartered in Lund, Sweden.

Identifying a need

Inpac fills and packages large quantities of bottles every day, which are then distributed worldwide. Each customer has their own quality requirements and expectations of the delivered product. What is considered a minor flaw by some, might be a deal breaker for others. Thus, high quality at delivery is critical to minimize complaints and maximize customer satisfaction. This begins with quality on the two production lines.

Prior to the project, there wasn’t a dedicated quality inspection process in place. Instead, operators did their best to inspect the bottles when packing them. With a production rate of 125 bottles a minute per line, this was very challenging. Defects were, unsurprisingly, being packed and delivered. Inpac identified this as a weak link in their quality assurance process and turned to Data Ductus for help to solve the issue.

Finding a solution

One key factor in designing an effective autonomous detection solution was the close collaboration between Data Ductus and Inpac. Combining Inpac’s process expertise with Data Ductus’s machine vision expertise streamlined the entire assignment. It all started with a site visit to clarify the conditions and any constraints.

As a first step, a requirement specification was composed and agreed upon. A well-formulated requirement specification is crucial as, in the case of Inpac, it describes the scope and the prerequisites of the system, and specifies the characteristics defining a faulty bottle. Based on this, we were able to propose a custom-made system, which we then designed to meet the unique requirements of the products and production lines.

Implementing the autonomous detection solution

The solution consists of carefully chosen hardware in conjunction with software and analysis tools developed by Data Ductus. Three cameras and two panel lights, placed in a specific geometry, ensure that all defects present are visible in the captured images. When a faulty bottle is detected, this is signaled to an ejector, which removes the bottle from the conveyor belt.

Three example images of bottles, two faulty and one approved. A red frame indicates a problem, while a green frame means the bottle is approved. The bottle on the left is faulty due to an incorrect filling level; the one in the middle has a creased label.

The image analysis algorithms are designed to measure label position and filling level, and detect defects such as label creases and caps that are not properly screwed on. To adjust the sensitivity of the algorithm, so only faulty bottles are rejected, several parameters had to be tuned. If batches or baseline specifications change, AI methods can be used to find optimal parameter sets based on training data of correct and faulty bottles.
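The sketch below, in Python, illustrates the general idea of a rule-based reject decision with tunable sensitivity parameters, plus a simple search over labeled examples to pick those parameters. The measurement names, thresholds and grid-search approach are illustrative assumptions, not the production algorithms.

```python
# Illustrative sketch only: the production algorithms are proprietary, and the
# names, thresholds and grid-search approach below are assumptions for clarity.
from dataclasses import dataclass
from itertools import product

@dataclass
class BottleMeasurement:
    fill_level_mm: float       # liquid height measured from the image
    cap_gap_mm: float          # gap between cap and bottle shoulder
    label_offset_mm: float     # horizontal deviation of the label
    label_crease_score: float  # texture score, higher = more creasing

def is_faulty(m: BottleMeasurement, params: dict) -> bool:
    """Rule-based reject decision with tunable sensitivity parameters."""
    return (
        abs(m.fill_level_mm - params["target_fill_mm"]) > params["fill_tol_mm"]
        or m.cap_gap_mm > params["max_cap_gap_mm"]
        or abs(m.label_offset_mm) > params["max_label_offset_mm"]
        or m.label_crease_score > params["max_crease_score"]
    )

def tune_parameters(labeled_samples, grid):
    """Pick the parameter set that best separates good and faulty bottles in a
    labeled training set (a simple stand-in for the AI-assisted tuning)."""
    best_params, best_score = None, -1
    for values in product(*grid.values()):
        params = dict(zip(grid.keys(), values))
        correct = sum(
            is_faulty(m, faulty_flag := params) == faulty
            if False else (is_faulty(m, params) == faulty)
            for m, faulty in labeled_samples
        )
        if correct > best_score:
            best_params, best_score = params, correct
    return best_params
```

In practice, retraining on a new batch would simply mean collecting a fresh set of labeled good and faulty bottles and re-running the tuning step.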

Ensuring stable operation

With an hour of downtime corresponding to 15,000 bottles, stable operation is essential, so Data Ductus offers a wide range of cost-effective support agreements. Diagnostics and planned maintenance can be performed via remote access to the autonomous detection system. Most incidents can be resolved remotely by analyzing error logs and collected data; in some cases, on-site operators are guided through resolving an issue. If hardware has been damaged, replacement may require an on-site visit.

Enjoying the benefits

Machine vision systems collect a lot of valuable information from production lines. We compile statistics from the bottling process and present them in an easily accessible graphical user interface. This allows Inpac to better understand their production line and identify bottlenecks or detect irregularities before they impact the quality of production.

Machine vision for quality control

Inadequate quality control leads to product inconsistency and unhappy customers, and can create bottlenecks in production. At many facilities, only a fraction of manufacturing output is subjected to quality control in order to save time and money. The majority of products go uninspected, even though supplying faulty products can be costly and hurt a business. Machine vision inspection changes this.

A leading provider of wooden flooring and furnishing technology turned to Data Ductus for an innovative new vision system, which has been fully integrated into their production facilities.

In brief

Challenge 

Quality Control staff using the company’s systems were unable to consistently identify small defects in flooring during high-speed production, and wastage was too high.

Solution

Boards are scanned by a 3D scanner on a conveyor belt. A newly developed vision platform collects, analyzes and visualizes the data, and informs machine operators about defects. All data is categorized and centrally stored for later use.

How we did it

We worked closely with the customer to develop the initial machine vision solution. Today, agile workflows are used to seamlessly incorporate additional capabilities and intelligence into the solution, which has become a fully integrated part of their production.

Benefits

Wastage has been greatly reduced and manual inspection stations have been minimized. The solution can be incorporated into standard production practice and rolled out at different manufacturing plants. Additionally, systematic production problems can now be identified and resolved.

About the client

The customer is a leading provider of wooden flooring and furnishing technology.

State-of-the-art production

The company wanted a state-of-the-art production line, with minimum wastage and resource requirements. During manufacturing, multiple floor tiles are produced on a 2 x 2.4 meter hardened wood composite core, with a thinly applied veneer layer, before being cut to size. Heat and high pressure can cause blistering, bending, and other damage to multiple or single tiles during the production process.

Manual inspection of the boards is challenging due to the size of the boards and the speed at which they are produced. It’s virtually impossible for a person to consistently identify small but crucial defects with just a few seconds to inspect them, especially as a defect can be smaller than one square centimeter and protrude less than 0.5 millimeters from the wooden surface. To consistently identify such defects, an effective machine vision system is required to rapidly and accurately inspect the boards.

Inspecting the details with 3D scanning

Since the flooring has extremely tight tolerances, the choice of hardware for data collection was critical. Minute height differences – as small as 0.03 millimeters on the veneer edges – had to be recorded and analyzed. The boards also had to be inspected while traveling on a conveyor belt. Therefore, a 3D scanning solution was deemed most appropriate.
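As a rough illustration of the kind of analysis such a scanner enables, the sketch below thresholds a calibrated height map to flag material rising above the veneer reference plane. The tolerance values and pixel scale are assumptions for the example, not the system’s actual parameters.

```python
# Simplified sketch, not the production analysis: it assumes the 3D scanner
# delivers a calibrated height map (values in mm) as a NumPy array.
import numpy as np

def find_protrusions(height_map_mm: np.ndarray,
                     reference_mm: float,
                     min_height_mm: float = 0.03,
                     pixel_area_mm2: float = 0.04):
    """Flag regions that rise above the veneer reference plane."""
    deviation = height_map_mm - reference_mm
    defect_mask = deviation > min_height_mm            # pixels above tolerance
    defect_area_mm2 = defect_mask.sum() * pixel_area_mm2
    return defect_mask, defect_area_mm2
```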

A new vision solution

A new vision solution was developed to collect, analyze and store the data generated by the 3D board scanner. This provides a range of benefits. For instance, each board is marked with a unique identifier, to which the stored data is coupled. With this data, single tiles within a core panel that have a defect can be automatically rejected at a later stage in production. Previously, the whole panel was rejected; now, individual tiles are discarded once the panel is cut into tiles. Additionally, machine operators are informed of any defects and can make an instant decision on how to proceed.
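The sketch below is a hypothetical illustration of this per-tile rejection logic: defect coordinates recorded against a board ID are mapped to the tiles they will end up in after cutting. The panel dimensions and cutting pattern are assumed purely for the example.

```python
# Hypothetical sketch of the idea described above; board size and cutting
# pattern are illustrative, not the customer's actual dimensions.
from collections import defaultdict

BOARD_W_MM, BOARD_H_MM = 2000, 2400     # 2 x 2.4 m core panel (assumed)
TILES_X, TILES_Y = 8, 12                # assumed cutting pattern

def tile_index(x_mm: float, y_mm: float) -> tuple[int, int]:
    """Map a defect position on the panel to the tile it will end up in."""
    return (int(x_mm // (BOARD_W_MM / TILES_X)),
            int(y_mm // (BOARD_H_MM / TILES_Y)))

def tiles_to_reject(defects_by_board: dict[str, list[tuple[float, float]]]):
    """Return, per board ID, the set of tile indices to discard at the saw."""
    rejects = defaultdict(set)
    for board_id, defects in defects_by_board.items():
        for x_mm, y_mm in defects:
            rejects[board_id].add(tile_index(x_mm, y_mm))
    return rejects
```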

All measurement data is stored in a central database and is available for post analysis. Together with other process data, it can be used to detect and resolve systematic problems in production.

A collaborative partnership

What began as a vision system project, has now evolved into a long-term partnership. New measurements and features have been added and the solution has been fully integrated into the production process. Collaborative agile workflows enable the scanner to be rapidly adapted for parallel projects while small on-the-fly improvements can be quickly implemented.

A future-proof vision system

The new vision system provides objective inspection of products, with high traceability in the form of quantified measurements stored in an ordered and accessible format. Automated sorting of defective tiles means wastage is minimized. Furthermore, the need for manual inspection has been greatly reduced, freeing up resources in manufacturing. Finally, the ability to identify issues in production provides an innovative tool for identifying production improvements and alerting staff about potential upcoming service requirements.

A global air-purifier manufacturer was unable to explore and take advantage of new digital opportunities due to its reliance on legacy IT systems. Bringing IT up to date through a cloud architecture design was imperative to safeguard its market position and help grow the business as part of the company’s digital transformation.

In brief

Challenge 

The company’s existing IT solution was holding back innovation and production was not as efficient as it should be. Competitors were incorporating new functions and services in their products and deploying them much faster. As the market leader, the company had to act to secure their position, grow sales and modernize the entire product lifecycle.

Solution

The new architecture design connects the entire ecosystem – from enterprise resource planning to user app control – in a secure multi-cloud environment spanning AWS, Azure and Alibaba Cloud. It supports device management, online payments, real-time sensor monitoring, and advanced Business Intelligence tools.

How we did it

We worked closely with senior management at the company to define a digital roadmap. We designed the cloud architecture to support this and developed firmware in parallel to avoid delays in delivery.

Benefits

User data is now aggregated and analyzed to guide product development and improve the user experience through better app features and proactive air controls. Automated diagnostics enable issues to be fixed quickly online. Data security and integrity are assured. Additionally, manufacturing is more efficient, as automated processes have been incorporated into production.

About the client

The company is a global manufacturer and provider of air filter solutions for commercial and homeowner use.

Data-driven architecture design – from mobile apps and enterprise resource planning systems to data lakes and protocol adapters

A leading IT environment for a market leading company

The company is the leading provider of clean air products, and its solutions are renowned for their ability to filter air and provide protection against harmful pollutants. The company’s engineers are highly skilled in developing products that do this silently and effectively. However, they did not have the inhouse competence to develop an IT environment that could fully utilize the opportunities offered by digitalization. Data Ductus was tasked with analyzing the company’s existing IT platform and designing a new, futureproof architecture that would put the company’s IT on a par with its market-leading position.

“We were approached by senior management to look holistically at how IT is used today, analyze the potential opportunities IT can deliver, and develop a roadmap for the coming years. This included everything from giving users a first-class digital experience when using the product, to assessing how changes could be implemented in production, to creating a secure environment for sensitive user data.”

Mario Toffia, Data Ductus Senior Architect

A culture of user-centricity

The assignment was run from a user-centric perspective. Providing clean air to businesses and consumers – in a simple and service-minded way – was the driving force of the cloud architecture design (see figure below). This meant developing a simple onboarding process for end customers and identifying intelligent monitoring and analytics tools and processes to gain a better understanding of users, support predictive online maintenance and guide new product/feature development. Additionally, it covered the lifecycle of the product, from the supply of parts through to remote maintenance.

Providing users with clean air and a great user experience was at the center of the new digital roadmap, from which the different key elements were addressed. *Enterprise Resource Planning

Holistic and fine-grained architecture

The team from Data Ductus – who worked closely with the client’s tech team throughout the assignment – consisted of a Senior Solution Architect and a small development team with cloud, IoT and firmware expertise. The cloud architecture design included overarching plans for how IT can support the business, all the way down to specifying protocol adapters, sensors, API gateways, identity management tools and much more (see figure below). The architecture design and specifications had to be comprehensive and intuitive enough to allow third-party utilization or, indeed, for another provider to deploy the service. Development of cloud functionality, APIs and firmware was done iteratively and in parallel with the architecture design, so as not to slow the project down. This included everything from system development and testing, to device monitoring and control, to configuration and event management.

The comprehensive and intuitive architecture overview was derived from a multi-cloud infrastructure.

New product line

Due to the limited IT capabilities of existing products, the decision was taken to build a completely new “connected” line of products and an IoT platform to support them. This enabled monitoring of local air quality 24/7 from which push notifications could be sent to users and/or purifiers. With the new product line, much more data can be captured, providing more fine-grained data on user activity, such as why and how users adjust fan speed and the way in which they interact with the app. Regional user patterns can also be identified and analyzed.

Push notifications are used to engage users in the app to nudge them into using more product features and ultimately become more receptive to in-app campaigns.

Cutting-edge, cost-effective data management 

“The system was designed to use two types of databases, one for events – since they are partially unstructured – and a time series database to capture telemetry bound to time, such as when air particles reach PM2.5 – the point at which a person’s health can be negatively impacted,” explains Toffia. “This provides a cost-effective way to run algorithms, and store, analyze and present data.”

The system design supports a configuration-driven approach in which telemetry and events are pushed to the cloud. Fast analytics and machine learning tracks run in real time using Kinesis streaming analytics. Batch processing is implemented via a data lake: data produced in the IoT cloud is exported to the data lake every 15 minutes to reduce load and cost. The data is then analyzed and normalized using BI analytics and charting tools (including dashboards). Time-constrained data is archived in slow-to-retrieve, low-cost storage.
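As a minimal sketch of the streaming track described above, assuming an AWS setup, the snippet below pushes one telemetry sample to a Kinesis stream with boto3. The stream name, device ID and payload fields are illustrative, not the client’s actual schema.

```python
# Minimal sketch of the telemetry path; stream name and payload fields are
# assumptions, not the client's real schema.
import json
import time
import boto3

kinesis = boto3.client("kinesis")

def push_telemetry(device_id: str, pm25: float, fan_speed: int):
    """Push one telemetry sample to the streaming-analytics track."""
    record = {
        "device_id": device_id,
        "timestamp": int(time.time()),
        "pm25": pm25,              # particulate reading bound to time
        "fan_speed": fan_speed,
    }
    kinesis.put_record(
        StreamName="air-telemetry",          # assumed stream name
        Data=json.dumps(record).encode(),
        PartitionKey=device_id,              # keeps a device's events ordered
    )
```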

Security

Security was paramount to the assignment. With much more user data being stored and analyzed, data integrity and compliance had to keep pace with the new capabilities. This challenge was compounded by the fact that users are located around the world, where differing regulations must be adhered to. To ensure customer data remains secure in the cloud – in accordance with strict regulations – advanced certification, encryption and firewall services were incorporated.

Smoother and more secure production

The company wanted to open up its production processes to support a broader supply chain of air filters and purifiers, as well as create a smoother flow between operations and production. To support this, enterprise resource planning systems were connected, and device provisioning was created for improved configuration and security. This system enables the secure production of units while preventing design theft and the production of pirated filters and devices. This is central to the company’s multi-supplier model for filters, fans and other parts.

In conclusion

“By working closely with senior management and developers at the air filter company, we were able to achieve a great deal before the project was handed over” continues Toffia. “It was great to collaborate with an organization that is working to provide a better future for people through the air they breathe. The company was focused on end customers and ensuring they get the most out of air purifying solutions, but the IT structure didn’t match this. Not all companies have got so far in the journey of understanding that their IT system can be the key to delivering on customer expectations – but they had,” he concludes.

A digital bank wanted expert support for identity management from a trusted service provider that could deliver secure authentication. Our Identity as a Service solution, IaaS, with its proactive service desk proved ideal as it enabled the bank to focus on core IT activities.

In brief

Challenge 

The bank wanted to secure external expertise to support, update and manage gateways, firewalls, and its Curity Identity Server. This would enable it to focus on creating revenue-driving services rather than the upkeep of essential authentication functions.

Solution

Identity as a Service includes automated releases, deployments and workflows for best practice deployment and management. Proactive service desk support ensures systems run smoothly and securely.

How we did it

Management of the firewalls, gateways and the Curity Identity Server was transferred to an automated environment, according to best practices. This was done by the same 2nd and 3rd line development team that is now responsible for operations.

Benefits

Product updates are deployed regularly and on time, patches are deployed to thwart potential breaches, additional expert resources can be called upon if needed, and the bank has access to Data Ductus Curity/IAM expertise and support.

About the client

A Swedish digital bank that provides customer-centric services to organizations and individuals across the Nordic region. The listed company has a strong focus on data science.

Managing and automating authentication services at a leading consumer bank

Maintaining high availability while keeping an authentication solution updated and secure is a big challenge for all highly regulated organizations. Regular upgrades and patches are critical for trouble-free customer login and to mitigate breaches. However, with so many other IT considerations to take care of, authentication solutions can be pushed down the priority list. The bank wanted to ensure authentication and authorization received the same high priority as its other IT requirements. The most effective way to do this is through a partner who can automate many of the authentication processes and provide a professional service desk function. Data Ductus was identified as the best company to do this through its Identity as a Service offering.

Identity as a Service (IaaS) 

IaaS was developed to support medium to large organizations in regulated industries with their identity platform needs. Typically, customers have identified their preferred solution or are already using it, but they don’t have the technical competence or bandwidth inhouse to fully manage this business-critical service.

The Curity Identity Server 

Data Ductus has been a Curity partner for over six years, supporting organizations and enterprises that depend on strict API security within energy, banking, retail and communication. According to Stefan Nilsson at Curity, “Several customers have approached us and asked for support when internal changes have left them without the resources to effectively manage the Curity product. Customers remain happy with the platform, but they no longer have the inhouse expertise. In such cases, transferring responsibility to a certified partner, such as Data Ductus, makes complete sense.”

Taking identity to a new level 

The bank had been using the Curity Identity Server for three years when they approached Data Ductus. The solution worked smoothly, and they had a well-organized developer organization to manage it. However, they wanted to use more of their internal resources for developing new services. Additionally, they didn’t want to be dependent on inhouse identity expertise, and therefore decided to secure external competence through IaaS.

Secure authentication

Together, experts from the bank and Data Ductus began setting up IaaS. The transition was finalized within the bank’s infrastructure three months later. The secure authentication service includes proactive maintenance and support to maintain security, identify and resolve potential issues before they escalate, and handle incidents quickly.

“Whenever we begin working with a new customer, we carry out a thorough analysis of their identity needs,” explains Per-Gustaf Stenberg, Solution Architect at Data Ductus. “We also identify which workflows and processes can be automated to improve operations, and implement technical and process best practices as standard. Additionally, measurable and achievable SLAs are defined and agreed upon, and clear lines of communication are set up to ensure a transparent and effective collaboration.”

Lifecycle management 

IaaS for the bank comes with lifecycle management – including hosting, support and license management. The dedicated service desk includes 2nd and 3rd line support for day-to-day management, ticket handling, new releases, and special tech support cases. Monitoring of dashboards and logs is automated. This includes generation of incident reports with actionable items. For full transparency, the bank has access to dashboards, tickets and response data.

According to Joacim Claesson, Service Account Manager at Data Ductus, “Good collaboration is central to our relationship with the bank. Updates, configuration changes, patches and artifacts are automatically deployed on the staging site for testing before final approval by the team at the bank. This level of security is a requirement for a bank, but so too is the contact between us. They need to know that we’re reacting to potential threats by updating the system, and we need to know they are available to approve them. Additionally, we develop the required artifacts for security purposes.”

Artifacts can be developed by our team or developers at the bank. Delivery and deployment to the staging site have been automated.

Wide ranging benefits

The IaaS contract includes out-of-office hours service desk support, something that the bank didn’t have before. Additional benefits for the bank include:

  • Access to security experts with extensive authentication and Curity Identity Server experience 
  • New automated workflows  
  • More time and resources to focus on developing new services 
  • Fully transparent collaboration

The security team at the bank don’t have to worry about authentication anymore. No matter which device a customer logs in from they can be assured that their identity platform will handle the authentication and that customers will be able to access their accounts – a service customers rightly take for granted. Essentially, everything works as well as it used to, but the whole process is much more efficient and updates are made more regularly.

Anders Essner, Business Manager at Data Ductus.

How can we help you? 

Do you need help with authentication or any other security services? Get in touch and find out how we can help you.

The client is a leading provider of medical equipment serving hospitals, clinics, and laboratories within diagnostics, in both human and veterinary hematology. Their products are sold in over 100 countries and the company is listed on NASDAQ.

In brief

Client’s business challenge 

To reduce maintenance costs and provide higher value for their end customers by means of connecting diagnostic equipment to the cloud.

How we did it

We worked closely with the client’s internal medical software experts and provided technical guidance on how to architect a cloud-based platform for managing a vast number of medical devices across the globe.

Our solution

A requirements specification for a healthcare-grade IoT cloud solution, compliant with regulatory frameworks for patient, software and hardware lifecycle management (HIPAA, IEC 62304, ISO 14971).

Benefits

We bridged the gap between the client’s internal expertise in medical software and the cloud technologies available. The resulting requirements specification will enable the client to take the next step towards digitalizing their products.

We provided an R&D-as-a-Service solution to Easymeet to develop their meeting solution into a secure, cloud-based method of holding congresses, and union and shareholder meetings – where voting is critical.

In brief

Challenge 

Transform the current on-site delivery model for congresses to a cloud-based service.

Solution

Easymeet’s SaaS enables the users to manage their digital meetings online and includes voting, document management, attendance control, video conferencing, and much more.

How we did it

The Data Ductus R&D-as-a-Service team created the groundbreaking functionality in Easymeet. For example, Cisco Webex video conferencing was integrated into the solution.

Benefits

Easymeet’s customers can now create and manage congresses as well as other formal and non-formal meetings remotely with less effort and less environmental impact. Video conferencing and digital meeting support is now easier than ever.

About the client

Easymeet is the leading provider of digital meetings for unions and political parties on the Nordic market. The company serves the main unions and political parties with digital solutions for congresses and other formal meetings.

Taking on-site services to the cloud

Easymeet’s original business model for congresses was based on on-site delivery. Deploying everything from high-quality Wi-Fi to tablets, servers and a high-class digital support system on site, Easymeet acts as a one-stop shop for formal meetings. However, the demand for online availability required new technical solutions and business models.

Securing live cast sensitive data remotely

Easymeet Online is delivered as Software as a Service, enabling customers to manage their digital meetings online by themselves. The solution supports managing the meeting agenda, attendance control, voting, document management, and video conferencing. Easymeet Online leverages cloud technology for flexibility and availability. 

R&D as a Service

We worked closely together with Easymeet to define requirements and the general architecture of Easymeet Online. The close collaboration continued throughout the development phase where Data Ductus R&D-as-a-Service was utilized.

“Data Ductus helped us create the groundbreaking functionality that can take our services to a new level and also enable deliveries to new customer segments,” says Frans Eklund, CEO of Easymeet.

Our Cisco Webex experts integrated video conferencing into the solution. We also assisted Easymeet with deployment to the cloud and maintenance of the solution. “This is a good example of when our 360° perspective on software development – in the form of R&D as a Service – is utilized,” says Carl Grönbladh, site manager at Data Ductus.

Remote meetings made easy

Easymeet’s customers can now create and manage their meetings without assistance. Functionality that was previously only available on-site during larger conferences can now be used for all meetings. Having a complete solution for video conferencing and meeting support gives customers a new way to arrange meetings. Conducting meetings remotely can also save time and effort, as well as lower the environmental impact.

More information about Easymeet Online is available at the following link: https://www.easymeet.se/easymeet-online/ 

As a leading provider of clinical trial software, PCG Solutions needed to extend their software capabilities. We provide medical software development that meets strict regulations, working as a core part of their R&D team.

In brief

Challenge

In this highly regulated market, PCG Solutions needed to extend their software development capabilities to strengthen the R&D department and ensure that software meets medical technology regulations.

Solution

Data Ductus provides medical software engineering services in the form of a fully functional core development team. With a long-term focus, the team forms a key part of onsite PCG R&D.

How we did it

Our skilled technical software consultants develop IT solutions based on product strategy and requirements. These comply with strict regulatory requirements such as IEC 62304, IEC 60601 and ISO 13485.

Benefits

With this long-term partnership, PCG Solutions can focus on their core business of cloud-based EDC technology for clinical trials and rely on Data Ductus to develop regulatory-compliant products.

About the client

PCG Solutions is a global pioneer in cloud-based Electronic Data Capture (EDC) technology. The company develops and markets software for clinical trials used by customers such as Pfizer, Novartis and Ipsen.

A large percentage of the world’s blood tests are done using the company’s laboratory equipment. Our medical software engineering team are instrumental in the development of the company’s QIMS system for the quality control of their immunodiagnostic production.

In brief

Challenge 

Update and take over the management of the inhouse developed QIMS system. Run DevOps, including development of new functionality in accordance with standards such as IEC 62304 and FDA Title 21 CFR Part 11 regulations.

Solution

Our Application Lifecycle Management Service for medical devices provides an experienced team to develop and maintain QIMS, while handling any software issues from a global user base.

How we did it

We analyzed the company’s existing and future needs. A core medical software engineering team with relevant regulatory and standards competence was assembled, with the option to bring in other technology expertise when needed.

Benefits

The company’s IT costs have been reduced. QIMS updates and new releases are launched faster and issues are resolved quickly. Additionally, with the transition to a more robust platform, new capabilities can be introduced.

About the client

The world’s leading provider of immunodiagnostic solutions offers blood test systems to support clinical diagnosis and monitoring of allergies, asthma and autoimmune diseases.   

Medical software engineering for leading life science company    

A large percentage of the world’s blood tests are done on the life science company’s laboratory equipment. A single machine can handle up to 40,000 tests a day. The company’s inhouse software development team developed its own laboratory information management system (LIMS). This is a core monitoring and quality assurance component of the business offering, handling many of the internal workflows for processing tests and evaluating results.

As technology has advanced and regulations have become more complex, the life science company has had to allocate more and more resources to IT – often from multiple external consulting companies. In order to focus on its core competence of immunodiagnostics, the company began a search for a new long-term medical software engineering partner to develop and manage their QIMS system, which ensures the quality control of their immunodiagnostic production.

A highly regulated market  

In the highly regulated healthcare industry, companies can be severely penalized when software isn’t up to standard. Therefore, choosing the right medical software engineering partner is business critical. They opted for Data Ductus’s Application Lifecycle Management Service for Life Sciences, which included administration, implementation and development of QIMS, as well as updating and streamlining the service within the regulatory framework.

A dedicated team of medical software engineers 

With the parameters set, an extensive analysis of existing needs as well as the company’s IT roadmap for the coming years was undertaken. A dedicated team of medical software engineers with relevant experience of IEC 62366, IEC 62304, ISO 14971, ISO 13485, MDD 93/42/EEC (now replaced by 2017/745) and FDA Title 21 CFR Part 11 was assigned to the case, with the option of bringing in other domain and/or technology expertise when needed.

To ensure a smooth transition, the Data Ductus team spent three months onsite with the existing QIMS team before assuming responsibility for the product. This included development, management and handling of software issues from the global user base. Since transition, Data Ductus has delivered four new releases into production.

Lower Costs and Faster Releases 

The tight collaboration between the product owner and the QIMS team at Data Ductus ensures any third-line support issues are quickly resolved. Data Ductus software developers and the company’s Product Development team are also aligned when it comes to updates, new versions and the general direction of QIMS. Releases are now delivered on schedule, while maintenance costs remain within budget. Additionally, whenever new capabilities require additional competencies, specialist Data Ductus medical software engineers can be drafted into the process to ensure a smooth delivery.

“Since we took over QIMS, the life science company’s IT costs have been reduced, launch times are quicker and software issues are resolved faster,” says Amin Gholiha, Business Developer, Data Ductus. “They can now concentrate on what they’re best at, immunodiagnostics, and we handle IT, which is our core expertise.”

Analysis of iron ore phases and microstructures used to be done manually, even at the world’s largest iron ore producers. We collaborated with LKAB to automate this analysis.

In brief

Challenge 

Automate the inspection process of 12 mm iron ore pellets to identify mineral content, distribution of additives, and porosity levels.

Solution

Microscopes equipped with high-resolution digital cameras photograph the pellets. The images are automatically analyzed, and the results are visualized in graph form.

How we did it

We trained a machine learning pixel classifier to classify the images and built a system to quantify relevant pellet image data and visualize the results.

Benefits

The R&D team can now dedicate their time to analyzing results rather than carrying out manual pellet inspection. Additionally, pellet analysis is now standardized.

About the client

LKAB is Europe’s largest iron ore producer, with 4,200 employees in 13 countries. The company, which is wholly owned by the Swedish state, is headquartered in Luleå, northern Sweden.

Machine learning: Automating iron pellet analysis for the LKAB mining and processing company   

LKAB’s core business is the mining and processing of iron ore for the steel industry. Over the years, LKAB has developed a unique market offering for its customers, namely blast furnace pellets. These have become LKAB’s most vital product, with a production capacity of 28 million tons per annum at its plants in northern Sweden. The main benefit of using pellets over standard iron ore products at steel mills is lower furnace energy requirements. An additional benefit is that the pellets contain extra minerals, such as olivine, which provide improved high-temperature properties.

Pellet inspection  

The R&D team at LKAB are continuously searching for improved ways to produce more effective as well as customized furnace pellets. Part of this process includes manually inspecting the formation of different iron oxide phases and microstructures in sample pellets. This is a time-consuming process that must be done by experts, who examine sample pellets, embedded in epoxy, through an optical microscope. Some minerals, such as magnetite and hematite, can be identified by color and intensity, while others are identified by surface texture. This process lends itself perfectly to machine learning.

Automating inspection 

Our goal was to automate this process. The first step was to create a dataset of high-quality microscopy images of representative iron ore pellets. Experts at LKAB provided us with sample pellets embedded in epoxy and access to automated microscopes – each equipped with high-resolution digital cameras – at their facility. 

Creating an image library 

High-magnification images of the sample pellets were acquired, and multiple images were pieced together into large mosaics to create a complete view of each pellet. Depending on the size of the pellet, the images ranged between 500 and 900 megapixels.

Training a pixel classifier 

Once the image library was created, the images were annotated in close collaboration with analysis experts at LKAB. Regions from each class were marked in a set of images and used to train the pixel classifier. The classifier was then used to classify the images, and the annotation was improved by correcting the classification. This was repeated until a satisfactory classification was achieved. The classifier was then evaluated on a set of images that were not included in the training process.
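A simplified sketch of such a training step is shown below. The project’s actual feature set and model are not described here, so a scikit-learn random forest on per-pixel color features is an assumption made purely for illustration.

```python
# Simplified sketch of a pixel classifier; the feature set and model choice
# are assumptions for illustration, not the project's actual implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def pixel_features(image: np.ndarray) -> np.ndarray:
    """Stack simple per-pixel features (raw channel intensities) into rows."""
    return image.reshape(-1, image.shape[-1]).astype(np.float32)

def train_pixel_classifier(images, annotation_masks):
    """Train on annotated pixels only (mask value 0 = unlabeled)."""
    X, y = [], []
    for img, mask in zip(images, annotation_masks):
        labeled = mask.reshape(-1) > 0
        X.append(pixel_features(img)[labeled])
        y.append(mask.reshape(-1)[labeled])
    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
    clf.fit(np.concatenate(X), np.concatenate(y))
    return clf

def classify_image(clf, image):
    """Predict a class label for every pixel and reshape back to image size."""
    return clf.predict(pixel_features(image)).reshape(image.shape[:2])
```

The correct-and-retrain loop described above then amounts to fixing the worst misclassified regions in the masks and calling the training function again.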

Intuitive pellet analysis 

The last step was to extract and quantify relevant information from the classified images and create a visual format that could be easily interpreted by all members of the pellet analysis team at LKAB. The result was a series of graphs depicting a mineral map, the microstructure and the mineral content of a pellet. 
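For example, the per-class area fractions that feed such graphs could be computed along the following lines; the class labels are examples, not LKAB’s exact set.

```python
# Illustrative only: computes per-class area fractions for the content graphs;
# the class names below are placeholders.
import numpy as np

CLASS_NAMES = {1: "magnetite", 2: "hematite", 3: "additive", 4: "pore"}

def class_fractions(classified: np.ndarray, pellet_mask: np.ndarray) -> dict:
    """Share of the pellet cross-section occupied by each class."""
    pellet_pixels = pellet_mask.sum()
    return {
        name: float(((classified == label) & pellet_mask).sum()) / pellet_pixels
        for label, name in CLASS_NAMES.items()
    }
```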

Thanks to the automated mapping of sample pellets, experts can now dedicate their time to analyzing the results rather than inspecting the pellets. This has sped up the analysis process considerably. Furthermore, analysis is no longer based on one person’s viewpoint; it is now standardized across all pellet samples, based on the collective experience and advice of the R&D team.

Optimizing pellet properties 

The automated and quantitative characterization of microstructures can be used to generate the necessary data to better understand the effects of pellet additives and the different process parameters. Additionally, it is now possible to automate a quantitative study of the reaction mechanisms in the blast furnace based on pellet microstructures. With this knowledge, pellet properties can be optimized for different customer applications.  

“Thanks to a close collaboration with Data Ductus, we were able to define the machine learning project and move through development to implementation quickly and smoothly,” says Johan Sandberg, Section Manager, Process & Product Development at LKAB. “Automating the inspection of pellets has freed up time for the R&D team to focus on more valuable and interesting work.”

LKAB is in a massive transition phase to achieve its goal of making its mines carbon-dioxide-free, digitalized and autonomous. To support this, we developed a global ITIL Service Desk with an integrated CMDB.

In brief

Challenge 

Support this transformation by creating a Service Desk as a single point of contact to provide business-enabling IT support for the home market and global divisions. This includes enabling automated help desk services and improving vendor performance and response time.

Solution

An on-premises ITIL Service Desk with an integrated CMDB that supports continuous automation through tight integration with core infrastructure services. This broadens the range of issues that can be fixed with minimal effort and enables preventive actions.

How we did it

We worked closely with the IT department at LKAB to define their new Service Desk requirements in accordance with the LKAB digitalization effort. We then replaced the existing legacy systems with an agile and centralized solution for the entire enterprise.

Benefits

LKAB now has an onsite Service Desk that is aligned with corporate digitalization goals. Support cases are not only resolved quickly, but a platform has also been established for continuous improvement and automation.

About the client

LKAB is Europe’s largest iron ore producer, with 4,200 employees in 13 countries. The company, which is wholly owned by the Swedish state, has launched the largest industrial innovation effort in the Nordics, with the goal of making its mines carbon-dioxide-free, digitalized and autonomous.

The trend of moving IT service and support offshore is being reversed. Service Desks are returning on-premises in order to bring domain expertise back into IT service and support. One important reason is that modern IT organizations consume services from many service providers and vendors, so a single point of contact to coordinate all these services becomes increasingly important. Service Desk specialist providers manage and monitor the organization’s multiple vendor services and agreements, and thereby enable continuous improvement.

A business enabling Service Desk

LKAB wanted a new type of Service Desk – one that could look beyond the traditional remit of simply resolving issues and also function as a proactive business enabler, in line with the company’s digital transformation. An immediate challenge was to improve control over multi-vendor performance and the reporting of Service Level Agreements (SLAs). LKAB required a solution that would remove the silos of vendor responsibility to ensure all IT queries were handled promptly. The company’s existing IT Service Desk could not provide the required transparency or cater for the Service Desk needs of the different divisions around the world. LKAB, however, did not want to take on the risk of building and staffing a new Service Desk to meet their requirements. In 2014 they issued a public tender for Service Desk Management.

A new user-centric ITIL Service Desk

We approached the challenge from the viewpoint of future needs, i.e., the mining company’s processes and procedures and the key Service Desk requirements. This resulted in:

  • A fully manned on-premises Service Desk that handles support via phone, e-mail or digital interface.
  • An agile service that supports infrastructure, vendor and SLA changes. Fully-automated SLA reporting.
  • Full compliancy with existing and soon to be enforced regulations, such as GDPR.
  • One Service Desk for all markets and divisions.

In 2015 we launched the new ITIL Service Desk. In total, eight people staff the desk, which is open between 6am and 6pm. All services, manuals and interfaces are available in Swedish or English.

Fully transparent Service Desk operations

LKAB now has full transparency over the Service Desk activities of its IT vendors. All SLA reports are stored securely and can be accessed by authorized LKAB staff. New SLAs can be easily added to cope with changing conditions and vendor contracts. Additionally, data such as Standard Operation Procedures (SOPs), troubleshooting guides, knowledge articles and previous query requests have all been transferred from the legacy system to the new platform. And as for vendors, they have more time to focus on high priority projects as the Service Desk team handles most routine enquiries. Furthermore, vendors also now share more information with staff at LKAB and with other vendor partners to support a more transparent process.

What the customer says 

“One year after deployment we carried out an in-depth evaluation of the new Service Desk. LKAB employees around the world were asked to rank the performance of Data Ductus according to a number of key criteria including response times and resolution times, and feedback to end users regarding progress. The Service Desk team received very good scores. Essentially, people are very happy with the new service. Data Ductus is a transparent company, providing us with a transparent service, which is exactly what we wanted.”
Head of IT-Operations Rory Wikman, LKAB, Kiruna.  

How can we help you?

Do you need help with transforming your IT Service Desk? Learn more about what we offer within Service Desk and IT Service Management and get in touch today!

Photo by Kiruna kommun

Self-hosted data centers at the Church of Sweden were operated by multiple partners. As part of their digital transformation, we planned and carried out a seamless data center migration in a detailed and coordinated four-step process.

In brief

Challenge 

Seamlessly migrate self-hosted data centers – operated by multiple outsourcing partners – to a new Infrastructure as a Service (IaaS) model, without interrupting operational services or exceeding budget.

Solution

Together with Boston-based Transitional Data Services, we ran a controlled transition and transformation of thousands of IT components, applications, databases and complex integrations, in four migration events.

How we did it

After an inventory of the IT infrastructure and an analysis of the dependencies, we created a detailed transition plan. Teams – on the “old” and “new” sites – followed a coordinated task list during each migration phase to ensure smooth transition.

Benefits

During transition, there was minimal or no interruption to IT services and business processes. Time frames and budget limits were kept. Additionally, IT governance has now been greatly improved due to the mapping of dependencies.

About the client

The Church of Sweden is the country’s largest religious community. It has more than 3,000 churches, 1,300+ parishes and 13 dioceses. With over 6 million members, the Church of Sweden has an annual budget of around USD 1.4 billion.

With more than six million members, the Church of Sweden is the country’s largest religious denomination. Due to its long history as a former state church and the duties that came with that, such as managing the national resident registry, the organization runs a large IT infrastructure. The Church of Sweden still has obligations mandated by legislators which require a complex set of IT services.

The complexities of cloud transition

Back in 2017, The Church of Sweden had contracted a new Infrastructure as a Service (IaaS) provider to replace its self-hosted data centers. These were hosted in multiple locations and operated by outsourcing partners on behalf of the Church. However, the transition turned out to be far more complex than originally anticipated. This resulted in the original project plan being rejected due to the risk of service interruptions and failures during the transition itself.

Creating a migration roadmap

At this point Data Ductus was contacted to facilitate the move. Together with our partner, Boston-based Transitional Data Services (TDS), we created a project plan based on TDS best practices and tools. We carried out interviews with subject matter experts and system owners to establish a complete view of the complex IT infrastructure(s) impacted by the move. We then made a complete inventory of IT components and a thorough analysis of their dependencies, before planning the necessary transformation. At this point we identified the need for four separate migration events to minimize the impact of the transition on the organization.

A smooth transition

With work packages prepared for the four separate move events, we placed teams at both ends – the “old” and the “new” sites – during the migration to facilitate a smooth transition. With complete and coordinated task lists for all staff members involved, the transition was executed in a few hours without any interruptions, despite all the interdependencies. Furthermore, the entire procedure was completed within budget and on time.

Improved IT governance

The Church of Sweden also gained additional value from the project in the form of improved control and IT governance. The dependency analysis and the separation of concerns enabled by the transition, allowed them to plan a more efficient and secure operational target environment. All data sources are now securely separated through efficient network segmentation and information owners can rely on the data in their updated CMDB.

How can we help you?

Do you need help with your IT infrastructure transition project? Get in touch today!

Water seepage is an issue for all mines – even disused ones. Boliden used an environmental and water management IoT solution from Data Ductus to model and analyze water flows.

In brief

Challenge 

Identify and measure groundwater seepage and consequent water flows from waste at a decommissioned Boliden gold mine and provide accurate real-time data from the isolated site.

Solution

2D/3D models and a geological map of the site’s rock formation and underground water flows were developed. A custom-built IoT solution measured flow rates and contamination levels and transmitted the data to a central web portal for analysis.

How we did it

We took Electrical Resistivity Tomography measurements of the site and designed and built a robust IoT system to remotely measure the water. We provided a secure portal and worked closely with Boliden to analyze the data.

Benefits

New underground water flows were identified and could be monitored remotely, as opposed to taking manual samples. Boliden has now taken the necessary steps to ensure the old mine rock waste is no longer a potential environmental hazard.

About the client

Boliden is a high-tech metals and mining company with over 5,000 employees throughout Europe. With its origin and part of its operations in Boliden, Skellefteå, northern Sweden, the company has been mining and smelting metals for close to 100 years.

Challenge 

Identify and measure groundwater seepage and consequent water flows from waste at a decommissioned Boliden gold mine that – due to seepage from the oxidizing waste rock – has been listed as a potential hazard to the surrounding environment. Provide accurate real-time data from the site, which is located in the middle of a forest, has poor cellular coverage, and experiences extreme weather conditions ranging from -30°C to +30°C.

Solution 

Advanced 2D/3D models of the site and surrounding area were developed to provide a full geological map of the rock formation and underground water flows. A custom solution based on Aqua Ductus’ battery-driven IoT water monitoring equipment was installed in run-off streams. These measure flow rates and contamination levels and send the data to a central web portal for analysis. 


How we did it 

Firstly, we took Electrical Resistivity Tomography measurements of the entire site with specialist equipment in order to generate the 2D/3D maps. We then designed, built and tested a system that could take measurements of the water and send them remotely, while withstanding the harsh environmental conditions. Finally, we provided a secure portal for Boliden to access the transmitted data. We also worked closely with Boliden at every stage of the process to analyze the data and decide on the best actions to take.
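A hedged sketch of what a single telemetry upload from one monitoring station might look like is shown below; the portal endpoint, authentication scheme and field names are all assumptions, not the deployed system’s actual interface.

```python
# Hedged sketch only: endpoint, authentication and field names are assumptions.
import time
import requests

PORTAL_URL = "https://portal.example.com/api/measurements"   # placeholder

def report_measurement(station_id: str, flow_l_per_s: float,
                       conductivity_us_cm: float, api_key: str):
    """Send one water measurement to the central web portal."""
    payload = {
        "station": station_id,
        "timestamp": int(time.time()),
        "flow_l_per_s": flow_l_per_s,
        "conductivity_us_cm": conductivity_us_cm,  # proxy for contamination
    }
    resp = requests.post(
        PORTAL_URL,
        json=payload,
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
```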

Benefits 

Thanks to the 2D/3D geological measurements, Boliden was able to identify new underground water flows. They were then able to remotely monitor surface water run-off as opposed to making manual flow measurements. Based on this and other data, Boliden has been able to take the necessary measures to ensure the old mine rock waste is no longer a potential hazard to the environment.

“We approached Data Ductus with a problem and they came back with a way of solving it. That’s exactly what you want from a partner, they don’t just roll out an off-the-shelf solution unless that’s exactly what you need. They create what’s best for you both at that particular moment in time and for the long-term.”  

Andreas Vallmark, 
Development Engineer, HR and Sustainability
Boliden 

As part of the vision to digitalize its industrial timber kilns, we worked closely with Valutec to develop a smart, long-term industrial IoT solution for harsh environments.

In brief

Challenge 

Assist Valutec in the implementation of a long-term smart control system for industrial kilns that is easy-to-use, can withstand the harsh kiln environment, and can deliver optimal timber drying conditions in accordance with different wood types and drying routines.

Solution

A robust hardware solution with a Programmable Logic Controller (PLC) connected to temperature sensors, heat coils, fans, etc. A PC monitors and controls all PLCs. The PLCs and the PC are connected via a PROFIBUS or PROFINET network. The PC software uses the OPC (Open Platform Communications) standard to communicate with the PLCs.
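As a rough illustration of the PC-to-PLC communication, the sketch below reads a temperature tag and writes a fan setpoint over OPC UA using the python-opcua library. Whether the actual system uses classic OPC or OPC UA is not stated here, and the endpoint address and node IDs are placeholders.

```python
# Sketch only: assumes an OPC UA endpoint; tag names and address are placeholders.
from opcua import Client

KILN_ENDPOINT = "opc.tcp://kiln-plc.local:4840"   # placeholder address

def read_and_adjust(target_temp_c: float):
    """Read the kiln temperature and nudge the fan setpoint toward a target."""
    client = Client(KILN_ENDPOINT)
    client.connect()
    try:
        temp_node = client.get_node("ns=2;s=Kiln1.DryBulbTemp")   # assumed tag
        fan_node = client.get_node("ns=2;s=Kiln1.FanSetpoint")    # assumed tag
        current = temp_node.get_value()
        # Crude proportional adjustment, clamped to 0-100 percent fan speed
        fan_node.set_value(max(0.0, min(100.0, 50.0 + (current - target_temp_c) * 2.0)))
    finally:
        client.disconnect()
```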

How we did it

We worked closely with Valutec’s in-house team, utilizing several open source and commercial add-ons from Valutec partners such as Siemens. A test system was used as a beta site to perform incremental updates and evaluate the results before solutions were deployed at live kiln sites.

Benefits

Thanks to the simple and intuitive user interface, a kiln can be configured locally to fit many different kiln types and suit individual wood drying requirements and processes. The system uses simulation software to fine-tune its operational settings in order to deliver optimal wood-drying results with minimal energy usage.

About the client

Valutec is one of Europe’s leading suppliers of industrial timber kilns. With operations in Sweden, Finland, Russia and Canada, the company has annual sales of around 30M USD and has delivered over 4,000 kilns to the market.

Haninge municipality wanted to outsource their ITIL Service Desk and SIAM services to a reliable and proven provider. Our team runs the in-house service desk, while SIAM services deliver an effective multi-sourcing strategy.

In brief

Challenge 

Design and run an in-house ITIL Service Desk that provides IT support and advice, as well as Service Integration and Management (SIAM) services, and in doing so support Haninge municipality in their aim of establishing a successful multi-sourcing environment.

Solution

A governance function based on the SIAM model and an on-premises central ITIL Service Desk – manned by Data Ductus staff – integrate and coordinate the services delivered by multiple service providers.

How we did it

Following interviews with stakeholders, we defined new administrative procedures and introduced a governance model for central IT-infrastructure and communications suppliers. We then implemented and configured the ITSM suite to meet the new processes before integrating it with other central business applications.

Benefits

With an effective SIAM solution in place, Haninge municipality is now operating a multi-sourcing strategy. The result is a team of unified specialist providers that work towards predefined SLAs to improve IT delivery and drive innovation across the organization.

About the client

Haninge municipality is located on the outskirts of Stockholm, Sweden, and is home to around 86,000 people. It is responsible for education; social care; urban planning; public elections; recreation and leisure; and more, in the region.

Read more about this case here (in Swedish).

With decentralized IT across 1300 parishes, the Church of Sweden needed to update and standardize its IT. We planned and ran one of Sweden's largest IT infrastructure consolidation projects, installing a standardized private-cloud virtual desktop with two layers of services at virtually every parish.

In brief

Challenge 

Develop a nationwide IT solution that cuts IT costs and simplifies local administrative duties, allowing church officials to focus on their core activities. Create a simple and effective, step-wise platform migration process for parishes.

Solution

A private-cloud virtual desktop with two layers of services. The first is a shared IT platform running the preferred office/admin suite of each parish. The second offers standardized bookkeeping/HR software. New services can be easily added on the Citrix-based platform.

How we did it

We carried out a pre-study of The Church of Sweden’s processes and requirements, before designing and developing the solution. Once completed, we visited individual parishes, assessed their existing IT, and helped them get connected to their new virtualized desktops.

Benefits

IT at connected parishes is now up to date and in compliance with regulations. Centralized purchasing units and Help Desks have reduced IT costs. Files can be easily shared between parishes and data is securely stored in a private cloud.

About the client

The Church of Sweden is the country's largest religious community. It has more than 3000 churches, 1300+ parishes and 13 dioceses. With over 6 million members, The Church of Sweden has an annual budget of around 1.4 billion USD.

A step-wise IT migration

The Church of Sweden is a vast organization with the mandate to provide church-related services for the entire country. This equates to over 1300 parishes, covering an area of approximately 450,000 km², roughly the size of California. In principle, each parish is responsible for its own IT. As a result, local units around the country each maintained their own, largely similar, IT solutions and operational procedures. Parish administrative duties also needed to be simplified to enable local church officials to focus on their core activities. Following a pre-study of the organization's processes, structure, and IT requirements, Data Ductus proposed a step-wise migration path from this fractured, costly and inefficient IT set-up towards a new shared IT infrastructure.

Two-layered IT solution caters for all needs

The IT consolidation solution is a two-layered offering that allows parishes to choose the number of centralized services they wish to use. At layer one, a virtual desktop provides each organization with the office/administrative suite it prefers. Users can access these cloud services from any device. At layer two, centralized HR and book-keeping services simplify salary payments, billing, and the like. The virtual desktop solution runs on the Citrix platform. A private cloud hosts all the data and related services. The solution includes a manned Service Desk for IT support.

To facilitate migration, Data Ductus also provides an “upgrade” service to parishes. This involves assessing the existing IT at the parish office – including everything from network connection to office computer – and producing a simple migration plan to the virtual desktop. Once approved, either the Data Ductus team or the local IT provider can carry out the upgrade.

Lower costs and better collaboration

With more and more parishes adopting the Citrix platform, economies of scale from IT consolidation are driving down the OPEX of IT service operations and the CAPEX for related facilities, hardware and network resources. Furthermore, as all the connected parishes share the same program versions, file sharing has become much simpler. Collaboration in administrative procedures is therefore far easier, which in turn reduces administrative costs and vulnerabilities and improves overall quality.

Many of the mundane administration tasks are now handled centrally. For instance, the bookkeeping system includes an automated billing feature that prints invoices centrally and posts them to the recipient, all at the touch of a button.

The private cloud solution meets existing and upcoming regulations, such as GDPR. Finally, with one common platform, the IT organization can easily deploy and offer new services to parishes across the country.

As a side-effect, the initiative brought considerable environmental benefits. Going from thousands of distributed servers in local data centers – often with more CPUs than needed – to a consolidated central data center reduced energy consumption considerably. Furthermore, the information security gains are clear, since risk exposure across the countless distributed data centers was substantial.

What the customer says

“When we began this project, it was with the awareness that parishes would have the autonomy to stick with their existing IT setup or onboard to the virtual desktop. This meant designing a solution that would reduce vicars’ administrative duties, giving them more time to do what they really want to do in a parish. However, it still had to give them a degree of control of their own processes.

Developing the two-layered solution solved this perfectly, as parishes can choose the level of service they wish to use. Well over two thirds of Church employees within local organizations are now using the solution. And I believe that once complete, we will have carried out one of Sweden’s largest nationwide IT infrastructure consolidation projects ever”, says Hans Eskemyr, the former CIO of The Church of Sweden’s national level.

Telecom service providers face considerable challenges as they try to balance rapid and continuous changes in consumer behaviour with the availability of new online services. We orchestrated and automated the network of one of the world's largest telecom providers using YANG-modelled services exposed through the NSO's open NBI REST API.

In brief

Challenge 

Help one of the world’s top 10 largest telecom providers to reduce time to market of new services, speed up the customer onboarding process, and minimize OPEX/CAPEX expenditure.

Solution

Service automation and orchestration using Cisco's NSO as the service engine, with VMware as the NFVI platform and Cisco's VTS as the SDN platform.

How we did it

Working collaboratively with our partner Cisco Systems, we tailored the YANG-modelled NSO solution to meet the client’s requirements and deployed it globally. All within four months.

Benefits

The provider now has a fully automated customer service offering that can be deployed 100 times faster than the previous solution. New agile processes mean they can develop and roll out additional services considerably faster than their competitors.

About the client

One of the world’s top 10 largest telecom providers serving a global customer base.

Growing customer numbers leads to long service delivery times

In 2012, one of the world's largest telecom companies introduced a new managed solution with secure services for enterprise customers. As customer numbers increased, the company began to struggle with long service delivery times and a continuously rising OPEX: it was taking an average of one month to deliver services to new customers and several working days to make configuration changes. Data Ductus was asked to develop a solution that would help the telecom operator stay ahead of the competition. The provider wanted to:

  • Simplify the customer onboarding process to reduce onboarding times
  • Speed up time to market of new services
  • Reduce OPEX/CAPEX

YANG model services

In October 2015, the telecom operator announced the deployment of an upgraded version of their cloud VPN service, utilizing Cisco’s platform for SDN and NFVI technologies and the Cisco Network Service Orchestrator. It features automated orchestration via a customer portal that accesses YANG-modelled services through the NSO’s open NBI REST API. The solution went live in February 2016.
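As a rough illustration of what a northbound call to such a REST API can look like, the sketch below posts a service request over HTTP. The endpoint path, service model name, leafs and credentials are all hypothetical placeholders; in a real deployment they are defined by the YANG service models loaded into NSO.

```python
# Illustrative only: ordering a service instance through a RESTCONF-style
# northbound API. Paths, model names, leafs and credentials are hypothetical.
import requests

NSO_NBI = "https://nso.example.com/restconf/data"

payload = {
    "secure-internet-access:secure-internet-access": [
        {
            "name": "customer-42",
            "bandwidth-mbps": 100,      # hypothetical leaf
            "qos-profile": "gold",      # hypothetical leaf
            "value-added-functions": ["web-security", "mail-security"],
        }
    ]
}

response = requests.post(
    f"{NSO_NBI}/services",
    json=payload,
    auth=("api-user", "api-password"),  # placeholder credentials
    headers={"Content-Type": "application/yang-data+json"},
)
response.raise_for_status()
print("Service request accepted:", response.status_code)
```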

The automation and orchestration part of the solution uses Cisco's NSO product as the orchestration engine. The platform is model driven (YANG) and separates the network's service layer from its device layer. This enables networks to be programmed with device-agnostic services: a device can be swapped from one vendor to another without breaking or changing a service. The services remain agnostic even though the network comprises a mixture of physical (PNF) and virtual (VNF) network functions, distributed across two geographically separated data centres.

The orchestrated solution

The orchestration solution is divided into two major services: one for secure internet access and one for secure remote access. The customer can select between these services and configure them according to their needs, e.g. bandwidth, QoS and value-added functions (such as web security and mail security). The configuration is then automatically mapped down to each individual device (physical or virtual).

The input data is automatically validated before the settings are approved, checking for typing errors, type correctness, and validity against global system thresholds. This validation, combined with the removal of manual intervention, minimizes the risk of costly faults in the network. Furthermore, the orchestration optimizes network resources by spinning up new virtual network functions only when they are required, keeping data centre resource usage to a minimum.
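As an illustration of this kind of pre-approval validation, the sketch below checks a requested configuration against simple type rules and global thresholds. The limits and allowed values are invented for the example; a real deployment derives them from the YANG models and the system's configured thresholds.

```python
# Illustrative validation of an incoming service request. Limits and allowed
# values are invented; they stand in for global system thresholds.
GLOBAL_MAX_BANDWIDTH_MBPS = 10_000
ALLOWED_QOS_PROFILES = {"bronze", "silver", "gold"}

def validate_request(bandwidth_mbps: int, qos_profile: str) -> None:
    """Raise ValueError if the requested settings would be rejected."""
    if not isinstance(bandwidth_mbps, int) or bandwidth_mbps <= 0:
        raise ValueError("bandwidth must be a positive integer (Mbit/s)")
    if bandwidth_mbps > GLOBAL_MAX_BANDWIDTH_MBPS:
        raise ValueError("requested bandwidth exceeds the global threshold")
    if qos_profile not in ALLOWED_QOS_PROFILES:
        raise ValueError(f"unknown QoS profile: {qos_profile!r}")

validate_request(100, "gold")  # passes silently; invalid input raises
```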

From hours to minutes

Thanks to the combination of Cisco products and Data Ductus orchestration design, the telecom operator can now make configuration changes to their services in minutes instead of days. Furthermore, they can do it without any OPEX involved, which provides them with considerable savings.

Time to market has also been greatly improved as new services can now be designed without concerns about the complexity of the underlying network. New agile processes also add greatly to the speedy rollout of new services.

We developed a new secure, smart home IT eco-system for E.ON consisting of new IoT components and cloud services, flexible APIs, automated process control, event management, model-based onboarding of new sensors, and analysis of the huge amounts of data generated by connected devices.

In brief

Challenge 

Since market deregulation in the 1990s, it’s been easy for consumers to switch between suppliers. Electricity companies need to offer customers a much better service than simply providing electricity. Our challenge was to help E.ON develop a way to do this.

Solution

100Koll is an app-based control system that enables customers to manage electricity in the home. The smart home solution is built on a modern IoT-based service platform. The architecture is designed for continued agile service and business development.

How we did it

Working closely with enterprise architects at E.ON, we developed a new platform for IoT services. We then trialed the service with 10,000 E.ON customers, before rolling out nationally across Sweden.

Benefits

E.ON offers customers added value through Smart Home functionality, transparent usage and billing and potential energy savings. 100Koll meets the immediate needs of the market while catering for future IT and market developments.

About the client

E.ON is one of the largest companies in the European energy markets. The company serves many millions of people in Europe and beyond.

Market deregulation

In 1996, the Swedish electricity market was deregulated. Ever since, it has been easy for consumers to switch between providers. Competition is fierce. Energy companies faced a major challenge: convincing customers that a kWh from their company is better than a kWh from a competitor.

In the main, they reacted by developing new websites to improve communication with customers. However, most had limited functionality: a standard “My Pages” menu rarely provided significant added value.

This remained the case for many years, but more recently the Internet of Things (IoT) has opened up new possibilities for companies willing to embrace change. We were asked to help E.ON utilize IoT and develop a game-changing service for their customers.

Secure, smart home IT eco-system

E.ON has launched a unique smart home solution. It is built on the Crossbreed platform for IoT and service integrations, which complements the existing IT infrastructure.

A new secure smart home IT eco-system was developed to integrate the existing infrastructure with new IoT components and cloud services. Requirements included flexible APIs, automated process control, event management, model-based onboarding of new sensors and related data models, as well as management and analysis of the huge amounts of data generated by connected devices.
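To give a feel for model-based onboarding, the sketch below defines a hypothetical sensor data model and prints the kind of registration payload a backend might accept. The field names and values are illustrative and do not reflect the platform's actual schema.

```python
# Hypothetical sensor data model and registration payload. Field names are
# illustrative placeholders, not the platform's actual data model.
from dataclasses import dataclass, asdict
import json

@dataclass
class TemperatureSensor:
    device_id: str
    room: str
    unit: str = "celsius"
    report_interval_s: int = 60  # how often the sensor relays a reading

sensor = TemperatureSensor(device_id="ts-0042", room="living-room")
print(json.dumps(asdict(sensor), indent=2))  # payload an onboarding API could accept
```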

E.ON customers also had to be provided with new hardware – such as temperature sensors – that could constantly relay data while ensuring it remained securely stored for analysis. Moreover, web and app solutions had to be developed that display real-time power usage while enabling customers to control electrical equipment in the home.

100Koll

Today, E.ON no longer only supplies electricity but also an expandable service platform to complement customers’ smart homes.

Developers can now quickly and easily develop apps for mobiles, tablets and computers. These apps, or more complete applications, give the ultimate customer experience on the web or a smartphone, adding value to both E.ON’s customers and E.ON.

With the 100Koll service, E.ON offers a brand new customer experience. Users can be online and receive real-time information, a far cry from the days when customer interaction consisted mainly of monthly invoices.

What the customer is saying

“With the help of 100koll, we simplify the everyday lives of our customers a little and improve safety in the home. And those who are environmentally conscious can choose to use greener energy. Basically, 100koll is aimed at solving many of the challenges faced by our customers.”
Tobias Övall, Technical Product Manager for 100Koll at E.ON
