Current active projects
Production Line Quality Enhancement using Advanced Data Analysis
Production line processes are complex systems consisting of different steps, each subject to different conditions and parameters. As a component is produced and passes through the pipeline, certain conditions and parameters can affect the product outcome and cause defects. With Industry 4.0, the steps of the production line and all components produced are constantly monitored, and the corresponding data can be used to gain insights into the manufacturing process.
Using this data, the project aims first to describe the manufacturing process and to predict the outcome (healthy or defective) as the component moves through the pipeline. Second, the data will be used to determine which combinations of factors co-occur when a defect is discovered, to point out when these combinations arise, and to potentially suggest how to react.
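The first aim, predicting whether a component will come out healthy or defective from the parameters logged along the line, can be sketched as a minimal nearest-centroid classifier. The feature names and values below are illustrative assumptions, not the project's actual data:

```python
# Minimal sketch of outcome prediction from process parameters.
# Features and values are illustrative assumptions.
from statistics import mean

def centroid(rows):
    """Per-feature mean of a list of equal-length feature vectors."""
    return [mean(col) for col in zip(*rows)]

def predict(sample, healthy, defective):
    """Nearest-centroid rule: label by the closer class centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    c_h, c_d = centroid(healthy), centroid(defective)
    return "healthy" if dist2(sample, c_h) <= dist2(sample, c_d) else "defective"

# Each vector: [press temperature, vibration level] logged along the line.
healthy = [[180.0, 0.2], [182.0, 0.3], [179.0, 0.25]]
defective = [[195.0, 0.9], [193.0, 0.8]]

print(predict([181.0, 0.22], healthy, defective))
print(predict([194.0, 0.85], healthy, defective))
```

In practice a richer model would be trained on the full set of monitored parameters, but the structure (process measurements in, healthy/defective label out) is the same.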
Network Monitoring for QoE Assessment
Network connectivity is of paramount importance in daily life today. However, in many areas, such as mountainous, rural or seaside regions, or less developed countries, neither broadband nor high-speed mobile networks are available yet. In these cases, the only available solution to provide high-speed connectivity is satellite communications. Due to the complexity of such communications, customers may experience unexpected network performance. To address these issues, both active and passive measurements must be performed. However, due to the cost of transmission, only a few active measurements can be performed. Moreover, the large number of devices crossed from the customer to the Internet makes troubleshooting a complex task. Therefore, the project aims first to establish an infrastructure to monitor network operations and identify key performance indicators (KPIs) for connectivity and resource utilization. Second, it aims to identify the KPIs that affect the users’ Quality of Experience (QoE) and to create machine learning methods to detect QoE degradation and troubleshoot it using fewer active measurements.
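One way to spend the active-measurement budget only when needed is to watch a passive KPI continuously and trigger a costly active probe only when the KPI deviates from its recent baseline. A minimal sketch, assuming a satellite RTT series and a 3-sigma deviation rule (both illustrative, not the project's actual thresholds):

```python
# Sketch: trigger a costly active measurement only when a passively
# observed KPI (here, RTT in ms) deviates from its recent baseline.
# Values and the 3-sigma rule are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

def degraded(window, value, k=3.0):
    """Flag `value` if it lies more than k standard deviations
    above the mean of the recent passive samples."""
    if len(window) < 2:
        return False
    mu, sigma = mean(window), stdev(window)
    return sigma > 0 and (value - mu) / sigma > k

history = deque(maxlen=50)          # recent passive RTT samples (ms)
triggers = []
for rtt in [610, 605, 615, 608, 612, 611, 900, 607]:
    if degraded(history, rtt):
        triggers.append(rtt)        # would launch an active probe here
    history.append(rtt)

print(triggers)
```

The 900 ms spike stands out against the ~610 ms baseline and would fire a single active probe, rather than probing continuously.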
AI4NET project – the Huawei Research Chair on AI for Anomalous Traffic Analysis
This project marks the start of an exciting collaboration between SmartData@PoliTO and the Huawei Datacom FRC lab in Paris on topics related to network data analysis for anomaly detection using novel approaches. We’ll investigate automatic methods to assist in the monitoring of increasingly complex networks and in the automatic identification of anomalies, in the context of cybersecurity, to face ever more complex network attacks. Intelligent botnets, automatic identification of anomalies in darknets, machine learning and AI to scale and automate the analysis process: all cool topics we’ll be working on with colleagues at the Huawei Datacom FRC lab in Paris.
Data-driven quality assessment of espresso coffee production
The project, in collaboration with Lavazza, aims at analyzing espresso coffee production data collected from professional coffee machines to improve quality, lower maintenance costs, and provide a better customer experience. Different quality variables are correlated to automatically extract human-readable patterns from the available data. Results include the data-driven definition of new and better quality values and the improvement of quality classification by means of time-series feature engineering.
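The time-series feature engineering step can be sketched as follows: the measurement curve of one shot is summarised into scalar features that a quality classifier can consume. The extraction-pressure series and the chosen statistics are illustrative assumptions:

```python
# Sketch of time-series feature engineering: summarise one shot's
# extraction-pressure curve into scalar features for a classifier.
# The pressure series is an illustrative assumption.
from statistics import mean, stdev

def shot_features(pressure):
    """Summary statistics plus a coarse overall trend of the series."""
    n = len(pressure)
    slope = (pressure[-1] - pressure[0]) / (n - 1)   # coarse linear trend
    return {
        "mean": round(mean(pressure), 3),
        "std": round(stdev(pressure), 3),
        "min": min(pressure),
        "max": max(pressure),
        "slope": round(slope, 3),
    }

# Pressure (bar) sampled once per second during a 10 s extraction.
shot = [1.0, 5.0, 8.8, 9.0, 9.1, 9.0, 8.9, 9.0, 8.8, 8.5]
print(shot_features(shot))
```

Each shot then becomes one fixed-length feature vector, which is what makes standard classifiers applicable to the raw time series.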
PIMCity – Building the Next Generation Personal Data Platforms
The Web economy has been revolutionized by the unprecedented possibility of collecting massive amounts of users’ personal data, which turned the Web into the largest data market and created the biggest companies in our history.
Unfortunately, this change has deep consequences for users, who, deprived of any negotiation power, are compelled to blindly provide their data in exchange for free access to services. Data collection is opaque, fragmented and inconsistent, so that users have no control over their personal data and, thus, over their privacy. Personal Information Management Systems (PIMS) aim to give users back control over their data, while creating transparency in the market. However, so far, they have failed to reach business maturity and sizeable user bases.
PIMCity offers remedies to change this scenario.
Telematics Applied to Insurance Sector: Technological Solutions Analysis and Data Management
The project focuses on telematics, specifically on the data transmitted by devices known as black boxes installed on board vehicles, which allow insurance companies (and others) to obtain information about the status of the vehicle, its performance, road conditions, routes, etc., paving the way for the creation of new value-added services. The aim of the project, which involves the Department of Control and Computer Engineering (Prof. Fabrizio Lamberti and Prof. Elena Baralis) and is part of the broader Big Data initiatives of Politecnico di Torino and Reale Mutua Assicurazioni, consists in:
- supporting the company in a study of the telematic solutions already in use;
- identifying the value of the collected data (both aggregated and raw);
- designing a new solution for the storage and management of the information flow;
- evaluating and studying the possible integration of data from external sources (e.g. geolocation, meteorological data, traffic conditions), in order to increase the potential of black-box data.
Advanced Analytics And Machine Learning Algorithms for Predictive Maintenance on Medium Voltage Distribution Networks
The failure of elements in modern energy grid systems causes service disruptions and is one of the main causes of maintenance costs. To avoid failures, periodic maintenance operations are typically executed to identify elements likely to fail and replace them before they do. The key idea is to adopt machine learning techniques to extract models from data, with the goal of predicting feeder failures and addressing predictive maintenance in the most effective way.
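As a minimal stand-in for the model-extraction step, one can learn a single-feature threshold (a "decision stump") separating feeders that later failed from those that did not. The feature (insulation age) and labels below are illustrative assumptions:

```python
# Sketch: fit a decision stump on historical feeder data as a
# minimal model-extraction example. Values are illustrative
# assumptions, not real grid data.

def best_stump(values, labels):
    """Return (threshold, accuracy) of the rule: predict failure
    when value >= threshold, choosing the threshold that best fits
    the historical labels."""
    best = (None, 0.0)
    for t in sorted(set(values)):
        acc = sum((v >= t) == y for v, y in zip(values, labels)) / len(values)
        if acc > best[1]:
            best = (t, acc)
    return best

# Feature: insulation age (years); label: True if the feeder failed.
age = [3, 5, 12, 20, 25, 7, 30, 4]
failed = [False, False, True, True, True, False, True, False]
threshold, accuracy = best_stump(age, failed)
print(threshold, accuracy)
```

Real models would combine many such features (load, weather, past outages), but the output is the same kind of actionable rule: which elements to inspect first.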
Statistical Models, Data Analytics and Machine Learning – Fog Computing and Opportunistic Networking
The project covers two main research areas: “Data analytics” and “Fog computing and opportunistic networking”. The aim of “Data analytics” is to analyze historical data provided by Tierra Telematics about on-road and off-road vehicle usage by means of data mining and machine learning techniques. The aim of “Fog computing and opportunistic networking” is to distribute computation, data storage, control and network functionality over the infrastructure that connects the Cloud to the Internet of Things (IoT).
Car sharing and electric charging station placement from data
The Free Floating Car Sharing (FFCS) system refers to a car rental model in which users rent a car for a short period of time, usually a few hours or less. The peculiarity of this system is that users can pick up and drop off a car anywhere within a geo-fenced area. As this means of transportation has seen significant growth in recent years, it is important to study it. For this reason, this project has two main goals.
The first goal is to study the feasibility of creating a platform that harvests data offered by the platforms of different FFCS companies, and to design big data analytics to extract higher-level information such as mobility patterns, customers’ habits, gas consumption, typical areas of usage, and more. The second goal is to demonstrate the potential of these Big Data analytics by designing an electric-based FFCS system (e-FFCS) leveraging actual data coming from current gas-based FFCS platforms.
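One of the simpler analytics, extracting usage patterns such as rentals per pickup zone and average trip duration from raw rental records, can be sketched as follows. The record format, zone names and durations are illustrative assumptions:

```python
# Sketch: basic usage-pattern extraction from FFCS rental records.
# Record format and values are illustrative assumptions.
from collections import Counter

# (pickup_zone, dropoff_zone, duration_minutes)
rentals = [
    ("centre", "station", 14),
    ("centre", "suburb", 35),
    ("station", "centre", 12),
    ("centre", "centre", 8),
    ("suburb", "centre", 40),
]

pickups = Counter(zone for zone, _, _ in rentals)
avg_duration = sum(d for _, _, d in rentals) / len(rentals)

print(pickups.most_common(1))   # busiest pickup zone
print(avg_duration)             # average trip duration (minutes)
```

The same aggregation, run over millions of records and joined with charging-station locations, is what supports the e-FFCS placement study in the second goal.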
Data-driven urban systems modelling and analysis
Nowadays, the huge amount of data available on mobility infrastructures and urban mobility calls for new models and analyses able to discover users’ habits and patterns. The SmartData@PoliTO center uses its expertise in big data analysis and machine learning to analyse such data, discovering users’ habits and making quantitative measurements of different aspects of mobility and of the efficiency of city processes. The development of interactive data visualization platforms allows the dissemination of the scientific results to a large audience, from stakeholders to private citizens. The interactive platform devised by the SmartData@PoliTO center can also be used to test new urban scenarios and new mobility solutions.
HPC4AI – Turin’s High-Performance Centre for Artificial Intelligence
The project aims to create an HPC and Big Data competence center for open and scalable Artificial Intelligence, with applications centred on health, food processing, mechatronics, automotive and aerospace. The infrastructure of the center will be easily accessible through cloud services. To maximize technology transfer, the center will operate by co-designing applications and technological solutions. The center will offer highly specialized support to foster innovation and develop skills in local companies, thus stimulating the expansion of market opportunities.
City building mapping: using public data to find energy efficiency of buildings
Research project with EDISON SpA “City building mapping: using public data to find energy efficiency of buildings”, which focuses on the harvesting of open data for the characterization of the energy efficiency and consumption of both residential and commercial buildings. The project’s first step focuses on the design and implementation of a big data approach to collect publicly available data about the energy efficiency of buildings, as reported for example in contracts or surveys. This produces a sampled map of a region, in which only some buildings have information while for the majority no information is available. In a second step, we will design machine learning approaches to extend the coverage of the map to buildings with characteristics similar to those found in the data.
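The second step, inferring the energy class of an unlabelled building from labelled buildings with similar characteristics, can be sketched as a k-nearest-neighbour classifier. The features (construction year, surface) and energy classes below are illustrative assumptions:

```python
# Sketch: extend the sampled energy map with k-nearest-neighbour
# classification over building features. Features and classes are
# illustrative assumptions.
from collections import Counter

# ((construction_year, surface_m2), energy_class)
labelled = [
    ((1960, 120), "G"),
    ((1975, 95),  "F"),
    ((2005, 110), "B"),
    ((2015, 80),  "A"),
]

def predict_class(building, k=3):
    """Majority class among the k nearest labelled buildings."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(labelled, key=lambda item: dist2(item[0], building))[:k]
    return Counter(cls for _, cls in nearest).most_common(1)[0][0]

print(predict_class((1965, 115), k=1))
```

A production version would normalise features and use many more attributes (heating system, insulation, location), but the coverage-extension logic is the same.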
Smart manufacturing and process improvement driven by Machine Learning in Industry 4.0
Research project with Centro Ricerche FIAT (CRF) “Smart manufacturing and process improvement driven by Machine Learning in Industry 4.0”, aiming at the design, implementation and testing of Artificial Intelligence and Machine Learning solutions for predictive maintenance in the press & die shop. The project performs the predictive maintenance task by identifying anomalies during the manufacturing process, hence predicting the required maintenance operations. The goal is to reduce maintenance time and costs, increase productive time and optimize the production process.
ML4QoE (Machine Learning for QoE): re-enabling QoE for multiparty real time
The web is today a general-purpose platform where people navigate websites, watch TV, listen to music, play games, and participate in multiparty conferences. It has embraced encryption, with HTTPS carrying more than 90% of traffic. This has hampered the ability of in-network devices to classify traffic and assign a proper QoS/QoE class. In this project, we focus explicitly on the design of novel techniques to regain visibility on application traffic, empowering novel Machine Learning based algorithms to automatically classify traffic streams by application and to design management policies to improve Quality of Experience for Internet applications.
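Since payloads are encrypted, classification must rely on what remains observable: per-flow statistics. A minimal sketch of that feature-extraction stage, grouping packets into flows by 5-tuple and computing size statistics (the packet trace is an illustrative assumption):

```python
# Sketch: group packets into flows by 5-tuple and compute statistical
# features usable by a downstream classifier, without inspecting
# (encrypted) payloads. The trace is an illustrative assumption.
from collections import defaultdict
from statistics import mean

# (src, dst, sport, dport, proto, payload_bytes)
packets = [
    ("10.0.0.1", "1.2.3.4", 40001, 443, "TCP", 120),
    ("10.0.0.1", "1.2.3.4", 40001, 443, "TCP", 1460),
    ("10.0.0.1", "1.2.3.4", 40001, 443, "TCP", 1460),
    ("10.0.0.2", "5.6.7.8", 40002, 443, "TCP", 90),
]

flows = defaultdict(list)
for *five_tuple, size in packets:
    flows[tuple(five_tuple)].append(size)

features = {
    ft: {"pkts": len(sizes), "bytes": sum(sizes), "mean_size": mean(sizes)}
    for ft, sizes in flows.items()
}
print(features[("10.0.0.1", "1.2.3.4", 40001, 443, "TCP")])
```

Inter-arrival times and packet-size sequences would be added in the same way; the resulting per-flow vectors are what the ML classifiers consume.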
Enhanced Pore Typing Image Analysis and Extended Petrography through Mineralogy
The task of analyzing the permeability of soil to natural gas and oil has been studied by experts for many years. This analysis is typically achieved by inspecting samples of terrain, called thin sections. An important step of this task is the categorization of pores. This operation is currently carried out manually by domain experts through the analysis of high-resolution images retrieved with Scanning Electron Microscopes (SEM). The aim of the project is to automate the pore categorization process by means of scalable data mining algorithms.
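A first automation step is separating pore pixels from the rock matrix in the grayscale SEM images. A minimal sketch using a tiny one-dimensional k-means on pixel intensities, under the assumption (illustrative, like the pixel values) that pores image darker than the matrix:

```python
# Sketch: separate pore pixels from matrix in a grayscale SEM image
# with 1-D k-means on intensities. Pixel values and the "pores are
# darker" assumption are illustrative.

def kmeans_1d(values, c0, c1, iters=10):
    """Two-cluster k-means on scalars, returning the two centroids."""
    for _ in range(iters):
        a = [v for v in values if abs(v - c0) <= abs(v - c1)]
        b = [v for v in values if abs(v - c0) > abs(v - c1)]
        if a: c0 = sum(a) / len(a)
        if b: c1 = sum(b) / len(b)
    return c0, c1

pixels = [12, 15, 10, 200, 210, 190, 14, 205]   # dark = pore, bright = matrix
pore_c, matrix_c = kmeans_1d(pixels, c0=0, c1=255)
is_pore = [abs(p - pore_c) < abs(p - matrix_c) for p in pixels]
print(pore_c, matrix_c, sum(is_pore))
```

Segmentation is only the first stage; categorizing the segmented pores by shape and size is where the project's scalable data mining algorithms come in.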
Recurrent Neural Networks applied to network and service data
The project analyses, using data mining techniques (not necessarily only neural networks), the alarm logs generated in 3G/4G mobile networks. The number of alarms generated in such a system is very high, and not all of them can or should be shown to the operator deciding on a possible human intervention. The system currently used is based on a fixed set of rules to limit the number of raised events. The aim of the project is to discover whether data mining techniques can be used to find relations among the alarms and define new filtering rules, both to reduce the number of raised alarms and to possibly adapt to network configurations that might not have been considered in the design of the original rule set.
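The rule-discovery step can be sketched as mining co-occurrences: count which alarm types appear in the same time window and keep pairs with high confidence, a first step towards new filtering rules. Alarm names and windows below are illustrative assumptions:

```python
# Sketch: mine alarm co-occurrences within time windows and compute
# rule confidence, as a first step towards filtering rules. Alarm
# names and window contents are illustrative assumptions.
from collections import Counter
from itertools import combinations

# Each set: the alarm types raised in one time window.
windows = [
    {"LINK_DOWN", "BER_HIGH"},
    {"LINK_DOWN", "BER_HIGH", "POWER_LOW"},
    {"LINK_DOWN", "BER_HIGH"},
    {"POWER_LOW"},
]

single = Counter()
pair = Counter()
for w in windows:
    single.update(w)
    pair.update(frozenset(p) for p in combinations(sorted(w), 2))

def confidence(a, b):
    """confidence(a -> b): fraction of a's windows also containing b."""
    return pair[frozenset((a, b))] / single[a]

print(confidence("LINK_DOWN", "BER_HIGH"))
```

A rule like "LINK_DOWN always co-occurs with BER_HIGH" (confidence 1.0 here) suggests suppressing the redundant alarm, exactly the kind of filtering rule the project seeks to learn rather than hand-write.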
SHIELD – Securing Against Intruders and Other Threats through a NFV-Enabled Environment
SHIELD is an EU H2020 project proposing a universal solution for dynamically establishing and deploying virtual security infrastructures into ISP and corporate networks. SHIELD builds on the huge momentum of Network Functions Virtualization (NFV) to deploy security appliances as virtual Network Security Functions (vNSFs), instantiated within the network infrastructure, effectively monitoring and filtering network traffic in a distributed manner. Logs and metrics from vNSFs are aggregated into an information-driven Data Analysis and Remediation Engine (DARE), which leverages state-of-the-art big data storage and analytics to predict specific vulnerabilities and attacks. The SmartData@PoliTO center contributes with expertise on network monitoring, attack simulation, big data processing and preparation, as well as machine learning algorithms for the identification of threats and anomalies.
Big Data And Machine Learning Algorithms for Predictive Maintenance on Combustion Engine cars
Automotive combustion engines are controlled by hundreds of signals recorded by the engine control unit. Predicting possible faults in such a complex system plays a fundamental role in offering reliable cars. The aim of this project is to analyse actual data collected via onboard car monitoring systems to understand which signals are most important for predicting engine faults, and to model them accordingly. The key idea is to apply big data and machine learning techniques to select the most important signals, and to extract models using supervised learning approaches, with the goal of predicting the targeted failures.
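The signal-selection step can be sketched as ranking signals by the absolute correlation between each signal and the fault label, keeping the strongest candidates for the supervised model. Signal names and values below are illustrative assumptions:

```python
# Sketch: rank engine signals by absolute Pearson correlation with
# the fault label. Signal names and values are illustrative
# assumptions, not real ECU data.
from statistics import mean, stdev

def corr(xs, ys):
    """Sample Pearson correlation of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (len(xs) - 1) * stdev(xs) * stdev(ys)
    return num / den

fault = [0, 0, 1, 1, 0, 1]                 # 1 = engine fault observed
signals = {
    "oil_temp":  [80, 82, 110, 115, 81, 112],
    "rpm_noise": [5, 7, 6, 5, 7, 6],
}
ranking = sorted(signals, key=lambda s: abs(corr(signals[s], fault)),
                 reverse=True)
print(ranking)
```

Here the oil-temperature signal tracks the fault label closely while the noisy RPM signal does not, so only the former would be fed to the supervised model.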