

D A T A  A N A L Y S I S

  • The Network Analytics department bases all its activities on careful, robust, and scalable data design. The quality, cleanliness, and organization of the data underpin all our projects.


  • LCC's analytics department has engineers with extensive experience in the telecommunications sector, trained in the latest data-processing technologies and analysis tools.


  • Analytics projects, regardless of size, follow these phases:

    • Data audit

    • Cleaning and structuring of the inputs

    • Study and analysis of structures and relationships

    • Design of a relational analytical model of the data set

    • Analytical correlations and interrelationships of the data

    • Visualization of the conclusions and presentation of final studies
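The first two phases can be sketched with a minimal pandas example (illustrative only; the column names and cleaning rules are assumptions, not an actual client pipeline):

```python
# Illustrative sketch of a data audit and cleaning step with pandas.
import pandas as pd

def audit(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize each column: dtype, null count, distinct values."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "nulls": df.isna().sum(),
        "distinct": df.nunique(),
    })

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Basic structuring: trim string columns, drop exact duplicates."""
    out = df.copy()
    for col in out.select_dtypes(include="object"):
        out[col] = out[col].str.strip()
    return out.drop_duplicates().reset_index(drop=True)

# Toy input: a duplicated row, a trailing space, a missing KPI value.
raw = pd.DataFrame({"site": ["A ", "A ", "B"], "kpi": [1.0, 1.0, None]})
print(audit(raw))
print(clean(raw))
```

In practice the audit output drives the cleaning rules: the per-column null and distinct counts reveal which inputs need structuring before any modeling begins.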


  • To carry out all the stages described, the department uses the latest technologies for data processing and visualization:

    • Audit and data processing: depending on the type and needs of each project, we use the following stack:

      • Python

      • Talend

      • Pentaho

      • Java

      • SQL

    • Visualization: to adapt to each project's needs, we offer commercial analytics tools or our own web-based developments:

      • Qlik

      • PowerBI

      • In-house web developments


  • All projects are managed with agile methodologies to ensure the adaptability and precision that each project needs.


M A C H I N E  L E A R N I N G

  • LCC applies the full power of the most current Machine Learning algorithms to meet our clients' advanced analysis needs.


  • Thanks to synergies with the rest of the company's departments and the accumulated experience of the Network Analytics department itself, we can start each project from a precise understanding of its requirements, and from there design an efficient, well-fitted process that chains together the algorithms leading to the best possible decision.


  • Because the department's core is the treatment, care, and understanding of data, large sets of independent, significant, and well-prepared variables can be generated very efficiently, ensuring convergence and reliability.


  • The team dedicated to Machine Learning projects is in constant training and evolution, building on accumulated experience.


  • We use a set of Python libraries that grows continuously with our needs:

    • TensorFlow

    • Keras

    • Prophet

    • Joblib

    • LightGBM

    • scikit-learn

    • XGBoost

    • MiniSom
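As a flavor of what these libraries enable, a minimal supervised-learning workflow with scikit-learn might look like this (a toy sketch on synthetic data, not a client project):

```python
# Illustrative only: train/test split, gradient boosting, hold-out accuracy.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic classification data standing in for prepared network variables.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier(random_state=0)
model.fit(X_tr, y_tr)

acc = accuracy_score(y_te, model.predict(X_te))
print(f"hold-out accuracy: {acc:.2f}")
```

The hold-out evaluation is the point of the sketch: model quality is always measured on data the model never saw during training.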

T O O L S  D E V E L O P M E N T  &  R P A

  • In line with the department's philosophy, every tool is built around a back-end data model designed for the need at hand, with the scalability and maintainability of each project in mind.


  • The tools are delivered mainly through a web interface, with the highest security standards, to facilitate client access.


  • LCC can host the tools on its own IT infrastructure, or configure dedicated infrastructure for clients who prefer to maintain the solutions themselves.


  • Thanks to in-depth knowledge of data processing and structuring, we develop highly efficient solutions with functionalities and visualizations as complex as necessary.


  • We combine experience in machine learning and geoanalysis to give our solutions special functionalities that make our clients' day-to-day operations more efficient.


  • Whether integrated into a tool's data ingestion or delivered as dedicated projects, LCC provides a battery of scripts and robots that automate repetitive processes.
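As an illustration of such an automation robot (the folder layout and the ingest step here are purely hypothetical, not part of any delivered project):

```python
# Hypothetical sketch: ingest every CSV dropped into an input folder,
# then move it to an archive so it is never processed twice.
import shutil
import tempfile
from pathlib import Path

def ingest_pending(inbox: Path, archive: Path) -> list:
    """Process each pending CSV, then move it to the archive folder."""
    archive.mkdir(exist_ok=True)
    processed = []
    for csv in sorted(inbox.glob("*.csv")):
        _ = csv.read_text()  # placeholder for the real ingest step
        shutil.move(str(csv), str(archive / csv.name))
        processed.append(csv.name)
    return processed

# Demo in a temporary directory with one pending file.
root = Path(tempfile.mkdtemp())
(root / "kpi_2024.csv").write_text("site,kpi\nA,1\n")
done = ingest_pending(root, root / "archive")
print(done)
```

A robot like this typically runs on a schedule; the move-to-archive step is what makes repeated runs idempotent.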
