What do we do?
Digital Transformation
We accompany our clients on the digital transformation journey of their business, providing technical capabilities, understanding their business and delivering agile, pragmatic technical solutions that match their requirements and expectations.
Discover how with:

  • Big Data
  • Artificial Intelligence
  • Agile Management

Big Data
Arts, science and technology of massive data analysis

Digitalization and the use of massive data are producing a revolution whose huge impact on social life can be perceived daily. Today we can store and process more data than ever before in history, and it can be transformed, augmented and processed in real time in order to make decisions and digitally transform businesses.

The large-scale processing of massive data involves a bit of data science, a lot of data arts and a bunch of technologies and tools that cooperate within the Big Data ecosystem to turn data into artificial intelligence and actionable knowledge.

Data, knowledge and actions

Combining all the disciplines involved in data processing, machine learning, knowledge generation and the coordination of the actions derived from that knowledge is the “magic” required to make Big Data and Artificial Intelligence projects succeed.


A wide range of talent is needed to transform data into actions, from professionals with the right technical, mathematical and statistical skills to engineers with deep knowledge of the Hadoop/Spark ecosystem.

Data Engineering

Data Engineers are the “plumbers” who keep the data pipeline connected, whether in the cloud or on-premises. Devoted to the logistics of data between systems, they deal with quality, cleansing and the organization of information.

They are dedicated to the art of building, developing and maintaining the processes of ingestion, transformation, normalization, modeling and design of data architecture in large-scale environments. They have to deal with all kinds of raw data, structured or unstructured, in any format.

Data may contain errors and source-specific master data, so multiple transformation tasks need to be performed to make it meaningful and reliable.
The day-to-day work of a data engineer is devoted to creating APIs that facilitate data consumption, integrating data sets from different sources and analyzing how data is used in order to improve quality, efficiency and processing speed.
Programming with Python, Java or Scala to connect data from NoSQL and relational databases is one of the most important tasks they perform. They use Hadoop and Spark technologies to improve data consumption processes and work intensively with tools such as NiFi, Sqoop, Flume, Oozie, Kafka, Talend, Pig, Hive, MongoDB, HBase, etc.
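As a hedged illustration of what integrating data sets from different sources can look like, the sketch below merges records from a relational extract and a document-store dump in plain Python; the source names and fields are invented for the example, not a real client pipeline.

```python
# Illustrative sketch: integrating records from two hypothetical sources.
# Field names and the merge policy are assumptions for the example.

def normalize(record):
    """Lower-case keys and strip whitespace from string values."""
    return {k.lower(): v.strip() if isinstance(v, str) else v
            for k, v in record.items()}

def integrate(sql_rows, doc_rows, key="customer_id"):
    """Merge rows from a relational extract and a document store by key,
    letting document-store fields enrich the relational ones."""
    merged = {}
    for row in map(normalize, sql_rows):
        merged[row[key]] = row
    for doc in map(normalize, doc_rows):
        merged.setdefault(doc[key], {}).update(doc)
    return list(merged.values())

sql_rows = [{"CUSTOMER_ID": "c1", "Name": " Ada "}]
doc_rows = [{"customer_id": "c1", "segment": "premium"},
            {"customer_id": "c2", "segment": "new"}]
print(integrate(sql_rows, doc_rows))
```

In a real pipeline the same shape appears at larger scale: the normalization step absorbs per-source quirks so the join logic stays generic.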

Data Science

Data Scientists are dedicated to the more theoretical side of the disciplines within Data Science that attempt to extract knowledge from data using supervised or unsupervised machine learning.

The data processed and filtered by Data Engineers is delivered to Data Scientists, who use it intensively in different analysis algorithms, applying statistical and machine learning methods to generate knowledge models for predictive, prescriptive and descriptive analysis, as well as for other fields.

Data scientists create models with tools and languages such as SPSS, R, SAS and, above all, Python, together with Notebooks, PySpark, Anaconda, Pandas, NLTK, etc.
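As a minimal, self-contained illustration of the supervised-learning side of this work, the sketch below fits a one-variable linear model by least squares using only the Python standard library; real projects would use Pandas, PySpark or dedicated libraries, and the sample data here is made up.

```python
# Minimal sketch of supervised learning: fit a one-variable linear model
# by least squares. The data points are invented for illustration.
from statistics import mean

def fit_linear(xs, ys):
    """Return (slope, intercept) minimizing the squared error."""
    mx, my = mean(xs), mean(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 7.8]          # roughly y = 2x
slope, intercept = fit_linear(xs, ys)
print(round(slope, 2), round(intercept, 2))
```

The same fit-then-predict structure carries over unchanged when the model is a random forest or a neural network; only the fitting step grows.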

Data Analysis

Once the data has been analyzed by the machine learning models created by the Data Scientists, the results are sent to the interested parties, providing information in real time, daily or monthly to feed the business processes.
Data Analysts are business-savvy and are responsible for asking questions and finding answers through data, interpreting the insights of the models designed by Data Scientists and feeding results back into those models for continuous improvement.
They also produce the visual representation of the data, which plays a fundamental role in communicating results to the different interested parties, and they are in charge of building a story around the data that helps the interpretation and understanding of the insights.
They work with data visualization tools such as Qlik Sense, Power BI or Tableau, creating dashboards, reports and data extractions for business.

Data Governance for Master Data Management

Within a Big Data environment, the data administrator is responsible for supporting the production and data user community. They collect and evaluate problems and defects in the data, have specific responsibilities within each line of business and manage master data, which can span several business lines. They communicate problems and other relevant information (e.g. root causes) to the parties involved in solving them.
They supervise the processes of ingesting, storing, processing and transmitting data to internal and external target systems.
They set metrics and data quality requirements, including the definition of values, ranges and parameters that are acceptable for each data element in each use case.
They participate in a continuous and detailed assessment of data quality, identification of anomalies and discrepancies, identification of root cause and implementation of corrective measures.
They establish guidelines and protocols governing data proliferation to ensure that privacy controls are applied in all processes, and define policies and procedures for data access, including authorization criteria based on role and/or profile.
They use a variety of tools ranging from ingesting tools (Nifi, Sqoop, etc.) to data analysis tools. They usually work with Python and Pandas library.
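The kind of rules mentioned above (acceptable values and ranges per data element) can be sketched in a few lines of Python; the field names and rules here are invented for illustration, not a real governance catalog.

```python
# Hedged sketch of rule-based data-quality checks: one acceptance rule
# per data element, applied row by row. Fields and rules are made up.

RULES = {
    "age":     lambda v: isinstance(v, int) and 0 <= v <= 120,
    "country": lambda v: v in {"ES", "FR", "PT"},
}

def quality_report(records):
    """Return (row_index, field) pairs that violate a rule."""
    violations = []
    for i, rec in enumerate(records):
        for field, rule in RULES.items():
            if field in rec and not rule(rec[field]):
                violations.append((i, field))
    return violations

records = [{"age": 34, "country": "ES"},
           {"age": -3, "country": "DE"}]
print(quality_report(records))
```

Keeping the rules as data (rather than scattered `if` statements) is what lets a steward add or tighten a rule per data element without touching the checking code.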


DevOps

We provide DevOps engineers with strong coding and scripting skills and the ability to re-engineer production processes.

They are the bridge between the development and operations teams, contributing their knowledge of the production environment to the design of the testing and staging environments.
They ensure that what developers build runs correctly across a diverse set of operating systems and platforms.
They handle the testing, deployment and monitoring in production, which have to be done much more frequently in a distributed environment.

End-to-End Big Data Projects

Our service proposal covers the complete cycle of development of Big Data & Advanced Analytics projects.

Artificial Intelligence
The purpose of AI is to enable machines to see, hear, speak, understand and even begin to reason.
We use the latest AI tools and platforms, which allow us to select, train and deploy the most suitable machine learning models, and we develop the APIs to connect it all together and create more intelligent, flexible and scalable solutions.
We work with the main AI players.

Service Offer

Advanced Analytics

Service Delivery

We provide the Data Engineers, Data Scientists and Data Analysts that perform the full Artificial Intelligence pipeline.

Define and understand the problem

  • What questions do we ask ourselves?
  • Do we have the right data?
  • What KPIs are the key to understanding model performance and results?

Data preparation

  • Ingest data
  • Refine the data
  • Enrich the data
  • Make sure the quality of the data is good enough
Choosing the right model

  • Choose tools
  • MLaaS or custom development
  • Choose the model
  • Choose cloud or in-house deployment

Model training

  • Train the model with the data
  • Iterate on the models to improve them
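The train-and-iterate step above can be sketched as comparing candidate models on a validation criterion and keeping the best one; the candidate models below are trivial stand-ins invented for the example, not a real training setup.

```python
# Sketch of "train, then iterate to improve": score each candidate
# model on validation data and keep the one with the lowest error.

def validation_error(predict, data):
    """Mean squared error of a predictor over (x, y) pairs."""
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

data = [(1, 2.0), (2, 4.1), (3, 6.0)]
candidates = {                       # hypothetical model variants
    "underfit": lambda x: 1.0,       # ignores the input
    "linear":   lambda x: 2.0 * x,   # tracks the trend
}
best = min(candidates, key=lambda name: validation_error(candidates[name], data))
print(best)
```

In practice the candidates would be hyperparameter settings or model families, and the validation data would be held out from training, but the selection loop has this same shape.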
Model deployment

  • Interpret the results
  • Manage discrepancies between the technical and business views of the results
  • Re-train the model
The choice between a private, public or hybrid cloud model is one of the first and most important decisions to take in the digital transformation journey.

We help our clients assess features such as scalability, price, security, GDPR compliance, maintenance and operations requirements for each use case before taking any decision, and we provide consulting and development services to deploy solutions in the cloud.


Private Cloud

It is a cloud computing model that provides a secure, differentiated environment in which only a specific client can operate.

It is recommended when a client wants to store confidential data with the highest security requirements and GDPR compliance while also needing a scalable, secure and reliable cloud solution.


Public Cloud

This model provides virtualized environments built on shared physical resources that can be accessed through the Internet; this is the main difference from a private cloud, which restricts access to a single organization.

Companies can benefit from the public cloud to make some processes more efficient and cheaper, such as the storage of non-sensitive content, online services or even agile development environments.


Hybrid Cloud

It is a combination of private and public cloud services used by some organizations to offer different types and levels of service, allowing secure storage of sensitive customer data in the private part of the cloud while offering collaborative, online services accessible through the public cloud.

Service Offer


  • Cost-benefit analysis
  • Backup, storage, and data protection capabilities
  • Flexibility to add resources
  • Ability to scale


  • Designing cloud architectures
  • Defining sensitive assets access
  • Planning the transition to the cloud
  • Data & processes migration to the cloud

Development and operations

  • Life cycle of application: development, customizing, evolution and maintenance
  • Operational support to improve the manageability, availability and scalability of applications
Cybersecurity

Cybersecurity has been one of the most important disciplines in information technology since the adoption of the internet, because the networks and machines of any organization or person can be reached from anywhere on the planet.
Unfortunately, attacks can be extremely profitable, both targeted attacks and those carried out massively, so cybercrime is growing almost exponentially, raising the associated risks for businesses and individuals alike.

Much of the “rise” of cybercrime is due to the emergence of social networks, both personal and professional, group communication tools, etc., which expose society in a way that many people have not assimilated, leaving an entry point for all kinds of malware.

Likewise, technologies such as Cloud Computing, Big Data or Quantum Computing, extremely useful for business purposes, also allow cybercriminals to carry out attacks and develop much more powerful and effective malware, so the complexity of detecting, combating or mitigating risks keeps increasing as the technological revolution advances.

For decades, the main objective of the cybersecurity discipline has been to protect communication networks and their infrastructure: proxies, firewalls, servers, communications between systems, etc.
The protection of personal equipment has usually been delegated to antivirus software and personal firewalls, which work well except when humans misuse these tools.
In the field of cybersecurity, at Indizen we have focused on the protection of the most vulnerable element, the human.

We develop and/or commercialize cybersecurity solutions aimed at controlling, managing and cataloging privileges in web applications, endpoints and mobile devices.

Management of endpoint privileges:
Traditional solutions are based on limiting the functionality the user can access, with the consequent loss of productivity and increase in costs.
These traditional solutions also rely on antivirus software, which is valid for detecting known malware but not equally effective at detecting new malware, as many companies could verify with threats such as WannaCry and other ransomware attacks that got into large organizations that could boast a high level of security across their entire infrastructure.
Indizen collaborates with Simarks Software, a Spanish start-up focused on endpoint protection, in the distribution and evolution of its cybersecurity solutions, among which Simarks BestSafe stands out.
BestSafe is a business and home solution for managing the privileges of users and applications on computers, creating an execution framework that neutralizes the effects of malware even when it is unknown:
  • It centrally manages user permissions on their computers.
  • It is based on the definition of rules for each user, group or organization.
  • It creates a secure execution framework against unknown malware or ransomware.
  • It does not need to inspect each file or process to be executed, so it does not slow down the machine.
  • It does not replace antivirus software but complements it.
  • It does not reduce employee productivity, unlike traditional models based on the total reduction of privileges.
Centralized management of permissions in web applications
In most organizations, the task of determining and managing who can do what (authorization) in their web applications is delegated to each application. This decentralized management has, among others, the following disadvantages:
  • It is difficult to obtain or maintain the catalog of functionalities offered by the applications and of the users authorized for each functionality.
  • Managing change, when new policies are added or roles change and must propagate through the different applications, requires a great effort in development, deployment, testing, etc.
Indizen collaborates with Dulin Technologies, a Spanish company specialized in application development, on a centralized privilege management platform for applications that offers the following functionalities:
  • A centralized repository of functionalities and the management tools needed to maintain the catalog of applications, the functionalities per application, and the privileges of Active Directory users and roles in those applications.
  • A rule engine that allows you to design the authorization rules for each function and apply them in real time.
  • A set of services that allows applications to query the privileges of a user or role in real time.
  • A framework that facilitates integrating applications into the centralized privilege management system at development time.
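A real-time privilege query of the kind such a rule engine exposes could look roughly like the sketch below; the role and functionality names are invented for illustration, and this is not Dulin's actual API.

```python
# Hedged sketch of a centralized authorization check: a repository
# mapping roles to the functionalities they may invoke, queried per
# request. Roles and functionality names are made up.

PRIVILEGES = {
    "auditor": {"reports.view"},
    "manager": {"reports.view", "reports.export", "users.edit"},
}

def is_authorized(role, functionality):
    """Return True if the role may invoke the functionality."""
    return functionality in PRIVILEGES.get(role, set())

print(is_authorized("auditor", "users.edit"))
```

The point of centralizing this lookup behind a service is that adding or revoking a privilege happens in one repository, rather than in every application's own authorization code.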
Agile Management
An agile work methodology is proposed for the planning and execution of Big Data projects.
The Scrum Master will be responsible for ensuring that meetings are held on time (a maximum of 2 hours of meetings per week) and that sprints stay within their time box (a maximum of 4 weeks), stressing the importance of this and making sure the work is done in the established time.

The Product Owner will be in charge of selecting the highest-priority tasks and communicating them to the rest of the team; these tasks must be the ones at the top of the Product Backlog. The Development Team asks whatever is needed to turn these user stories into more specific tasks.

Sprint Planning answers the following two questions:

What is performed in a Sprint?

The Development Team evaluates its development capability for the Sprint. The Product Owner explains the objective of the iteration and the Backlog items that should be done to achieve the final objective. The whole team works collaboratively to understand the work to be done. During Sprint Planning the Sprint Goal is defined, which is the objective that the Scrum team must achieve for the correct evolution of the project.

How will the chosen job task be performed?

With the Sprint Goal and the selected Product Backlog items (the Sprint Backlog), the development team decides how to turn these user stories into a product increment. At the end of the meeting, the development team should be able to explain to both the Product Owner and the Scrum Master how they are going to work in a self-organized way to develop all the items of the Sprint Backlog and achieve the target defined in the Sprint Goal.
Contact us
Contact us and our team will get back to you as soon as possible:

(+34) 91 535 85 68


Avda. del Gral. Perón, 36 - 2ª planta | 28020 Madrid

Calle Hilera, 14 | 29007 Málaga

Privacy Policy