Enjoy Intelek Compute from €30 per month.
Data processing software
Orchestrator-as-a-service for running data processing and AI algorithms without worrying about infrastructure.
Schedule the execution of your analytical processes
Intelek Compute allows you to run your analytical processes (Python, SQL and R) in the cloud, without the need to provision servers or manage infrastructure. Minimize costs by paying only for the actual computing capacity used.
Simplicity
No complicated features you'll never use. Deploy and monitor your process in minutes.
Standard languages
Support for the main languages used by Data Scientists (Python, R, SQL, notebooks…).
Git repositories
Sync your algorithms from your preferred Git environment. Work on and evolve your code from your favorite IDE.
Scalability
Use the type of machine you need for your process. Pay only for the time you use.
Multiple triggers
Schedule execution with crontabs, run the process via API or chain processes together.
Security
Route traffic from fixed IPs, set up IPsec VPN tunnels, and more.
How does the data processing software work?
Connect with the environment in which you develop your algorithms
Working with Git is not mandatory in Intelek Compute, but it is a good practice that we fully support.
When you work with a Git environment, Intelek Compute always uses the latest validated version of the code, without you having to copy anything over manually.
Support for the main Git providers.
Change the branch from which the process is executed, making it easy to test before deploying.
Work with your favorite IDE or code editor (VS Code, PyCharm, JupyterHub, …), with no need to move or rewrite any of your development.
CI/CD for analytical processes: when you merge into the “master” branch, the next execution picks up the new version.
Register algorithms
Register the algorithms you want to run from Intelek Compute. Select them from your Git environment or upload them with drag & drop.
Specify all the input parameters your algorithm requires (variables, credentials, etc.).
Support for algorithms in SQL, Python, Jupyter and R.
Support for defining algorithms as individual scripts or as folders with several files.
Specify static or dynamic variables to inject into the code.
Specify “secrets” (private, encrypted variables) to inject into the code, as sketched below.
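As an illustration, here is a minimal Python sketch of an algorithm written to consume injected values. It assumes, purely for the example, that variables and secrets are exposed to the process as environment variables named DB_HOST, RUN_DATE and DB_PASSWORD; the actual injection mechanism and names in Intelek Compute may differ.

```python
import os

# Minimal sketch of an algorithm that consumes injected values.
# Assumption (not documented here): variables and secrets are exposed
# to the running process as environment variables.
DB_HOST = os.environ.get("DB_HOST", "localhost")   # static variable
RUN_DATE = os.environ.get("RUN_DATE")              # dynamic variable
DB_PASSWORD = os.environ.get("DB_PASSWORD")        # secret: never hard-code it

def main() -> None:
    # Placeholder for the real analytical logic.
    print(f"Connecting to {DB_HOST} for run date {RUN_DATE}")

if __name__ == "__main__":
    main()
```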
Define a task to execute one or more algorithms
Once the algorithms have been registered, you can create a workflow that executes one or more of them. This workflow is the process that will run with the frequency or triggers you configure.
A variety of triggers: manual execution, periodic scheduling with crontab rules (see the examples after this list), an API call, or the completion of another execution flow or Intelek Collector process.
Selection of the Git branch from which the process will run.
Specification of the values of the variables and secrets to be injected into each algorithm.
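For reference, crontab rules follow the standard five-field cron syntax (minute, hour, day of month, month, day of week). The expressions below are generic scheduling examples, not tied to any particular Intelek Compute plan or limit.

```python
# Standard cron syntax: minute hour day-of-month month day-of-week
EVERY_HOUR    = "0 * * * *"    # at minute 0 of every hour
EVERY_4_HOURS = "0 */4 * * *"  # at minute 0, every 4 hours
NIGHTLY_2AM   = "0 2 * * *"    # every day at 02:00
WEEKDAYS_8AM  = "0 8 * * 1-5"  # Monday to Friday at 08:00
```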
Monitoring execution flows
Control the executions of all your processes at all times:
Monitor the executions you have in progress in real time.
Analyze the execution history and review the messages printed to the server logs.
Detect any incident (failed credentials, format changes, etc.) in a simple and transparent way.
Subscribe to notifications about the status of a process's executions.
Data Processing Software Features
Discover what you can do with Intelek Compute!
Git repository support
Sync your code directly from a Git repository.
Parameterization of algorithms
Inject variables or credentials into your algorithms.
Execution flows for multiple algorithms
Define execution flows that run a chain of algorithms.
Execution modes
Run your algorithms on a schedule, as one-off executions, or after another process finishes.
Executing processes via API
Execute a flow or algorithm via API and consume its response.
Process monitoring
View the executions in progress and the execution history at any time.
Subscribe to notifications
Receive notifications about executions or errors in them.
Variety of machines
Select the machine with the appropriate power for your process. Pay only for use.
Security
Restrict access to your trusted networks only. Control the fixed IP from which all processes run. Establish IPsec VPN tunnels.
Intelek Compute
Find the plan that fits your needs
Compare our plans and choose the one that best suits your organization.
Other modules that may interest you
Designed specifically so that companies of any size can successfully establish their data strategy.
Data Governance
Control data access by granting your users policies with table access restrictions.
Data Manager
Navigate comfortably through your information system between different tables of the corporate model.
Data API
Access your data and run your algorithms via API, from any computer system.
Data analytics software
Advanced analytics and AI in the cloud
Run analytics in the cloud without provisioning servers or managing infrastructure. Plus, you'll minimize costs by paying only for the actual computing capacity used.
Pay-per-use infrastructure costs
Automatic and unlimited scalability
User interface for data scientists
The best tool for your data scientists
Your data scientists can focus on their job: they can deploy their solutions to a production cloud environment completely autonomously, without needing cloud architecture or data engineering skills, using the industry-standard languages SQL, Python and R, or Docker containers.
Deployment of algorithms in a cloud environment
The Intelek Compute module lets you maintain a registry of algorithms linked to Git code repositories. Once your algorithms have been defined and their required parameters declared, you can create execution tasks, defining the repository branch to use, the algorithms to execute, the parameter values, and the schedule using crontab-style rules.
Algorithm registry
Upload algorithms manually or select them by browsing Git repositories:
Github, GitLab, Bitbucket, or CodeCommit.
SQL scripts, Python, or Jupyter notebooks.
Algorithm parameterization
Declare the parameters or variables of your algorithms so that you can manage execution tasks with different input values.
Parameterize SQL, Python, or Jupyter scripts.
Define the set or range of valid values for each parameter (see the sketch after this list).
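The sketch below shows one generic way to write a parameterizable Python algorithm with a restricted set of valid values. It assumes, for illustration only, that parameter values reach the script as command-line arguments; how Intelek Compute actually injects them may differ.

```python
import argparse

# Minimal sketch of a parameterized algorithm. The command-line mechanism
# and parameter names here are illustrative, not the platform's contract.
parser = argparse.ArgumentParser(description="Parameterized example job")
parser.add_argument("--country", choices=["ES", "FR", "PT"], default="ES",
                    help="parameter restricted to a set of valid values")
parser.add_argument("--sample-rate", type=float, default=0.1,
                    help="numeric parameter with a default value")
args = parser.parse_args()

print(f"Running for country={args.country} with sample_rate={args.sample_rate}")
```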
Task definition
Create your execution tasks based on the algorithm registry.
Choose the algorithms and the order of execution.
If you have parameterized algorithms, define the parameter values to be injected.
Execution system
You can run your tasks manually at any time or in an automated way:
Periodic execution using crontab rules.
Execution on demand via API from your IT systems (see the sketch below).
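As an illustration of on-demand execution, the snippet below triggers a task over HTTP and consumes its response. The base URL, endpoint path, bearer token and payload fields are hypothetical placeholders; consult the Intelek Compute API documentation for the real endpoints and parameters.

```python
import requests

# Hypothetical example: trigger a task via HTTP from an external system.
# The URL, /tasks/{id}/run path, token and payload are placeholders.
API_BASE = "https://compute.example.com/api"  # placeholder URL
TOKEN = "YOUR_API_TOKEN"                      # placeholder credential

response = requests.post(
    f"{API_BASE}/tasks/1234/run",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"parameters": {"country": "ES"}},   # illustrative parameter values
    timeout=30,
)
response.raise_for_status()
print(response.json())                        # consume the task's response
```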
Governance
User permissions
User permissions that control who can deploy algorithms.
Resource allocation
Computing resource limits at the user or group level.
User groups
Grant task execution rights to groups of users.
Monitoring and execution statistics for tasks and algorithms:
Real-time insight into the status of executions in progress.
Task execution history.
Errors, with their type and details.
Duration of executions.
Access to the logs issued by the algorithms.
Global CPU-usage statistics of the computing module.
Find your plan
From €30/month
Up to 10 compute hours per month
Main features
- 1 developer user and 2 viewer users
- Task duration: up to 10 minutes
- Job scheduling: up to every 4 hours, or triggered by the Collector
- Email support: response within 2 business days
From €115/month
Hours by machine size:
- Small: €0.12/h
- Medium: €0.48/h
- Large: €0.96/h
Limited to 120 hours per month
Main features
- Everything in the Starter package
- Unlimited users
- Concurrent running tasks
- Job scheduling: up to every 60 minutes, or triggered by the Collector
- Task execution from a fixed IP (NAT)
- Email support: same-day response
Add-ons: API Execution
€25 per month
€2 per URL per month
From €290/month
Hours by machine size:
- Small: €0.08/h
- Medium: €0.32/h
- Large: €0.64/h
Limited to 240 hours per month
Main features
- Everything in the Standard package
- Docker containers
- Job scheduling: up to every 10 minutes, or triggered by the Collector
- VPN tunnels to the client network
- Email support: response within 2 business hours
Add-ons
€25 per month
€2 per URL per month
Custom plan
If you have special requirements, contact us and we will create a personalized plan for you.
- Unlimited rows
- Unlimited premium pipelines
- Onboarding with a dedicated engineer + 1 consultant
- Email support: response within 2 business days
Compare the plans
We help you identify, assess, and prioritize over time the different analytical opportunities in your organization.
Usage | |||
---|---|---|---|
Compute Hours | Up to 10 hours per month | Limited to 120 hours per month | Limited to 240 hours per month |
Shared Compute: Machine Selection | |||
Dedicated Compute 1 | |||
Concurrent Running Jobs | |||
Developer Licences | 1 | Unlimited Add-Ons | Unlimited Add-Ons |
Viewers Licences | 2 | Unlimited Add-Ons | Unlimited Add-Ons |
Support | |||
Onboarding | Self-service | Unlimited | Unlimited |
Support channel | |||
Response times | Within 2 business days | Same-day response | Response within 2 business hours |
Algorithms | |||
SQL Scripts | |||
Python / R: Scripts, Folders & Packages | |||
Docker Container | |||
User Parameters / Variables | |||
Secrets Catalog | |||
Code Sources | |||
File / Folder | |||
Git: Commit, GitHub & GitLab | |||
Orchestration: Job Definition | |||
Task definition with multiple algorithms | |||
Task duration | Once every 60 minutes | Once every 60 minutes | Once every 60 minutes |
Algorithm timeout definition | |||
Test & Control executions | |||
Orchestration: Job Triggering | |||
Manual | |||
Job scheduling / Crontab | Up to 10 minutes | Up to 60 minutes | Unlimited |
API: Selection of entities to be synced | |||
Event-based | |||
API | |||
Monitoring & Alerts | |||
Individual tasks monitoring | |||
Global monitoring dashboard: All tasks | |||
User access to server logs & prints | |||
Subscription to tasks' notifications | |||
Governance | |||
Create and Manage Groups | |||
Role-based access | |||
Security | |||
Data Compliance | |||
Task execution from a fixed IP: NAT | |||
VPN Tunnels with client network |