Find your next tech and IT job or contract in Data analysis
What you need to know about Data analysis
Data analysis is the process of examining, transforming and interpreting raw data to extract useful information, identify trends and support decision-making. This field is central to many sectors, such as finance, marketing, healthcare and the sciences. The data analysis process includes several steps: collecting data from various sources, cleaning it to ensure quality, exploring it using statistical tools, and visualizing it to facilitate understanding. Popular tools for data analysis include Excel, Python (with libraries like Pandas and NumPy), R, and software like Tableau or Power BI. Thanks to data analysis, organizations can better understand their environment, anticipate behaviors, and optimize their operations. It is a key skill in the data era, essential for data scientists, business analysts and decision-makers.
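For instance, the collect, clean, explore and visualize steps described above can be carried out in a few lines of Python with Pandas. The sketch below is only an illustration of those steps, assuming a hypothetical sales.csv file with date, region and revenue columns.

```python
# Minimal illustration of the collect / clean / explore / visualize steps,
# assuming a hypothetical sales.csv with "date", "region" and "revenue" columns.
import pandas as pd
import matplotlib.pyplot as plt

# Collect: load raw data from a source (here, a local CSV file)
df = pd.read_csv("sales.csv", parse_dates=["date"])

# Clean: remove duplicates and rows with missing revenue
df = df.drop_duplicates().dropna(subset=["revenue"])

# Explore: descriptive statistics per region
print(df.groupby("region")["revenue"].describe())

# Visualize: monthly revenue trend to spot patterns over time
df.set_index("date")["revenue"].resample("M").sum().plot(title="Monthly revenue")
plt.show()
```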

Contractor job
Data Scientist / Data Analyst expert in Vertex and GCP
The SFR Analytics AI team is adopting a new tool for training, serving and monitoring its models. This tool, internally named the "MLOps Platform", must be delivered in June and will rely on a set of services provided both by GCP and by SFR's IT department. More precisely, the platform will use the following technologies:
- GCP Workstations: the development environment - notebooks / RStudio Server / code-oss Server
- GCP BigQuery
- GCP GCS
- GCP Vertex
- SFR Gitlab
- SFR Harbor (container registry)
- SFR Nexus (package manager)
- SFR Airflow (scheduler)
The MLOps platform will offer two usage modes:
- porting of existing applications
- mainstream GCP MLOps
The current assignment aims to:
- carry out acceptance testing of the platform's porting mode
- start migrating the SFR Analytics data science projects onto this porting platform
Today the team administers three on-premise physical servers and runs all of its data science projects on them. The technologies used at each step of the ML workflow are detailed below:
- Exploratory analysis / model training:
  - The data scientist starts a Docker container on one of the Linux servers.
  - This container exposes an RStudio Server (notebook equivalent) to which the data scientist connects.
  - From this working environment, the data scientist can:
    - persistently install the R/Python packages needed for the project
    - connect to our BigQuery DWH to query, retrieve or upload data
    - use the CPUs and RAM of the host machine without caps
    - train models
    - analyse their performance
    - save to persistent disk the selected model(s), the training set and the associated data-quality (QOD) files (distributions of the training-set variables)
    - prepare the model's inference script(s) which, inside a similar container, will load the saved model, run batch inference, and push the model outputs (notably probabilities and QOD metrics for the input variables) to BigQuery and/or to local files
    - push the code to an on-premise Gitlab server for sharing and versioning
- Model inference:
  - A container identical to the training container, but without RStudio Server, is started automatically by an Airflow worker to run an inference batch. The folders containing the packages, scripts and artifacts needed for inference are mounted into the container at run time.
  - The container exports its results (notably probabilities and QOD metrics for the input variables) to BigQuery and/or to disk.
- Monitoring:
  - An R Shiny application served by a Shiny Server reads the local files and/or the data pushed to BigQuery by the inference jobs and displays:
    - the tracking of the distributions of the model inputs
    - the evolution of the model's performance measured with hindsight (for supervised models, once enough time has elapsed)
In "porting" mode, the changes are as follows:
- Exploratory analysis / model training:
  - the development / exploration / training container no longer runs on our on-premise machines but on GCP Workstations
  - it no longer serves only an RStudio Server interface but also JupyterLab and code-oss (at the data scientist's choice)
  - artifacts, including trained model binaries, installed packages and other files created from our web IDE, are no longer stored on our servers but in a GCS bucket
  - the link to Gitlab remains functional for code versioning, but Gitlab also becomes responsible for deploying the inference job:
    - in a "lab" GCP project dedicated to prototyping, accessible from the workstations and from the Gitlab CI pipeline
    - in a "run" GCP project dedicated to production, accessible only by the Gitlab CI/CD
- Model inference:
  - the container running the batch job is still started by a call from the Airflow server, but the SFR Analytics Airflow service is replaced by the SFR IT Airflow service
  - the container is therefore no longer started on our servers but as a Cloud Run job
  - this Cloud Run can be attached to the "lab" or "run" environment
- Monitoring:
  - the Shiny monitoring application is no longer served by an on-premise Shiny Server but is containerised and served by a Cloud Run service
  - it no longer reads its data from our servers' disks but from the BigQuery dataset and/or the GCS bucket where they are stored
  - likewise, the Cloud Run serving the Shiny app can be attached to the "lab" or "run" environment
As stated in the introduction, the assignment consists of:
- acceptance testing of the MLOps platform in porting mode (features detailed below)
- starting the migration of the SFR Analytics data science projects onto this porting platform
By migration of the existing data science projects, we mean porting the following steps:
- analysis
- model training/testing/validation
- production deployment
- model monitoring
These two objectives can be pursued jointly, since migrating the existing use cases is in itself an opportunity for acceptance testing. A minimal sketch of the ported batch inference step follows.
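The batch inference step described above (load a persisted model, score a batch, push probabilities back to BigQuery for monitoring) might look roughly like the following once ported. This is a hedged sketch, not the platform's actual code: the project, mount path, model file and table names are hypothetical, and it assumes a scikit-learn model saved with joblib plus the google-cloud-bigquery client.

```python
# Hedged sketch of the batch inference step: load a saved model, score rows
# pulled from BigQuery, and write the predictions back. Names are placeholders.
import joblib
from google.cloud import bigquery

client = bigquery.Client(project="my-lab-project")  # "lab" or "run" project

# Load the model artifact saved during training (e.g. via the GCS FUSE mount)
model = joblib.load("/mnt/gcs/models/churn_model.joblib")

# Pull the batch of rows to score from the data warehouse
features = client.query(
    "SELECT customer_id, tenure, monthly_spend FROM analytics.inference_batch"
).to_dataframe()

# Run inference and keep the probability of the positive class
features["proba"] = model.predict_proba(
    features[["tenure", "monthly_spend"]]
)[:, 1]

# Push the outputs back to BigQuery for the monitoring app
client.load_table_from_dataframe(
    features[["customer_id", "proba"]],
    "analytics.churn_predictions",
).result()  # wait for the load job to finish
```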
Acceptance testing notably covers the following points:
- acceptance of the workstation:
  - its preconfigured settings and containers, which must notably:
    - provide:
      - a working IDE: RStudio Server, JupyterLab or code-oss, at the data scientist's choice
      - the full base layer allowing the use of the core binaries (Python, R, Java, git) and the installation / compilation of the packages required by the project
    - be started with:
      - a FUSE mount of one or more GCS buckets as persistent storage not tied to the underlying VM
      - GCP authentication inherited from the connection to the workstations via the GCP console
    - be connected to:
      - BigQuery
      - GCS
      - Cloud Run
      - Gitlab
      - Harbor
      - Nexus
  - the ability to open merge requests on the Gitlab repo of the Docker images accessible from the workstation
    - as well as on the repo of the workstation cluster configurations (Terraform)
- acceptance of the platform's Gitlab CI templates, which must notably allow:
  - building the inference and monitoring Docker images
  - creating / modifying the DAGs executed by the Airflow server (a minimal DAG sketch is given after this listing)
- acceptance of Harbor (container registry):
  - check that GCP Workstations and Cloud Run connect to Harbor correctly
  - check that Gitlab can push the images it has built to our Harbor repo
- acceptance of Nexus (package manager):
  - check that it works correctly as a proxy for the main public repositories (conda, PyPI, CRAN, Posit Package Manager, Hugging Face in particular), both in lab and in run
- acceptance of Airflow (on the run environment):
  - check that the DAGs execute correctly
  - check that GCP task logs are correctly retrieved in the Airflow UI
Required:
- good command of the machine learning project workflow
- proficiency with git and the Gitlab CI/CD pipeline
- proficiency with Docker
- proficiency with the GCP ecosystem, especially the services mentioned in the "context and environment" section (GCP certifications are a plus)
- knowledge of the R language
- experience developing machine learning models
Desirable:
- data science: multivariate descriptive analyses
- business recommendations derived from these analyses
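For the DAG-related checks above, a DAG that starts the inference container as a Cloud Run job might look roughly like the following. This is a minimal sketch, not the platform's actual CI template: the project, region, job and DAG names are hypothetical, and it assumes the apache-airflow-providers-google package is installed.

```python
# Hedged sketch: a DAG that triggers the containerised batch inference as a
# Cloud Run job, as in the platform's porting mode. All names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.cloud_run import (
    CloudRunExecuteJobOperator,
)

with DAG(
    dag_id="churn_batch_inference",
    schedule="0 4 * * *",          # daily batch at 04:00
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    run_inference = CloudRunExecuteJobOperator(
        task_id="run_inference_job",
        project_id="my-run-project",   # "lab" or "run" GCP project
        region="europe-west1",
        job_name="churn-inference",    # Cloud Run job image built by the Gitlab CI
    )
```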

Contractor job
Data Steward expert in SQL/GCP (Google Cloud Platform)
This assignment is half-time. The Data Steward's main mission is to carry out a large-scale data clean-up and to stabilise the data flows. Responsibilities:
- Identify, analyse and correct data quality issues (blocked flows, inconsistencies between platforms).
- Run complex SQL queries on Google Cloud Platform (GCP) to investigate and document anomalies.
- Work with the application teams (SAP, Salesforce) and the business to understand requirements and ensure data consistency.
- Provide clear, transferable documentation to the support team so that fixes remain sustainable.
- Act as the interface between technical and business teams, with a strong analytical dimension.
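An anomaly investigation of the kind described above might be run from Python against BigQuery roughly as follows. This is a hedged sketch only: the project, dataset and table names are hypothetical placeholders, and it simply illustrates a consistency check between an SAP-fed table and a Salesforce-fed table.

```python
# Hedged sketch: find orders present in the SAP-fed table but missing from the
# Salesforce-fed table. Project, dataset and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-data-project")

sql = """
SELECT sap.order_id
FROM staging.sap_orders AS sap
LEFT JOIN staging.salesforce_orders AS sf
  ON sap.order_id = sf.order_id
WHERE sf.order_id IS NULL   -- orders with no match on the Salesforce side
"""

missing = client.query(sql).to_dataframe()
print(f"{len(missing)} orders present in SAP but absent from Salesforce")
```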

Job Vacancy
Data Analyst
Data Analyst - Holborn, London. Our Client is looking to recruit a Data Analyst with at least 5 years' proven experience. As a Data Analyst you will ensure that all data is processed and quality assured so that it can be analysed through appropriate systems, with a view to producing reports that help Client services and other stakeholders make effective decisions and support the assessment of indicators and other business intelligence processes.
Key responsibilities:
- Day-to-day implementation and oversight of the business's data strategy, to ensure that organisational data is systematised with a view to optimising business intelligence and decision-making.
- Clear knowledge of all data sources within the organisation, including the CRM and other information systems, and the ability to work on those sources individually or together, or to identify new sources that can inform effective operation.
- Knowledge of all relevant procedures to ensure that optimal data is used, including the cleaning, analysis and processing of data.
- Putting in place effective and robust quality systems to ensure the highest integrity of data.
- Working with the IT department on data analysis, including developing key formulas, working with APIs and data mining, particularly through appropriate relational and other databases, with a view to extracting the information needed for the growth of the organisation.
- Enabling the effective and accurate production of all reports, visuals, dashboards and other outputs to allow different parts of the organisation to make better individual and collective decisions.
- Facilitating the transformation of data into information and knowledge, and helping the different parts of the organisation to share such information and knowledge so as to maximise collaboration and the aims of the organisation.
- Supporting business intelligence so that decisions are underpinned by key, effective and relevant data.
- Enabling the organisation to measure key performance and other indicators and scorecarding, and contributing to the development of metric- and indicator-based systems that allow for integration and continuous improvement.
- Collaborating with all departments, including IT, education and training, marketing, and policy and research, on data usage.
- Minimising all data risks, security breaches and financial and other risks, and working with relevant services to ensure that data is handled in a way that is compliant with relevant legislation and other frameworks.
- Furthering a data culture that enables the organisation to meet the needs of its stakeholders, both nationally and internationally, including its members.
The following skills are essential:
- Relevant higher education qualification in data analysis, including programming languages such as Python and SQL.
- Substantial experience working in data analysis within an organisational setting.
- Excellent problem-solving, analytical and business intelligence skills, especially when working with large datasets.
- Excellent written and verbal communication skills in English.
- Ability to manage multiple projects, adhering to logistics, timescales and deadlines.
- IT literate, with experience of using MS Dynamics and Office (including Word, Excel and Outlook).
- Experience of using visualisation tools such as Power BI.
The Client is based in Holborn, London. The salary for this role will be in the range £40K - £45K. Please send your CV to us in Word format and include your salary expectations and availability.

Job Vacancy
Database Developer with Teradata PL/SQL - Coventry
Database Developer with Teradata - Coventry. Our Client is looking to recruit a Teradata Developer with at least 5 years' hands-on experience in data analysis, data modelling and SQL development in a Teradata environment.
- Must have strong knowledge of Teradata architecture (including the Journal, RAID, CLIQUE and FALLBACK mechanisms).
- Good understanding and working knowledge of all SQL functions and join strategies (HASH, MERGE, etc.).
- Understanding and working knowledge of all system tables such as DBQL, PDCRINFO, etc.
- Hands-on knowledge of Teradata PL/SQL, stored procedures, REF cursors, etc.
- Very good hands-on knowledge of Teradata SQL, BTEQ, FastLoad, MultiLoad, TPump and TPT.
- Complete understanding of performance tuning techniques, including handling skewness, space management and locking.
- Understanding of the Software Development Life Cycle (SDLC) and data warehousing concepts.
- Knowledge of data profiling activities.
- Knowledge of Teradata Viewpoint portlets, the alert mechanism, etc.
- Excellent analytical and problem-solving abilities, with quick adaptation to new technologies, methodologies and systems.
- A good individual contributor as well as a team player, with the ability to build strong partner confidence.
- Excellent communication skills and judgement; ability to manage multiple tasks and work to deadlines in a fast-paced, aggressive development schedule.
- Ability to work any shift (24x7).
Keywords: Teradata, DW/BI concepts, data modelling, development, SQL and analytics. The salary for this role is negotiable but will be in the range £45K to £60K. Please send your CV to us in Word format.
Contractor job
Senior Talent Manager
The Senior Manager, Talent plays a crucial role in shaping the Talent Strategy and in designing and owning the Talent Development and Talent Management portfolio.
- Leadership Development: Helps employees gain and develop leadership competencies and prepares them for management and leadership roles within an organisation.
- Talent Review: Plans, prepares and facilitates a talent review programme in which leaders review employees' strengths, development areas and potential career trajectories.
- Talent Mobility: Monitors and improves mobility processes within an organisation, while empowering employees' development and improving satisfaction.
- Commercial Acumen: Can clearly articulate current business performance and leverage people levers to enhance performance or productivity.
- Business Case Development: Able to develop a clear business case that outlines the context, the insight, the proposal and a method to evaluate the impact of the intervention.
- Storytelling: Leverages data and insights to drive business outcomes.
- Data Analysis: Collects and interprets data in order to uncover patterns and trends.
- Agile Methodologies: Manages projects by dividing tasks into short phases of work (known as sprints) and frequently reassessing and adapting plans.
- Project/Programme Management: Handles a project or portfolio of projects as they progress through the typical stages of the project lifecycle, including initiation, planning, execution and closure.
- Stakeholder Management: Organises, monitors and improves relationships with stakeholders.
- Relationship Building: Connects with others and forms positive relationships.
- Continuous Improvement: Continuously strives to improve products, services or processes.
- Risk Management: Identifies, evaluates and manages risks by developing and implementing strategies, frameworks, policies, procedures and practices.
Please send in your latest CV. LA International is an HMG-approved ICT Recruitment and Project Solutions Consultancy, operating globally from the largest single site in the UK as an IT Consultancy or as an Employment Business & Agency depending upon the precise nature of the work, for security cleared jobs or non-clearance vacancies. LA International welcomes applications from all sections of the community and from people with diverse experience and backgrounds. Award-winning LA International, winner of the Recruiter Awards for Excellence, Best IT Recruitment Company, Best Public Sector Recruitment Company and overall Gold Award winner, has also secured the most prestigious business award that any business can receive, The Queen's Award for Enterprise: International Trade, for the second consecutive period.
Job Vacancy
Network Performance Manager
Job Profile: In this critical role as Network Performance Manager, you will lead performance strategy and continuous improvement across EV charging network operations. Your primary goal is to drive year-on-year improvement by leveraging data analytics, digital tools and stakeholder collaboration to enhance operational efficiency and service delivery. You will work with multiple service delivery contractors to optimise performance and assure network availability.
Key responsibilities: Reporting to the Head of Network Operations, you will be responsible for:
- Supporting the Head of Network Operations in developing the future operating model for the management of the network.
- Owning and managing supplier performance frameworks and ensuring that KPIs and SLAs related to EV charging infrastructure are met or exceeded.
- Working collaboratively with suppliers and internal teams to ensure seamless EV charging service delivery and rapid fault resolution.
- Leading regular supplier performance reviews, ensuring issues are addressed and continuous improvement plans are implemented.
- Supporting the onboarding of new EV charging suppliers with clear expectations and performance metrics.
- Collaborating with procurement, commercial and operations teams to monitor, report on and improve supplier delivery and compliance.
- Using data-driven insights and performance dashboards to identify trends, root causes of failure and areas for supplier improvement, and working with the Network Manager to implement the required remedial activities.
- Driving initiatives to improve contractor reliability and the efficiency of the charging network.
- Developing and embedding tools, processes and training to standardise supplier performance management across regions.
- Supporting product development and the operational delivery of EV charging infrastructure and services.
- Tracking progress of operational development plans and ensuring long-term sustainability.
- Ensuring contractors adhere to regulatory safety and environmental standards and achieve compliance with the Public Charge Point Regulations (PCPR).
- Identifying risks and ensuring they are managed appropriately.
Experience:
- Proven experience managing performance at scale, ideally in the energy, infrastructure or EV sectors.
- Strong analytical, strategic thinking and data-driven decision-making skills.
- Experience leading cross-functional teams and delivering results in a fast-paced environment.
- IT literate, with strong skills in data analysis and reporting tools.
- Passion for innovation and operational excellence.
- Proven ability to manage KPIs, analyse performance data and influence outcomes with suppliers.
- Excellent communication, negotiation and stakeholder management skills.
- Strategic mindset with a hands-on approach to operational problem-solving.
- Skilled in reporting and analytics tools for monitoring supplier performance.
- Experience delivering results across cross-functional teams and complex supply chains.
- Understanding of network protocols, hardware and software (including OCPP) and of the ITIL fundamentals of service management.
- Commitment to excellent customer service, ensuring that our customers love every electric journey, in line with our strategic vision.
- Proficient in the use of Microsoft Excel, Word, Power BI and PowerPoint.
- Strong analytical, organisational and multitasking skills.
- Ability to identify opportunities for improvement and to work on implementation plans and business cases.
Submit your CV
- Manage your visibility: Salary, remote work... Define all the criteria that are important to you.
- Get discovered: Recruiters come directly to look for their future hires in our CV library.
- Join a community: Connect with like-minded tech and IT professionals on a daily basis through our forum.
Latest forum topics
- What to Do If Your Laptop Loads Too Slowly
- Why Should You Use a Travel Ads Network in 2025?
- AI Blockchain Projects - Its Reward and the Future
- Help Choosing a Web Design Company in Dubai
Contract roles and jobs for Data analysis
Data Scientist
The Data Scientist analyzes complex data to extract trends, design predictive models, and provide actionable insights for strategic decision-making.
Explore contract roles and jobs for Data Scientist.
Find out the jobs
Data Analyst
The Data Analyst conducts detailed data analysis to answer specific questions, create reports, and visualize results to support operational decisions.
Explore contract roles and jobs for Data Analyst.
Find out the jobs
Business Intelligence Consultant (PowerBI, SAS, Tableau...)
The Business Intelligence Consultant (PowerBI, SAS, Tableau...) uses analysis and visualization tools to transform raw data into clear, actionable information for business teams.
Explore contract roles and jobs for Business Intelligence Consultant (PowerBI, SAS, Tableau...).
Find out the jobs
Data Developer (BI / Big Data / Data Engineer)
The Data Developer (BI / Big Data / Data Engineer) designs data pipelines, cleans and prepares datasets for advanced analytics, and develops solutions tailored to data needs.
Explore contract roles and jobs for Data Developer (BI / Big Data / Data Engineer).
Find out the jobs