Data science roles bridge statistics, programming, and business intelligence. Demand is strong across all sectors (from fintech to e-commerce), with Python, SQL, and cloud data platforms (Snowflake, BigQuery, dbt) as the core stack.
Primary skills – Knowledge of and hands-on experience with GCP cloud technology, including BigQuery, Compute Engine, Cloud Storage, Google Kubernetes Engine, Cloud Deployment Manager, IAM, VPC, Cloud SQL, Cloud Spanner, and Cloud Pub/Sub. Should have experience with Google Cloud Dataproc, along with data structures, data modeling, and software architecture.
The ideal candidate will bring extensive experience across data engineering, data and solution architecture, data integration, master data management, data quality, enterprise data warehousing, and analytics. The role will establish strong data quality frameworks, including validation, monitoring, and automated testing.
Data Analyst (SQL, Excel, Python), Contract, Sydney. We're looking for a Data Analyst with strong analytical capability and a pragmatic mindset to support critical data initiatives.
The Role
• Analyse large, complex datasets using SQL, Excel, and Python to support decision-making
• Perform data validation, reconciliation, and quality checks across multiple systems
• Identify data issues, inconsistencies, and gaps — and work through solutions with technology, risk, and business teams
• Support reporting and insights for senior stakeholders, ensuring data is accurate, traceable, and fit-for-purpose
• Contribute to data remediation activities where required, including impact analysis, correction logic, and validation of fixes
• Document findings, assumptions, and outcomes to support governance, audit, and regulatory expectations
Skills/Experience Required
• Strong hands-on experience with SQL (complex queries, joins, reconciliations)
• Advanced Excel skills (large datasets, formulas, reconciliation models)
• Working experience with Python for data analysis, validation, or automation
• Solid understanding of data quality, data integrity, and analytical best practices
• Ability to explain data findings clearly to both technical and non-technical stakeholders
• Experience working in large, complex organisations or enterprise programs is highly regarded
Bonus points for
• Previous remediation project experience (regulatory programs & migrations)
• Exposure to financial services, banking, or regulated environments
• Experience reconciling data across multiple source systems
• Familiarity with governan...
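The reconciliation and remediation tasks described in this role can be sketched with a minimal, dependency-free example: comparing keyed extracts from two systems and classifying discrepancies. The account IDs, field values, and the `reconcile` function are illustrative, not part of the listing.

```python
def reconcile(source_a: dict[str, float], source_b: dict[str, float]) -> dict[str, list]:
    """Compare keyed balances extracted from two systems and classify
    each discrepancy, as in a remediation impact analysis."""
    only_a = sorted(set(source_a) - set(source_b))
    only_b = sorted(set(source_b) - set(source_a))
    mismatched = sorted(
        k for k in set(source_a) & set(source_b)
        if abs(source_a[k] - source_b[k]) > 0.005  # tolerance for rounding noise
    )
    return {"only_in_a": only_a, "only_in_b": only_b, "mismatched": mismatched}

# Hypothetical extracts keyed by account ID
ledger = {"A1": 100.00, "A2": 250.50, "A3": 75.00}
warehouse = {"A1": 100.00, "A2": 250.55, "A4": 10.00}
print(reconcile(ledger, warehouse))
# → {'only_in_a': ['A3'], 'only_in_b': ['A4'], 'mismatched': ['A2']}
```

In practice the same comparison is often done with SQL full outer joins or Excel lookups; the classification logic is the same regardless of tool.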
• Translate business needs into detailed technical specifications and data models, collaborating with data architects and delivery teams. • Validate data solutions against requirements, ensuring accuracy, usability, and compliance with regulatory standards.
You will schedule assignments and monitor, review, and report project status regularly to manage project risks and ensure successful project delivery and implementation. Requires project management fundamentals: project lifecycles on development and maintenance projects, estimation methodologies, and quality processes.
Description
• Expert in Google Cloud Platform (GCP) skills: Compute, Hosting, Storage, Networking, and Security
• Proficient in .NET Core, .NET Framework, and web development
• Experience migrating from Azure to GCP is good to have
• Proficient in Google Cloud Spanner and .NET Spanner integration
• Experience with Redis / distributed caching
• Strong CI/CD experience (Azure DevOps or equivalent)
• Experience with Apigee and/or Axway API management platforms
• Solid understanding of cloud security, networking, and IAM concepts
• Understanding of HTML, CSS, JavaScript
Experience: 16-18 years
Skills: Primary Skill: Data Engineering; Sub Skill(s): Data Engineering; Additional Skill(s): Big Data, GCP-Apps
About The Company: Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley.
Roles & Responsibilities Key Responsibilities Business Consulting, Problem Formulation & Proposal Development Engage with business teams and leadership to clarify, shape, and structure fuzzy business problems into clear analytical frameworks. Review work products for statistical rigor and business relevance.
You will collaborate closely with data architects, data engineers, data scientists, and enterprise architects to ensure seamless data flow across systems, domains, and platforms. - Partner with Data & AI leadership to align integration architecture with business goals and data strategies.
About the Role: We are seeking skilled Data Annotators to work on Indian language voice datasets, ensuring high-quality transcription, annotation, and validation of speech data. Language-Specific Expertise: Work on dialect variations, colloquialisms, and regional language nuances.
Extensive experience in designing technology components for data engineering solutions and defining solution architectures and reference architectures leveraging cloud services. Must have experience in application / data warehouse modernization projects and creating new Big Data & Data Engineering solutions on public cloud platforms.
Responsibilities You will collaborate with engineers, data scientists, subject matter experts, and customers in the full-stack implementation of data products for mortgage capital markets applications. You will help define and maintain derived data layers that support a collection of related analytics products.
Your Role And Responsibilities As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Salary tags blend employer-provided ranges with Catalitium estimates. We mark ranges with Est. labels, note any missing data, and never inflate compensation to boost clicks.
Currencies are harmonised to USD/EUR/GBP/CHF to avoid surprises.
Outliers are reviewed manually before they appear on a card.
Sponsored employers follow the same disclosure and pay rules.
Are there remote AI and ML jobs in the EU?
Yes. Remote-friendly AI and ML roles in the EU have grown over 30% year-on-year. Germany, France, the Netherlands, and Spain lead in volume. Use the AI and EU filters together to surface them quickly, and check the salary estimate badge to ensure the range meets your expectations.
How fresh are the job postings?
Listings are refreshed continuously from employer feeds and normalised daily. Each card shows a posted date pill so you can see exactly how old a listing is. Jobs posted within the last 7 days receive a green New badge. Listings older than 30 days receive a May be filled warning.
Do roles include salary estimates?
Yes. Most listings show an Est. salary pill derived from Catalitium's location-based salary database, blended with any employer-disclosed range. Senior and lead roles receive an automatic seniority uplift. If a salary range is genuinely unknown we leave the field blank rather than show a misleading estimate.
What is a ghost job and how do I spot one?
A ghost job is a listing that has been live for 30+ days and is likely already filled, on hold, or was never a real opening. Research suggests up to 40% of active listings at any time are ghost jobs. Catalitium flags every listing older than 30 days with a triangle May be filled badge so you can prioritise your energy on fresh openings.
What are the highest-paying tech roles right now?
AI and ML engineer roles currently command the highest median salaries on Catalitium, around $150k–$200k USD in the US and EUR 100k–EUR 160k in Europe. Principal and Staff Engineer roles come close, followed by senior full-stack and cloud infrastructure engineers. Use the >100k filter to see only high-compensation listings.
How do I negotiate a higher salary offer?
Reference Catalitium's salary data when negotiating: show the employer the market range for your role and region. Studies show engineers who negotiate receive 10–20% more than the initial offer on average. If base salary is fixed, push on equity, signing bonus, remote allowance, and learning budget. See our Salary Negotiation Guide in the Resources section.
Which European cities pay the most for tech?
Zurich and Geneva (Switzerland) consistently top European tech salaries, followed by London, Amsterdam, Berlin, Paris, and Stockholm. Swiss salaries are typically quoted in CHF and translate to EUR 100k–160k for mid-senior roles. London follows at GBP 70k–110k. Berlin and Amsterdam are competitive at EUR 70k–100k for comparable experience levels.
Can I track my job applications on Catalitium?
Yes. Our free Application Tracker lets you move roles through a Kanban pipeline: Applied, Phone Screen, Interview, Offer, and Closed. It requires no account and stores everything privately in your browser. Hit the Track button on any job card to add it. You can also export your full pipeline as a CSV.
How does Catalitium differ from LinkedIn Jobs?
LinkedIn optimises for engagement and premium upgrades. Catalitium is built exclusively for tech candidates who want signal over noise: every listing shows salary estimates, ghost jobs are flagged, AI-powered summaries save you time reading descriptions, and the application tracker replaces the black-hole Easy Apply experience. No premium paywall, no recruiter spam.
Can I filter to remote-only jobs?
Yes. Choose Remote in the location/country field or tap a Remote shortcut chip. Results are limited to roles that advertise remote or hybrid where the listing text supports it, and remote-friendly rows show a Remote badge.
Which tech stacks are most in demand?
Across Catalitium tech listings, Python, TypeScript/JavaScript, Go, Java, and cloud platforms (AWS, GCP, Azure, Kubernetes) recur most often; AI and data roles add PyTorch, TensorFlow, and LLM tooling. Title and AI summary chips reflect the employer's stated stack.
Which Swiss cities pay the most for software roles?
Zurich and Geneva typically lead Switzerland for software, data, and platform engineering compensation; smaller hubs follow at a discount. Swiss ranges often sit above neighbouring EU markets for comparable seniority—check Est. salary on each card when you filter by Switzerland.