Power BI Engineer

Glasgow
1 week ago
Applications closed

Power BI Report Engineer (Azure / Databricks)

Glasgow-based only | 4 days onsite | No visa restrictions please

Are you a Power BI specialist who loves clean, governed data and high-performance semantic models?
Do you want to work with a business that's rebuilding its entire BI estate the right way: proper Lakehouse architecture, curated Gold tables, PBIP, Git, and end-to-end governance?
If so, this is one of the most modern, forward-thinking Power BI engineering roles in Scotland.
Our Glasgow-based client is transforming its reporting platform using Azure + Databricks, with Power BI sitting on top of a fully curated Gold Layer. They develop everything using PBIP + Git + Tabular Editor 3, and semantic modelling is treated as a first-class engineering discipline.
This is your chance to own the creation of high-quality datasets and dashboards used across Operations, Finance, Sales, Logistics and Customer Care, turning trusted Lakehouse data into insights the business relies on every day.

Why This Role Exists

To turn clean, curated Gold Lakehouse data into trusted, enterprise-grade Power BI insights.
You'll own semantic modelling, dataset optimisation, governance and best-practice delivery across a modern BI ecosystem.

What You'll Do

Semantic Modelling with PBIP + Git

Build and maintain enterprise PBIP datasets fully version-controlled in Git.

Use Tabular Editor 3 for DAX, metadata modelling, calc groups and object governance.

Manage branching, pull requests and releases via Azure DevOps.
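To give a flavour of the calc-group work this involves, here is a sketch of a time-intelligence calculation group as it might be authored in Tabular Editor 3. The 'Date' table and item names are illustrative, not taken from the client's model.

```dax
-- Hypothetical "Time Intelligence" calculation group; each block below is
-- the expression of one calculation item, applied to whatever measure is
-- in context via SELECTEDMEASURE().

-- Calculation item: YTD
CALCULATE ( SELECTEDMEASURE (), DATESYTD ( 'Date'[Date] ) )

-- Calculation item: PY (prior year)
CALCULATE ( SELECTEDMEASURE (), SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

-- Calculation item: YoY %
VAR CurrentValue = SELECTEDMEASURE ()
VAR PriorValue =
    CALCULATE ( SELECTEDMEASURE (), SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
RETURN
    DIVIDE ( CurrentValue - PriorValue, PriorValue )
```

Defining these once as a calculation group, rather than duplicating YTD/PY variants of every base measure, is exactly the kind of governed, DRY modelling the role emphasises.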

Lakehouse-Aligned Reporting (Gold Layer Only)

Develop semantic models exclusively on top of curated Gold Databricks tables.

Work closely with Data Engineering on schema design and contract-first modelling.

Maintain consistent dimensional modelling aligned to the enterprise Bus Matrix.
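Conformed and role-playing dimensions in practice often come down to a single date dimension reused across several fact relationships. A minimal sketch, assuming a hypothetical Fact_Sales table with an active OrderDate relationship and an inactive ShipDate relationship to 'Date':

```dax
-- Uses the active relationship on Fact_Sales[OrderDate]
Orders Placed := COUNTROWS ( Fact_Sales )

-- Activates the inactive ShipDate relationship for this measure only
Orders Shipped :=
CALCULATE (
    COUNTROWS ( Fact_Sales ),
    USERELATIONSHIP ( 'Date'[Date], Fact_Sales[ShipDate] )
)
```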

High-Performance Power BI Engineering

Optimise performance: aggregations, composite models, incremental refresh, DQ/Import strategy.

Tune Databricks SQL Warehouse queries for speed and cost efficiency.

Monitor PPU capacity performance, refresh reliability and dataset health.
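Alongside model-level tactics like aggregations and incremental refresh, DAX-level discipline matters for performance. One everyday habit, shown with illustrative table and column names, is caching sub-expressions in variables so they are evaluated once rather than per branch:

```dax
-- Revenue and Cost are each evaluated once, then reused in the RETURN
Margin % :=
VAR Revenue = SUM ( Fact_Sales[Amount] )
VAR TotalCost = SUM ( Fact_Sales[Cost] )
RETURN
    DIVIDE ( Revenue - TotalCost, Revenue )
```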

Governance, Security & Standards

Implement RLS/OLS, naming conventions, KPI definitions and calc groups.

Apply dataset certification, endorsements and governance metadata.

Align semantic models with lineage and security policies across the Azure/Databricks estate.
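As a sketch of the RLS side of this, a role's table-filter expression might map users to regions through a security bridge table. The Dim_Region and Security_UserRegion names are hypothetical:

```dax
-- RLS filter expression on Dim_Region: a user sees only the regions
-- mapped to their sign-in (UPN) in the security bridge table.
'Dim_Region'[RegionKey]
    IN CALCULATETABLE (
        VALUES ( Security_UserRegion[RegionKey] ),
        Security_UserRegion[UserEmail] = USERPRINCIPALNAME ()
    )
```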

Lifecycle, Release & Best Practice Delivery

Use Power BI Deployment Pipelines for Dev → UAT → Prod releases.

Enforce semantic CI/CD patterns with PBIP + Git + Tabular Editor.

Build reusable, certified datasets and dataflows enabling scalable self-service BI.

Adoption, UX & Collaboration

Design intuitive dashboards with consistent UX across multiple business functions.

Support BI adoption through training, documentation and best-practice guidance.

Use telemetry to track usage, performance and improve user experience.

What We're Looking For

Required Certifications

To meet BI engineering standards, candidates must hold:

PL-300: Power BI Data Analyst Associate

DP-600: Fabric Analytics Engineer Associate

Skills & Experience

Several years of commercial experience building enterprise Power BI datasets and dashboards.

Strong DAX and semantic modelling expertise (calc groups, conformed dimensions, role-playing dimensions).

Strong SQL skills; comfortable working with Databricks Gold-layer tables.

Proven ability to optimise dataset performance (aggregations, incremental refresh, DQ/Import).

Experience working with Git-based modelling workflows and PR reviews via Tabular Editor.

Excellent design intuition: clean layouts, drill paths, and KPI logic.

Nice to Have

Python for automation or ad-hoc prep; PySpark familiarity.

Understanding of Lakehouse patterns, Delta Lake, metadata-driven pipelines.

Unity Catalog / Purview experience for lineage and governance.

RLS/OLS implementation experience.

Subscribe to Future Tech Insights for the latest jobs & insights, direct to your inbox.

By subscribing, you agree to our privacy policy and terms of service.

Industry Insights

Discover insightful articles, industry insights, expert tips, and curated resources.

How to Write a Cloud Computing Job Ad That Attracts the Right People

Cloud computing underpins much of the UK’s digital economy. From startups and scale-ups to enterprise organisations and the public sector, cloud platforms enable everything from data analytics and AI to cybersecurity, DevOps and digital services. Yet despite high demand for cloud skills, many employers struggle to attract the right candidates. Cloud job adverts are often flooded with unsuitable applications, while experienced cloud engineers, architects and platform specialists quietly pass them by. In most cases, the problem is not the shortage of cloud talent — it is the quality and clarity of the job advert. Cloud professionals are pragmatic, technically experienced and highly selective. A poorly written job ad signals confusion, unrealistic expectations or a lack of cloud maturity. A well-written one signals credibility, good engineering culture and long-term thinking. This guide explains how to write a cloud computing job ad that attracts the right people, improves applicant quality and strengthens your employer brand.

Maths for Cloud Jobs: The Only Topics You Actually Need (& How to Learn Them)

If you are applying for cloud computing jobs in the UK you might have noticed something frustrating: job descriptions rarely ask for “maths” directly yet interviews often drift into capacity, performance, reliability, cost or security trade-offs that are maths in practice. The good news is you do not need degree-level theory to be job-ready. For most roles like Cloud Engineer, DevOps Engineer, Platform Engineer, SRE, Cloud Architect, FinOps Analyst or Cloud Security Engineer you keep coming back to a small set of practical skills: Units, rates & back-of-the-envelope estimation (requests per second, throughput, latency, storage growth) Statistics for reliability & observability (percentiles, error rates, SLOs, error budgets) Capacity planning & queueing intuition (utilisation, saturation, Little’s Law) Cost modelling & optimisation (right-sizing, break-even thinking, cost per transaction) Trade-off reasoning under constraints (performance vs cost vs reliability) This guide explains exactly what to learn plus a 6-week plan & portfolio projects you can publish to prove it.

Neurodiversity in Cloud Computing Careers: Turning Different Thinking into a Superpower

Cloud computing sits at the heart of modern tech. Almost every digital product runs on someone’s cloud platform – from banking apps & streaming services to AI tools & online shops. Behind those platforms are teams of cloud engineers, architects, SREs, security specialists & more. These roles demand problem-solvers who can think in systems, spot patterns, stay calm under pressure & imagine better ways to build & run infrastructure. That makes cloud computing a natural fit for many neurodivergent people – including those with ADHD, autism & dyslexia. If you are neurodivergent & considering a cloud career, you might have heard messages like “you’re too distracted for engineering”, “too literal for stakeholder work” or “too disorganised for operations”. In reality, many traits that come with ADHD, autism & dyslexia are exactly what cloud teams need. This guide is written for cloud computing job seekers in the UK. We will cover: What neurodiversity means in a cloud context How ADHD, autism & dyslexia strengths map to cloud roles Practical workplace adjustments you can ask for under UK law How to talk about neurodivergence in applications & interviews By the end, you should have a clearer sense of where you might thrive in cloud computing – & how to turn “different thinking” into a professional superpower.