Joey's Game Room

I play chess, Summoners War, Mobile Legends, League of Legends, Flyff, and Genshin Impact.

19/08/2025

Data Engineering ETL (extract transform load) With Azure Data Factory
by Joey Gomez de Jesus Jr
📞whatsapp +639602727215
Donate on Paypal - https://paypal.me/awareit
Donate on Gcash - 0994009301
Business Insight Engineering Free Training
🛠️ Part 1 – Azure Deployment Environment (ADE)
https://youtu.be/HvA3nY1FkMI
🚪 Part 2 – Intro to Azure Databricks (14-day trial)
https://www.youtube.com/watch?v=JFxnBOzoxIk
🤖 Part 3 – AI-Assisted Dev with Databricks, SQL Server, PySpark
https://www.youtube.com/watch?v=nuUyia7Hoag
🔄 Part 4 – Load SQL Server Data into Databricks with PySpark + AI
https://www.youtube.com/watch?v=k-3QW_XqFMg
🔐 Part 5 – Storing Secrets in Azure Key Vault for Databricks
https://www.youtube.com/watch?v=IS0cHXlOWQI
🛠️ Part 6 – Scheduling Azure Databricks jobs
https://youtu.be/X2nUjBIi6GQ
🧩 Part 7 – Connecting Database Development Tools to Azure Databricks Datawarehouse
https://www.youtube.com/watch?v=VLBICcmr2NE
📊 Part 8 – Charts and Graphs in Azure Databricks
https://youtu.be/NGxMRzetyug
🏭 Part 9 – Data Engineering ETL (extract transform load) With Azure Data Factory
https://youtu.be/YmlA2Dg12GU
Azure Data Factory (ADF) is a cloud-based data integration service from Microsoft that lets you build, orchestrate, and automate data pipelines across diverse systems—whether on-premises or in the cloud.
🛠️ What It Does
- ETL/ELT Workflows: Extract, transform, and load data from multiple sources into a centralized destination
- Data Movement: Seamlessly copy data between databases, APIs, file systems, and cloud storage
- Data Transformation: Clean, enrich, and reshape data using built-in data flows or external compute services like Azure Databricks
- Scheduling & Automation: Trigger pipelines based on time, events, or conditions—fully automated and scalable
- Monitoring & Management: Track pipeline health, performance, and errors through a visual dashboard
🌐 Key Features
- No-code/low-code interface for building complex workflows
- Broad connectivity to over 90 data sources (SQL, REST, Blob, Salesforce, etc.)
- Integrated security via Entra ID and role-based access control
- Custom triggers for real-time or batch processing
- Compression, validation, and preview tools during data copy operations
🧩 Real-World Use Case
Imagine a health coalition pulling data from hospital systems, survey platforms, and cloud logs. Azure Data Factory can:
- Ingest all that data
- Clean and normalize it
- Load it into a dashboard for real-time advocacy insights
🧭 Step-by-Step ETL Guide: CSV → Blob → Azure SQL
🔧 Prerequisites
- Azure Subscription
- Azure Blob Storage with CSV uploaded
- Azure SQL Database with destination table created
- Azure Data Factory instance
🪄 1. Create Linked Services
These are your connection bridges.
- Blob Storage Linked Service
  - Go to ADF Studio → Manage → Linked Services → New
  - Choose Azure Blob Storage
  - Authenticate via account key or managed identity
- SQL Database Linked Service
  - Choose Azure SQL Database
  - Provide server name, database name, and credentials
📁 2. Define Datasets
These describe your source and destination formats.
- Source Dataset (CSV)
  - Type: Delimited Text
  - Point to the blob container and file path
  - Define column delimiter, encoding, and schema (optional)
- Sink Dataset (SQL Table)
  - Type: Azure SQL Table
  - Point to the destination table
  - Define schema mapping if needed
🔄 3. Build the Pipeline
This is your ritual flow; a programmatic sketch follows this list.
- Go to Author → New Pipeline
- Add a Copy Data activity
  - Source: Blob CSV dataset
  - Sink: SQL dataset
  - Optional: use the Mapping tab to align columns
- (Optional) Add a Data Flow if you need transformations
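If you prefer code over the ADF Studio canvas, the same Copy Data pipeline can be defined with the azure-mgmt-datafactory Python SDK. This is a minimal sketch only: it assumes the linked services and datasets from steps 1–2 already exist under the hypothetical names BlobCsvDataset and SqlCustomersDataset, uses placeholder resource group/factory names, and the model class names shown are those of recent SDK versions.

# Minimal sketch; names are placeholders, not the names used in the video.
# Assumes azure-identity and azure-mgmt-datafactory are installed and you are signed in (e.g., az login).
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference,
    DelimitedTextSource, AzureSqlSink,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

copy_step = CopyActivity(
    name="CopyCsvToSql",
    inputs=[DatasetReference(type="DatasetReference", reference_name="BlobCsvDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SqlCustomersDataset")],
    source=DelimitedTextSource(),   # reads the delimited-text (CSV) source dataset
    sink=AzureSqlSink(),            # writes into the Azure SQL destination dataset
)

client.pipelines.create_or_update(
    "<resource-group>", "<data-factory-name>", "CsvToSqlPipeline",
    PipelineResource(activities=[copy_step]),
)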
⏰ 4. Trigger Execution
Choose how and when your ritual runs.
- Manual Trigger: Click Debug or Trigger Now
- Scheduled Trigger: Add a Time-based trigger
- Event Trigger: Trigger when a new blob is uploaded
📊 5. Monitor Pipeline
Track your movement’s pulse; a programmatic sketch for triggering and monitoring a run follows this list.
- Go to Monitor tab in ADF Studio
- View run history, success/failure, duration
- Drill into activity logs for diagnostics
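Steps 4 and 5 can also be done programmatically. Below is a minimal sketch that uses the azure-mgmt-datafactory client to trigger the pipeline and poll its run status; the resource group, factory, and pipeline names are placeholders for whatever you used above.

import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off a run of the pipeline built in step 3 (names are placeholders)
run = client.pipelines.create_run("<resource-group>", "<data-factory-name>", "CsvToSqlPipeline")

# Poll until the run leaves the in-progress states
while True:
    status = client.pipeline_runs.get("<resource-group>", "<data-factory-name>", run.run_id)
    if status.status not in ("Queued", "InProgress"):
        break
    time.sleep(15)

print(status.status, status.message)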
🧠 Bonus: SQL Table Creation Example
CREATE TABLE Customers (
    CustomerID INT,
    Name VARCHAR(100),
    Email VARCHAR(100),
    JoinDate DATE
);
📚 Reference Tutorials
- Microsoft’s official walkthrough
- SQLShack’s practical guide
- GitHub demo pipeline

- "From Raw to Ritual: ETL Data Engineering with Azure Data Factory"
- "Pipeline Power: Transforming Data with Azure Data Factory"
-- "ETL Unleashed: Scalable Data Engineering in the Cloud"
- "Azure Data Factory Masterclass: ETL for Real-Time Impact"
- "Blob to Dashboard: ETL Magic with Azure Data Factory"
- "Orchestrate the Flow: Data Engineering with Azure’s ETL Engine"

https://youtu.be/YmlA2Dg12GU


15/08/2025

Data visualization with Azure Databricks Charts and Graphs
by Joey Gomez de Jesus
📞whatsapp +639602727215

Introduction to Databricks Charts and Graphs
Visualizing data is a key step in turning raw information into meaningful insights. In Databricks, you can create charts and graphs directly within your notebooks or SQL queries, allowing you to quickly explore patterns, trends, and outliers without switching to another tool.

Why Use Charts in Databricks?
Immediate insights – View data visually right after running a query.
Interactive exploration – Filter, group, and adjust visualization settings on the fly.
Collaboration-ready – Share your Databricks notebooks with teammates and keep both code and visuals in one place.

Common Chart Types in Databricks:
Bar Chart – Compare categories or discrete values.
Line Chart – Show trends over time.
Scatter Plot – Explore relationships between variables.
Pie Chart – Visualize proportions of a whole.
Map – Plot geographic data.

Where You Can Create Charts:
SQL Editor – Run a query, then click the chart icon to switch from table view to a visualization.
Notebooks – Use %sql or Python plotting libraries (like matplotlib or plotly) for advanced visuals; see the sketch below.
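As a minimal sketch of the notebook route: the cell below assumes a Customers table like the one created in the ADF post above exists in your workspace (any table of yours will do) and plots sign-ups per join date with matplotlib; calling the built-in display(df) instead would open the interactive chart builder.

# Run inside a Databricks notebook cell; "spark" is the session Databricks provides.
import matplotlib.pyplot as plt

# Hypothetical query against the Customers table used earlier in this series
df = spark.sql(
    "SELECT JoinDate, COUNT(*) AS signups FROM Customers GROUP BY JoinDate ORDER BY JoinDate"
)

pdf = df.toPandas()  # bring the small result set to the driver for plotting
plt.bar(pdf["JoinDate"].astype(str), pdf["signups"])
plt.xlabel("Join date")
plt.ylabel("Sign-ups")
plt.title("Customer sign-ups per day")
plt.show()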
Workflow Example:
1. Write a SQL query to retrieve your data.
2. Click the “Visualization” tab in the results pane.
3. Choose a chart type and configure settings (X-axis, Y-axis, aggregations, colors).
4. Save the visualization so it appears next time you run the cell.
With Databricks charts and graphs, you can go from data to decisions without leaving the platform. This keeps your analysis flow smooth and your insights clear.




14/08/2025

https://youtu.be/VLBICcmr2NE
Connecting Database Development Tools to Azure Databricks Datawarehouse
DBeaver, VS Code, and SQL Workbench/J for the Azure Databricks (ADB) data warehouse
by Joey Gomez de Jesus Jr
Donate on Paypal - https://paypal.me/awareit
Business Insight Engineering Free Training

🛠️ Part 1 – Azure Deployment Environment (ADE)
https://youtu.be/HvA3nY1FkMI
🚪 Part 2 – Intro to Azure Databricks (14-day trial)
https://www.youtube.com/watch?v=JFxnBOzoxIk
🤖 Part 3 – AI-Assisted Dev with Databricks, SQL Server, PySpark
https://www.youtube.com/watch?v=nuUyia7Hoag
🔄 Part 4 – Load SQL Server Data into Databricks with PySpark + AI
https://www.youtube.com/watch?v=k-3QW_XqFMg
🔐 Part 5 – Storing Secrets in Azure Key Vault for Databricks
https://www.youtube.com/watch?v=IS0cHXlOWQI
🛠️ Part 6 – Scheduling Azure Databricks jobs
https://youtu.be/X2nUjBIi6GQ
🧩 Part 7 – Connecting Database Development Tools to Azure Databricks Datawarehouse
https://www.youtube.com/watch?v=VLBICcmr2NE

🧩 Connect DBeaver to Azure Databricks SQL Warehouse

✅ Requirements

DBeaver Community Edition (CE) installed
Azure Databricks JDBC Driver downloaded
Access to an Azure Databricks SQL Warehouse or cluster
🛠️ Step-by-Step Setup
1. Install DBeaver
   - Download from DBeaver’s official site, or use a package manager:
     - Windows: choco install dbeaver
     - macOS: brew install --cask dbeaver-community
2. Download and Configure the JDBC Driver
   - Get the Databricks JDBC Driver
   - Extract the .jar file (e.g., DatabricksJDBC42.jar)
3. Add the Driver in DBeaver
   - Open DBeaver → Database → Driver Manager
   - Click New → name it “Databricks”
   - Under the Libraries tab, click Add File and select the .jar file
   - Under the Settings tab, set:
     - Driver Class: com.databricks.client.jdbc.Driver
     - URL Template: jdbc:databricks://ServerHostname:443;transportMode=http;ssl=1;httpPath=HTTPPath;AuthMech=3;UID=token;PWD=PersonalAccessToken
4. Create the Connection
   - Go to Database → New Connection
   - Choose the “Databricks” driver you just created
   - Fill in:
     - Server Hostname: found in your Databricks workspace
     - HTTP Path: from your SQL Warehouse connection details
     - Authentication: use a Personal Access Token
🔐 Notes
Metadata queries (e.g., listing tables) may incur charges. You can disable them in DBeaver settings to avoid unexpected costs.
For advanced setups (SSH, proxies, secure storage), see DBeaver’s Databricks documentation.
🧩 Connect SQL Workbench/J to Azure Databricks SQL Warehouse
✅ Requirements
SQL Workbench/J installed (tested with Build 130)
Databricks JDBC Driver
Azure Databricks SQL Warehouse or cluster
Personal Access Token or Microsoft Entra ID token
🛠️ Step-by-Step Setup
1. Install SQL Workbench/J
   - Download from the SQL Workbench/J official site
   - Ensure Java is installed (e.g., Zulu OpenJDK 21.0.1)
2. Download the JDBC Driver
   - Extract the .jar file from the downloaded ZIP
3. Configure the Driver in SQL Workbench/J
   - Launch SQL Workbench/J
   - In the Select Connection Profile dialog, open Manage Drivers
   - Click New and name it “Databricks”
   - In the Library field, select the .jar file
   - Ensure the Classname is set (com.databricks.client.jdbc.Driver)
4. Create a Connection Profile
   - Click New Profile and name it (e.g., “Azure Databricks”)
   - Select the Databricks driver
   - In the URL field, use this format, replacing the placeholders with your actual values:
     jdbc:databricks://ServerHostname:443;transportMode=http;ssl=1;httpPath=HTTPPath;AuthMech=3;UID=token;PWD=PersonalAccessToken
5. Test and Save
   - Click Test to verify the connection
   - Click OK to save
🔐 Security Tip
Use service principal tokens for automated tools instead of workspace user tokens for better security.
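The same three connection details (Server Hostname, HTTP Path, and a token) also work outside GUI tools. As a minimal sketch, here is how you could query the warehouse from Python with the databricks-sql-connector package; the hostname, path, and token values shown are placeholders.

from databricks import sql  # pip install databricks-sql-connector

with sql.connect(
    server_hostname="adb-0000000000000000.0.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/<warehouse-id>",                # placeholder
    access_token="<personal-access-token>",                        # placeholder
) as conn:
    with conn.cursor() as cursor:
        cursor.execute("SELECT current_catalog(), current_schema()")
        print(cursor.fetchall())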


11/08/2025

PART 6 – SCHEDULING AZURE DATABRICKS JOBS
Schedule jobs in Databricks using simple intervals or Quartz cron syntax like 0 0 5,21 ? * *. Go to Jobs & Pipelines, add a trigger, and set time zones. Automate notebooks, scripts, or pipelines for consistent workflows. Minimum interval: 10 seconds.

Donate on Paypal - https://paypal.me/awareit
Business Insight Engineering Free Training
🛠️ Part 1 – Azure Deployment Environment (ADE)
https://youtu.be/HvA3nY1FkMI
🚪 Part 2 – Intro to Azure Databricks (14-day trial)
https://www.youtube.com/watch?v=JFxnBOzoxIk
🤖 Part 3 – AI-Assisted Dev with Databricks, SQL Server, PySpark
https://www.youtube.com/watch?v=nuUyia7Hoag
🔄 Part 4 – Load SQL Server Data into Databricks with PySpark + AI
https://www.youtube.com/watch?v=k-3QW_XqFMg
🔐 Part 5 – Storing Secrets in Azure Key Vault for Databricks
https://www.youtube.com/watch?v=IS0cHXlOWQI
🛠️ Part 6 – Scheduling Azure Databricks jobs
https://youtu.be/X2nUjBIi6GQ
🛠️ Creating Jobs in Azure Databricks
Jobs in Databricks are the orchestration backbone for automating notebooks, scripts, SQL queries, and pipelines. They allow you to define tasks, set dependencies, and schedule executions; a programmatic sketch follows the steps below.
🔹 Steps to Create a Job
1. Navigate to Jobs & Pipelines in the Databricks workspace sidebar.
2. Click “Create Job”.
3. Define the Job Name and select a Notebook, Python script, JAR, or dbt project as the task.
4. Choose a compute resource:
• Serverless compute
• Jobs compute
• All-purpose cluster
5. Configure task parameters, email alerts, and retry policies.
6. Add additional tasks to build a DAG (Directed Acyclic Graph) structure if needed.
📘 Learn more: Configure and edit Lakeflow Jobs
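The same job can also be created from code. Here is a minimal sketch using the databricks-sdk Python package; the notebook path and job name are placeholders, and without an explicit cluster spec the job would rely on serverless jobs compute being available in your workspace.

from databricks.sdk import WorkspaceClient  # pip install databricks-sdk
from databricks.sdk.service import jobs

# Reads the workspace URL and token from environment variables or ~/.databrickscfg
w = WorkspaceClient()

created = w.jobs.create(
    name="nightly-etl",  # placeholder job name
    tasks=[
        jobs.Task(
            task_key="run_etl_notebook",
            notebook_task=jobs.NotebookTask(notebook_path="/Workspace/Users/<you>/etl"),  # placeholder path
        )
    ],
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0 5,21 ? * *",  # 05:00 and 21:00 every day (Quartz syntax)
        timezone_id="UTC",
    ),
)
print(created.job_id)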
⏰ Scheduling ETL Workflows
Databricks supports multiple scheduling mechanisms to automate ETL workflows:
🔹 Time-Based Scheduling
• Use simple schedules (e.g., every 12 hours).
• Or define advanced schedules using Quartz Cron syntax.
• Set time zones, start times, and intervals.
🔹 Event-Based Triggers
• Trigger jobs when files arrive in Unity Catalog storage.
• Use continuous triggers to rerun jobs upon completion or failure.
🔹 Manual & External Triggers
• Run jobs manually or via orchestration tools like Apache Airflow (see the sketch below).
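For the manual/external case, any tool that can make an HTTP call can trigger a job through the Jobs REST API. A minimal sketch with Python's requests library, assuming a placeholder workspace URL, token, and the job_id returned when the job was created:

import requests

host = "https://adb-0000000000000000.0.azuredatabricks.net"      # placeholder workspace URL
token = "<personal-access-token-or-service-principal-token>"     # placeholder

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 123456},  # placeholder: the job_id from the creation step
)
resp.raise_for_status()
print("Triggered run:", resp.json()["run_id"])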


07/08/2025

Preview of SQL Server 2025 with AI-Aided Development

Preview of SQL Server 2025 and AI-integrated development in SQL Server

28/07/2025

Stocks vs. Crypto No More!
We can have the best of both worlds.


26/07/2025

CAN YOU MAKE A LIVING TRADING CRYPTO?

Can You Make a Living Trading Crypto, and How Much Money Do You Need?

24/07/2025

Sign up to consph and get rewards. :)
