
I am connecting a SQL Server instance (managed through SSMS) on my personal computer to Databricks using JDBC.

Here is my code:

database_host = "192.168.xxxxx.xxxxx"
database_port = "1433"  # update if you use a non-default port
database_name = "PRACTICE"
table = "TblGender"
user = "sa"
password = "somepassword"

driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
url = f"jdbc:sqlserver://{database_host}:{database_port};database={database_name}"

remote_table = (spark.read
    .format("jdbc")
    .option("driver", driver)
    .option("url", url)
    .option("dbtable", table)
    .option("user", user)
    .option("password", password)
    .load()
)

remote_table.show()

I have enabled TCP/IP and set up the inbound firewall rule, but I am still not able to connect.

I am getting this error:

INTERNAL: The TCP/IP connection to the host 192.168.xxx.xxx, port 1433 has failed. Error: "Connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall.".

I am not sure what is incorrect in this code, but I wanted to ask:

  1. Is the host name correct? I am using the IPv4 address fetched by running ipconfig in cmd.
  2. If I use the server name instead, I get the same error.

Please help me identify the problem.


Asked Mar 27 at 14:01 by Upendra Dwivedi. Edited Mar 27 at 14:03 by Thom A.
  • SSMS is just an IDE-like application that is used with several products such as (but not limited to) SQL Server, Azure SQL Edge, and Azure Synapse. SSMS is likely irrelevant to the question here. What product are you actually using for your data engine? Tagging [ssms] for your data engine is like tagging [visual-studio] for your programming language; it doesn't actually tell us anything. – Thom A, Mar 27 at 14:02
  • Where is this Databricks instance running? On your local computer, or somewhere else? – siggemannen, Mar 27 at 17:18
  • Databricks is hosted on Azure. For a demo I want to create a direct connection to an on-prem SQL Server instance. – Upendra Dwivedi, Mar 28 at 5:18

1 Answer


You can connect your on-premises SQL Server to Databricks using the JDBC driver available in Databricks Runtime. Below are the high-level steps to establish this connection:

Steps to Connect Azure Databricks to an On-Premises SQL Server

  • Set up a Transit Virtual Network (VNet) with an Azure Virtual Network Gateway. This enables secure communication between Azure Databricks and your on-premises network.

  • Peer the Azure Databricks Virtual Network with the Transit VNet. VNet peering allows Databricks traffic to be routed through the transit VNet to your on-premises environment.

  • Create User-Defined Routes (UDRs) and Associate Them with Databricks Subnets. UDRs ensure that traffic destined for the on-premises network is correctly routed through the transit VNet.

  • Test connectivity from Databricks. Use tools like telnet, nc, or PowerShell to verify that your Databricks clusters can reach the SQL Server on port 1433, then run a test JDBC connection in Databricks to confirm end-to-end connectivity.
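The connectivity check in the last step can also be done with a few lines of Python in a notebook cell, which avoids depending on nc or telnet being installed on the cluster. This is a minimal sketch; the host and port below are placeholders for your SQL Server's address:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable-host errors.
        return False

# Hypothetical example; substitute your SQL Server's address:
# can_connect("192.168.0.10", 1433)
```

If this returns False (or hangs until the timeout), the problem is network routing or firewalling rather than the JDBC code, which matches the "Connect timed out" error in the question.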

Reference: Connect your Azure Databricks Workspace to your on-premises network
