
I want to move a table to another database, but Spark doesn't allow this.

So how can I copy a table with spark-sql?

I already tried this:

SELECT *
INTO table1 IN new_database
FROM old_database.table1

But it did not work.

4 Answers


Maybe try:

CREATE TABLE new_db.new_table AS
SELECT *
FROM old_db.old_table;

1 Comment

This will not preserve partitioning and storage format (still looking for a way to do so myself)
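A note on that: in Spark 3.x the CTAS can at least declare the storage format and partition columns explicitly. A minimal sketch, where the parquet format and the dt partition column are assumptions rather than details from the question:

-- Declare storage format and partitioning explicitly in the CTAS
-- (format and partition column here are illustrative assumptions)
CREATE TABLE new_db.new_table
USING PARQUET
PARTITIONED BY (dt)
AS SELECT * FROM old_db.old_table;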

To preserve the partitioning and storage format, do the following.
First, get the complete schema of the existing table by running:

show create table db.old_table

The query outputs the full CREATE TABLE statement, which you can run after changing the path and table name.
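The exact output depends on how the table was created; it might look roughly like the hypothetical example below (column names, format, and location are assumptions, not details from the question), already edited to point at the new table name and path:

-- Hypothetical, edited SHOW CREATE TABLE output; columns, format, and path are assumptions
CREATE TABLE db.new_table (
  id BIGINT,
  dt STRING)
USING parquet
PARTITIONED BY (dt)
LOCATION 'hdfs:///warehouse/db.db/new_table'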
Then insert all the rows into the new, empty table using:

insert into db.new_table select * from db.old_table



The following snippet will create a new table while preserving the definition of the "old" table.

CREATE TABLE db.new_table LIKE db.old_table;

For more info, see the CREATE TABLE documentation.

1 Comment

This only copies the structure, not the data.

If you're using plain Spark SQL, you can copy the table with CREATE TABLE ... LIKE and then copy the data:

CREATE TABLE new_db.new_table LIKE old_db.old_table;
INSERT INTO new_db.new_table SELECT * FROM old_db.old_table;

If you're using Databricks, you can use CREATE TABLE CLONE:

CREATE OR REPLACE TABLE new_db.new_table DEEP CLONE old_db.old_table;

You have the option to do a deep clone (which copies the Parquet files) or a shallow clone (which shares Parquet files between source and target); see ref.
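For comparison, a shallow clone of the same table would look like this (again Databricks / Delta Lake syntax, table names as above):

CREATE OR REPLACE TABLE new_db.new_table SHALLOW CLONE old_db.old_table;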

