pgii Backup export and import functions

pgii can be used as a PostgreSQL export and import tool; it can dump and load large tables containing millions of rows.

Generate test data in PostgreSQL

Table structure and the statement used to insert 5,000,000 rows of test data:

CREATE TABLE t_test (
    ID      INT  PRIMARY KEY NOT NULL,
    NAME    TEXT NOT NULL,
    AGE     INT  NOT NULL,
    ADDRESS CHAR(50),
    SALARY  REAL
);

INSERT INTO t_test
SELECT
    generate_series(1, 5000000) AS key,
    repeat(chr(int4(random() * 26) + 65), 4),
    (random() * (6 ^ 2))::integer,
    null,
    (random() * (10 ^ 4))::integer;
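
Before dumping, it is worth spot-checking a few of the generated rows. The query below is plain PostgreSQL; it can be run from psql or from the pgii prompt, which accepts SELECT statements as the later examples show.

-- Optional spot check of the generated test data
SELECT id, name, age, salary
FROM t_test
ORDER BY id
LIMIT 5;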

View the generated data

pgi~[yc/pgii]# show tb;
+--------+-----------+------------+------------+-----------+-----------+
| SCHEMA | TABLENAME | TABLEOWNER | TABLESPACE | TABLESIZE | INDEXSIZE |
+--------+-----------+------------+------------+-----------+-----------+
| pgii   | t_test    | postgres   | <nil>      | 356 MB    | 107 MB    |
+--------+-----------+------------+------------+-----------+-----------+

pgi~[yc/pgii]# select count(*) from t_test;
+---------+
|   COUNT |
+---------+
| 5000000 |
+---------+
[Total: 1 Rows]  [RunTimes 0.20s]

DUMP Commands

  • dump table: exports the structure and data of the specified table; if the table holds more than 50,000 rows, the data is split into batch import files of 50,000 rows each ✅ (a batching sketch follows this list);
  • dump schema: exports the creation statement of the selected schema together with the structure and data of every table in it; any table with more than 50,000 rows is likewise split into 50,000-row batch files ✅;
  • dump database: todo (in development)
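
For intuition, the sketch below shows one way to cut a large table into 50,000-row chunks in plain PostgreSQL, using id ranges and COPY. It only illustrates the batching idea: pgii writes its own compressed .pgi files rather than CSV, and the output path here is hypothetical.

-- Illustration only, not pgii's implementation: export the first 50,000-row chunk;
-- later chunks shift the id range by 50,000 (50001..100000, and so on).
COPY (
    SELECT id, name, age, address, salary
    FROM t_test
    WHERE id > 0 AND id <= 50000
    ORDER BY id
) TO '/tmp/t_test_chunk_1.csv' WITH (FORMAT csv);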

dump <tb|table>

Instructions:

Exports the structure and data of the specified table to the corresponding .pgi compressed files.

Usage:

pgi~[yc/pgii]# dump tb t_test;
 Dump Table Success [t_test].....

The command creates a folder named dump_table_pgii_1686301979 in the directory where it was run; the folder contains the backup files exported from the t_test table.

dump <sc|schema>

Instructions:

Exports every table and its data in the specified schema to the corresponding .pgi compressed files. A schema must be selected before running this command; otherwise it cannot complete.
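
If no schema is selected yet, pick one first with use sc (the same schema-selection command used in the load examples further down):

pgi~[yc/]# use sc pgii;
  Use Schema Success!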

Usage:

pgi~[yc/pgii]# show tb;
+--------+-----------+------------+------------+-----------+-----------+
| SCHEMA | TABLENAME | TABLEOWNER | TABLESPACE | TABLESIZE | INDEXSIZE |
+--------+-----------+------------+------------+-----------+-----------+
| pgii   | t_test    | postgres   | <nil>      | 356 MB    | 107 MB    |
+--------+-----------+------------+------------+-----------+-----------+
| pgii   | t_user    | postgres   | <nil>      | 128 kB    | 40 kB     |
+--------+-----------+------------+------------+-----------+-----------+

pgi~[yc/pgii]# dump sc;
 Dump Schema Success [pgii]
 Dump Table Success [t_test].....
 Dump Table Success [t_user].....

The command creates a folder named dump_schema_pgii_1686302845 in the directory where it was run; the folder contains the .pgi export files generated from the tables in the pgii schema.

LOAD Commands

  • load table: imports a table's structure and data from a backup folder produced by dump tb; split 50,000-row batch files are loaded one after another ✅ (a sketch of the batch idea follows this list);
  • load schema: imports the schema creation statement plus the structure and data of all its tables from a backup folder produced by dump sc ✅;
  • load database: todo (in development)
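
For intuition only: the row count reported as Affect Nums for each file matches what a plain multi-row INSERT of that batch would affect. The sketch below is ordinary PostgreSQL, not the .pgi format (pgii's own compressed format), and the literal values are made up.

-- Illustration only: applying one hypothetical batch as a transaction-wrapped
-- multi-row INSERT; a real batch would carry up to 50,000 rows.
BEGIN;
INSERT INTO t_test (id, name, age, address, salary)
VALUES
    (1, 'AAAA', 12, NULL, 4200),
    (2, 'BBBB', 30, NULL, 9100);
COMMIT;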

load <tb|table>

Instructions:

Imports the table structure and data from the specified table backup folder into the currently selected schema, which may belong to another database; a schema must be selected before running this command.

Usage:

  • Select another database xc and the schema pgii:
pgi~[yc/]# use db xc;
  Use Database Success!
pgi~[xc/]# use sc pgii;
  Use Schema Success!
  • Load from the backup folder dump_table_pgii_1686301979 created by dump tb:
pgi~[xc/pgii]# load tb dump_table_pgii_1686301979;
Load Table Success [dump_table_pgii_1686301979/tb_t_test.pgi] Affect Nums:50000
Load Table Success [dump_table_pgii_1686301979/tb_t_test_1.pgi] Affect Nums:50000
Load Table Success [dump_table_pgii_1686301979/tb_t_test_2.pgi] Affect Nums:50000
Load Table Success [dump_table_pgii_1686301979/tb_t_test_3.pgi] Affect Nums:50000
Load Table Success [dump_table_pgii_1686301979/tb_t_test_4.pgi] Affect Nums:50000
Load Table Success [dump_table_pgii_1686301979/tb_t_test_5.pgi] Affect Nums:50000
Load Table Success [dump_table_pgii_1686301979/tb_t_test_6.pgi] Affect Nums:50000
.
.
.
Load Table Success [dump_table_pgii_1686301979/tb_t_test_99.pgi] Affect Nums:50000
Load Table Success [dump_table_pgii_1686301979/tb_t_test_100.pgi] Affect Nums:0
  • Query the table to confirm that it was imported successfully:
pgi~[xc/pgii]# show tb;                    
+--------+-----------+------------+------------+-----------+-----------+
| SCHEMA | TABLENAME | TABLEOWNER | TABLESPACE | TABLESIZE | INDEXSIZE |
+--------+-----------+------------+------------+-----------+-----------+
| pgii   | t_test    | postgres   | <nil>      | 356 MB    | 107 MB    |
+--------+-----------+------------+------------+-----------+-----------+
pgi~[xc/pgii]# select count(*) from t_test;        
+---------+
|   COUNT |
+---------+
| 5000000 |
+---------+
[Total: 1 Rows]  [RunTimes 0.20s]

load <sc|schema>

Instructions:

Imports the schema, together with the structure and data of all its tables, from the specified schema backup folder into another database.

Usage:

  • Select another database xc:
pgi~[yc/]# use db xc;
  Use Database Success!
  • Load from the backup folder dump_schema_pgii_1686302845 created by dump sc:
pgi~[xc/pgii]# load sc dump_schema_pgii_1686302845;
Load Table Success [dump_schema_pgii_1686302845/tb_t_test.pgi] Affect Nums:50000
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_1.pgi] Affect Nums:50000
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_2.pgi] Affect Nums:50000
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_3.pgi] Affect Nums:50000
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_4.pgi] Affect Nums:50000
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_5.pgi] Affect Nums:50000
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_6.pgi] Affect Nums:50000
.
.
.
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_99.pgi] Affect Nums:50000
Load Table Success [dump_schema_pgii_1686302845/tb_t_test_100.pgi] Affect Nums:0
Load Table Success [dump_schema_pgii_1686302845/tb_t_user.pgi] Affect Nums:1000
  • Query the tables to confirm that they were imported successfully:
pgi~[xc/pgii]# show tb;                    
+--------+-----------+------------+------------+-----------+-----------+
| SCHEMA | TABLENAME | TABLEOWNER | TABLESPACE | TABLESIZE | INDEXSIZE |
+--------+-----------+------------+------------+-----------+-----------+
| pgii   | t_test    | postgres   | <nil>      | 356 MB    | 107 MB    |
+--------+-----------+------------+------------+-----------+-----------+

pgi~[xc/pgii]# select count(*) from t_test;        
+---------+
|   COUNT |
+---------+
| 5000000 |
+---------+
[Total: 1 Rows]  [RunTimes 0.20s]

pgi~[xc/pgii]# select count(*) from t_user;
+-------+
| COUNT |
+-------+
|  1000 |
+-------+
[Total: 1 Rows]  [RunTimes 0.10s]