To make the data import picture complete, it is worth mentioning the restore command. After all, it is not an uncommon scenario to receive data in the form of a database, schema, or even a single-table backup.
For this scenario, let's create a backup of one of the tables imported earlier:
pg_dump -h localhost -p 5434 -U postgres -t data_import.earthquakes_subset_with_geom -c -F c -v -b -f earthquakes_subset_with_geom.backup mastering_postgis
Since the -v option was specified, you should get similarly verbose output:
pg_dump: reading schemas
pg_dump: reading user-defined tables
pg_dump: reading extensions
pg_dump: reading user-defined functions
pg_dump: reading user-defined types
pg_dump: reading procedural languages
pg_dump: reading user-defined aggregate functions
pg_dump: reading user-defined operators
pg_dump: reading user-defined operator classes
pg_dump: reading user-defined operator families
pg_dump: reading user-defined text search parsers
pg_dump: reading user-defined text search templates
pg_dump: reading user-defined text search dictionaries
pg_dump: reading user-defined text search configurations
pg_dump: reading user-defined foreign-data wrappers
pg_dump: reading user-defined foreign servers
pg_dump: reading default privileges
pg_dump: reading user-defined collations
pg_dump: reading user-defined conversions
pg_dump: reading type casts
pg_dump: reading transforms
pg_dump: reading table inheritance information
pg_dump: reading event triggers
pg_dump: finding extension members
pg_dump: finding inheritance relationships
pg_dump: reading column info for interesting tables
pg_dump: finding the columns and types of table "data_import.earthquakes_subset_with_geom"
pg_dump: flagging inherited columns in subtables
pg_dump: reading indexes
pg_dump: reading constraints
pg_dump: reading triggers
pg_dump: reading rewrite rules
pg_dump: reading policies
pg_dump: reading row security enabled for table "data_import.earthquakes_subset_with_geom"
pg_dump: reading policies for table "data_import.earthquakes_subset_with_geom"
pg_dump: reading large objects
pg_dump: reading dependency data
pg_dump: saving encoding = UTF8
pg_dump: saving standard_conforming_strings = on
pg_dump: dumping contents of table "data_import.earthquakes_subset_with_geom"
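Should you need to back up the whole data_import schema rather than a single table, a minimal variant of the command above (same connection parameters, with an illustrative output file name) swaps the -t table switch for -n:
pg_dump -h localhost -p 5434 -U postgres -n data_import -c -F c -v -b -f data_import.backup mastering_postgis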
Having backed up our table, let's drop the original one:
DROP TABLE data_import.earthquakes_subset_with_geom;
And see if we can restore it:
pg_restore -h localhost -p 5434 -U postgres -v -d mastering_postgis earthquakes_subset_with_geom.backup
You should see similar output:
pg_restore: connecting to database for restore
pg_restore: creating TABLE "data_import.earthquakes_subset_with_geom"
pg_restore: processing data for table "data_import.earthquakes_subset_with_geom"
pg_restore: setting owner and privileges for TABLE "data_import.earthquakes_subset_with_geom"
pg_restore: setting owner and privileges for TABLE DATA "data_import.earthquakes_subset_with_geom"
At this stage, we have successfully imported data using the PostgreSQL backup/restore facilities.
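As a quick sanity check (an illustrative query, not part of the workflow above), you can confirm that the restored table is back and populated:
SELECT count(*) FROM data_import.earthquakes_subset_with_geom;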
If you happen to get errors about the pg_dump version, make sure you are using the binary appropriate for the database you are exporting from. You can find it in the bin folder of the PostgreSQL installation directory.
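Both client tools report their versions, so a quick check along these lines (exact paths will vary with your installation) helps confirm you are calling binaries that match the server:
pg_dump --version
pg_restore --version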