Importing data with pg_restore

To make the data import story complete, it is worth mentioning the pg_restore command. After all, it is not an uncommon scenario to receive data in the form of a database, schema, or even single-table backup.

For this scenario, let's create a backup of one of the tables imported before:

pg_dump -h localhost -p 5434 -U postgres -t data_import.earthquakes_subset_with_geom -c -F c -v -b -f earthquakes_subset_with_geom.backup mastering_postgis
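The -F c switch produces a custom-format archive that pg_restore can read, -t limits the dump to a single table, -c adds clean (drop) commands, and -b includes large objects. If you received the data as a whole schema instead, a similar command with -n should dump all of its tables (a sketch, assuming the same connection parameters; the output file name is hypothetical):

pg_dump -h localhost -p 5434 -U postgres -n data_import -c -F c -v -b -f data_import_schema.backup mastering_postgis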

Since the -v option was specified, you should get similarly verbose output:

pg_dump: reading schemas
pg_dump: reading user-defined tables
pg_dump: reading extensions
pg_dump: reading user-defined functions
pg_dump: reading user-defined types
pg_dump: reading procedural languages
pg_dump: reading user-defined aggregate functions
pg_dump: reading user-defined operators
pg_dump: reading user-defined operator classes
pg_dump: reading user-defined operator families
pg_dump: reading user-defined text search parsers
pg_dump: reading user-defined text search templates
pg_dump: reading user-defined text search dictionaries
pg_dump: reading user-defined text search configurations
pg_dump: reading user-defined foreign-data wrappers
pg_dump: reading user-defined foreign servers
pg_dump: reading default privileges
pg_dump: reading user-defined collations
pg_dump: reading user-defined conversions
pg_dump: reading type casts
pg_dump: reading transforms
pg_dump: reading table inheritance information
pg_dump: reading event triggers
pg_dump: finding extension members
pg_dump: finding inheritance relationships
pg_dump: reading column info for interesting tables
pg_dump: finding the columns and types of table "data_import.earthquakes_subset_with_geom"
pg_dump: flagging inherited columns in subtables
pg_dump: reading indexes
pg_dump: reading constraints
pg_dump: reading triggers
pg_dump: reading rewrite rules
pg_dump: reading policies
pg_dump: reading row security enabled for table "data_import.earthquakes_subset_with_geom"
pg_dump: reading policies for table "data_import.earthquakes_subset_with_geom"
pg_dump: reading large objects
pg_dump: reading dependency data
pg_dump: saving encoding = UTF8
pg_dump: saving standard_conforming_strings = on
pg_dump: dumping contents of table "data_import.earthquakes_subset_with_geom"
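Before dropping anything, you can also peek at what the archive contains; pg_restore's -l option prints the archive's table of contents without touching the database (a minimal sketch, using the backup file created above):

pg_restore -l earthquakes_subset_with_geom.backup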

Having backed up our table, let's drop the original one:

DROP TABLE data_import.earthquakes_subset_with_geom;

And see if we can restore it:

pg_restore -h localhost -p 5434 -U postgres -v -d mastering_postgis earthquakes_subset_with_geom.backup

You should see output similar to the following:

pg_restore: connecting to database for restore
pg_restore: creating TABLE "data_import.earthquakes_subset_with_geom"
pg_restore: processing data for table "data_import.earthquakes_subset_with_geom"
pg_restore: setting owner and privileges for TABLE "data_import.earthquakes_subset_with_geom"
pg_restore: setting owner and privileges for TABLE DATA "data_import.earthquakes_subset_with_geom"

At this stage, we have successfully imported data using the PostgreSQL backup/restore facilities.
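To double-check that the data is really back, a quick query against the restored table should return the original rows (a minimal sketch; run it against the mastering_postgis database):

SELECT count(*) FROM data_import.earthquakes_subset_with_geom;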

If you happen to get errors about the pg_dump version, make sure you are using the binary appropriate for the database you are exporting from. You can find it in the bin folder of the PostgreSQL installation directory.
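A quick way to confirm which versions you are running is to ask the binaries themselves (a minimal sketch; the flag is the same for both tools):

pg_dump --version
pg_restore --version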