Mapping Vietnam
== Data sources ==
[http://www.rjsmith.com/topo_map.html#clist2a Ray's Map Room], a collection of scanned topo maps
Place names from NIMA: ftp://ftp.nga.mil/pub2/gns_data/ (the vm, cb, and la files)
[[Converting a NIMA text file into geojson]] with a Python script
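That page has the details; as a rough illustration, the conversion amounts to something like the minimal sketch below. It assumes the tab-delimited GNS file has LAT, LONG, and FULL_NAME columns (check the header of your download, the column names have varied over the years) and it ignores character-encoding details.
<pre>
#!/usr/bin/env python
# Minimal sketch: convert a tab-delimited NGA GNS names file to GeoJSON.
# Assumes the header row contains LAT, LONG and FULL_NAME columns;
# adjust the names to match the actual file.
import csv
import json
import sys

def gns_to_geojson(infile, outfile):
    features = []
    with open(infile) as f:
        for row in csv.DictReader(f, delimiter="\t"):
            features.append({
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    "coordinates": [float(row["LONG"]), float(row["LAT"])],
                },
                "properties": {"name": row["FULL_NAME"]},
            })
    with open(outfile, "w") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f)

if __name__ == "__main__":
    gns_to_geojson(sys.argv[1], sys.argv[2])   # e.g. vm.txt vm.geojson
</pre>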
== The vision ==
Build a web map server for the [http://www.vhpa.org/ Vietnam Helicopter Pilots Association] to share current and historic maps of Vietnam.
This is a first-person project. The whole point is to directly involve the crews of the helicopters who served in Vietnam. We want to write our history accurately and not leave it to academics to rewrite it the way they think it should have been.
The Vietnam Archives at Texas Tech University is our official archive and has agreed to provide us with information to include on overlays.
We have database tables of points to add, such as helicopter crashes, ground casualties, battles, heliports, base camps, fire support bases, etc. Some of these would be specific to a date range. For example, a specific fire support base or battle may have existed only from May 1968 until October 1968.
Each point can have backup material for pop-up windows, such as an image of the approach plate for a specific heliport (we have about 300 of those), a picture of the crashed helicopter, or details on the battle in ASCII, HTML, or PDF. We are also thinking about video. As an example, I have video of burning helicopters from a sapper attack (http://www.242ashc.org/vhpa.mpg), so an icon on a layer would be the window to see that video.
On top of this, we would like users to be able to add information, pictures, and video, either directly or through an administrator.
== The implementation ==
Your input and comments are welcome. Either edit the page directly or use the discussion link.
The site will use the UTM projection and have an MGRS grid system.
The site will have a base map layer created from Vietnam War-era DMA (Defense Mapping Agency) maps that have been scanned and archived by USGS.
The project will be rolled out on a GoDaddy server.
* We can use OpenLayers to implement the browser portion.
* There can be several vector layers for the overlays.
* Tiled maps will be created from the DMA/USGS GeoPDF files.
* A Google Maps layer will add current air photos.
The site needs user accounts so that we can create roles to control access.
* An ''admin'' user can edit any data.
* A ''confirmed user'' can edit only some of the data and can add comments.
* ''Anonymous'' users can only read and view site contents.
=== Server options ===
'''Vector layers''' --
Vector layers (points, lines, polygons) could be served from either [[Mapserver]] or [[GeoServer]].
Vector data will be stored in a [[PostGIS]] database for easy editing and attribution from the Web site. Table data can be stored in the same [[PostgreSQL]] server that serves the PostGIS data.
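To make the date-range idea from the vision section concrete, here is a minimal sketch of how one of the point tables might look in PostGIS, created through psycopg2. The database name, table name, and columns are my own guesses, not a settled schema.
<pre>
#!/usr/bin/env python
# Sketch only: create one hypothetical date-ranged point table in PostGIS.
import psycopg2

conn = psycopg2.connect("dbname=vietnam user=mapper")  # hypothetical connection
cur = conn.cursor()
cur.execute("""
    CREATE TABLE fire_support_bases (
        id          serial PRIMARY KEY,
        name        text,
        active_from date,   -- e.g. 1968-05-01
        active_to   date    -- e.g. 1968-10-31
    )
""")
# AddGeometryColumn works on both PostGIS 1.x and 2.x.
cur.execute("SELECT AddGeometryColumn('fire_support_bases', 'geom', 4326, 'POINT', 2)")
conn.commit()
cur.close()
conn.close()
</pre>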
'''Tile layers''' -- The GeoPDF files are turned into tiles. The links at the top of this page just serve tiles stored on an Apache server. I am not sure what we should use ultimately.
'''Aerial photos''' -- Current air photos are pulled in from Google Maps.
* Create a list of what specific vector layers will be used.
Suggestions for layers:
* There should be a layer that anyone can edit.
* There should probably be a layer created from archives.
=== Server software ===
[[Featureserver]] will allow editing vector layers, but it seems to be frozen in time; there have been no updates in a long time. [[MapFish]] seems like a better candidate to me at the moment.
I have taken [[Mapserver]] out for a spin a few times, but it does not support WFS-T, so it is not easy to get data from the browser back into the database.
[[GeoServer]] supports WFS-T, so I am going to try it. It requires Tomcat. GoDaddy supports Tomcat. Hostmonster (where wildsong.biz lives) does not, so I will be installing it at Tektonic, where hupi.org lives.
=== Obtain source maps ===
'''I need the raster maps.'''
If you already know the numbers of the maps you need,
# Go to the USGS site http://store.usgs.gov/
# Click "Advanced Search"
# Enter the map number in "Old Material"
# Click "Search"
# Click on the map image
# Save the PDF file
Samples for testing:
* 1501ANC4810
* 1501ANC4811
* 1501ANC4815
You can also use the search features to find what you need.
'''For testing I am also going to get some vector data.'''
From GIS Data Depot: populated places, aeronautical, roads, railroads. That should be enough for right now.
These files are in E00 format, so I am going to describe loading them on this page: [[Loading data into PostGIS]]
=== Working with GeoPDFs ===
The GeoPDFs downloaded from USGS were scanned from paper maps, converted to a digital form (probably TIFF), and then georeferenced. Then they were converted to GeoPDF format using TerraGo Publisher.
Each scan covers only a relatively small area on the ground, so to create a complete, seamless map for use on the web, we must:
# Convert the GeoPDFs to some format that we can further process.
# Remove the collars (the border containing the legend, dates, etc.).
# Process the set of resulting files into tiles that can be served.
==== Step one: Reading GeoPDF ====
If you are using ArcMap, you can use "TerraGo Publisher for ArcGIS" ($2700) to import GeoPDFs directly. At that point you can treat the GeoPDF data as you would any other ArcMap data source. Unfortunately, $2700 is outside my budget.
Another excellent commercial product is [http://www.globalmapper.com/ Global Mapper] (retails for $349). It can read the PDFs and export them as GeoTIFFs. As an added plus, it can remove the collars and can also reproject the files if needed. I am told it is scriptable too.
You could ignore the geospatial encoding in the source PDFs, convert them into a format (TIFF, for example) readable by ArcMap, and then georeference them, but that's a lot of work given that the PDFs are already georeferenced!
The very latest version of [http://www.gdal.org/ GDAL] (1.8.0) can read GeoPDFs. It's free and I have worked with it in the past, so I am giving it a try.
==== Recipe to convert a GeoPDF to a GeoTIFF using GDAL ====
 % '''gdal_translate -of GTiff -co "TILED=YES" -co "TFW=YES" 1501ANC4815_geo.pdf 1501ANC4815_geo.tif'''
This generated a 47 MB TIFF that is 4470 x 3375 pixels.
You can learn more about GeoTIFF options here: http://www.gdal.org/frmt_gtiff.html A good one to use might be -co "COMPRESS=LZW".
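Since there will eventually be many sheets, here is a minimal sketch of batching that recipe from Python by shelling out to gdal_translate. The directory layout and creation options are just assumptions, not a settled workflow.
<pre>
#!/usr/bin/env python
# Sketch: run the gdal_translate recipe above over every GeoPDF in a folder.
# Assumes gdal_translate is on the PATH and the *_geo.pdf naming used above.
import glob
import os
import subprocess

if not os.path.isdir("geotiff"):
    os.makedirs("geotiff")

for pdf in glob.glob("geopdf/*_geo.pdf"):
    tif = os.path.splitext(os.path.basename(pdf))[0] + ".tif"
    cmd = ["gdal_translate", "-of", "GTiff",
           "-co", "TILED=YES", "-co", "COMPRESS=LZW",
           pdf, os.path.join("geotiff", tif)]
    print(" ".join(cmd))
    subprocess.check_call(cmd)
</pre>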
==== Step two: Removing collar ====
With (the Windows program) Global Mapper this is as simple as checking an option box for the layer. But it's a Windows-only program.
With GDAL the steps are:
# Use gdalinfo to read the header to find the location of the collar.
# Use a script to find the corners (there can be many points defining the edges but I only want the corners).
# Convert the raster to a GCS using gdalwarp.
# Convert the extracted corners to a GCS using gdaltransform.
# Remove the collar using gdal_translate.
Note that it is also possible to simply trim the collar away by trial and error, for example,
 % '''gdal_translate -srcwin 350 0 4120 2750 -of GTiff -co "TILED=YES" 1501ANC4815_geo.pdf 1501ANC4815_geo.tif'''
==== 1 Detect collar using gdalinfo ====
According to the "Acrobat Supplement to the ISO 32000" there is a definition in the file that tells me where the map lives. The gdalinfo command reveals the NEATLINE setting (the ul, ll, lr, ur annotations marking the corners are mine):
 NEATLINE=POLYGON ((
 445035.80158801417565 999277.556482305750251,    ul
 444993.235997625160962 991902.349403560161591,
 444864.738057852664497 884201.712978136376478,   ll
 452159.580137342331 884208.817744017927907,
 585964.54743435990531 884289.527591207530349,
 614300.221590773784555 884383.269830735283904,   lr
 614190.235319046885706 999293.477391553344205,   ur
 605175.140901734353974 999218.552309990860522,
 553978.789515793323517 999300.979716640315019,
 505958.457250406034291 999353.427943945163861,
 451172.635888226970565 999382.750743734533899,
 445035.722449889173731 999360.237513555795886,
 445035.80158801417565 999277.556482305750251,
 445035.80158801417565 999277.556482305750251,
 445035.80158801417565 999277.556482305750251))
The corners of the image, collar included, are given as:
 Corner Coordinates:
 Upper Left  (  429999.053,  999725.915) (104d21'46.72"E,  9d 2'39.27"N)
 Lower Left  (  430135.804,  856853.093) (104d21'58.76"E,  7d45' 6.78"N)
 Upper Right (  619287.590,  999910.271) (106d 5' 7.80"E,  9d 2'41.48"N)
 Lower Right (  619424.341,  857037.449) (106d 4'59.36"E,  7d45' 9.52"N)
 Center      (  524711.697,  928381.682) (105d13'28.20"E,  8d23'57.68"N)
These corners correspond to the edges of the "paper". To visualize things, I put the neatline points and the corners into a text file, loaded the map TIFF into QGIS, and used the "Add delimited text file" plugin to add the text file to the map. The points all fell onto the neatline.
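Instead of copying numbers out of the gdalinfo output by hand, the same information can be read with GDAL's Python bindings. A minimal sketch, assuming the osgeo bindings are installed (the PDF driver reports the neatline in the NEATLINE metadata item):
<pre>
#!/usr/bin/env python
# Sketch: read the neatline polygon and the full-image corners from a GeoPDF.
from osgeo import gdal

ds = gdal.Open("1501ANC4815_geo.pdf")

# The PDF driver reports the neatline as a WKT polygon in dataset metadata.
print(ds.GetMetadataItem("NEATLINE"))

# The geotransform gives the corners of the whole image, collar included.
gt = ds.GetGeoTransform()
ulx, uly = gt[0], gt[3]
lrx = gt[0] + gt[1] * ds.RasterXSize + gt[2] * ds.RasterYSize
lry = gt[3] + gt[4] * ds.RasterXSize + gt[5] * ds.RasterYSize
print("Upper Left  (%.3f, %.3f)" % (ulx, uly))
print("Lower Right (%.3f, %.3f)" % (lrx, lry))
</pre>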
==== 2 Find the corners ====
I wrote a little Python to sort them. It looks like this:
<pre>
#!/usr/bin/env python
# Pick the neatline vertices that are closest to the four image corners.
import math

neatline = [
    (445035.80158801417565, 999277.556482305750251),
    (444993.235997625160962, 991902.349403560161591),
    (444864.738057852664497, 884201.712978136376478),
    (452159.580137342331, 884208.817744017927907),
    (585964.54743435990531, 884289.527591207530349),
    (614300.221590773784555, 884383.269830735283904),
    (614190.235319046885706, 999293.477391553344205),
    (605175.140901734353974, 999218.552309990860522),
    (553978.789515793323517, 999300.979716640315019),
    (505958.457250406034291, 999353.427943945163861),
    (451172.635888226970565, 999382.750743734533899),
    (445035.722449889173731, 999360.237513555795886),
    (445035.80158801417565, 999277.556482305750251),
    (445035.80158801417565, 999277.556482305750251),
    (445035.80158801417565, 999277.556482305750251) ]

# Corners of the whole image (collar included), from gdalinfo.
UpperLeft  = ( 429999.053, 999725.915)  # (104d21'46.72"E, 9d 2'39.27"N)
LowerLeft  = ( 430135.804, 856853.093)  # (104d21'58.76"E, 7d45' 6.78"N)
UpperRight = ( 619287.590, 999910.271)  # (106d 5' 7.80"E, 9d 2'41.48"N)
LowerRight = ( 619424.341, 857037.449)  # (106d 4'59.36"E, 7d45' 9.52"N)
Size = (4470, 3375)  # raster size in pixels (not used below)

ul = neatline[0]
ur = neatline[0]
ll = neatline[0]
lr = neatline[0]

def dist(p1, p2):
    dx = math.pow(p2[0] - p1[0], 2)
    dy = math.pow(p2[1] - p1[1], 2)
    return math.sqrt(dx + dy)

def closer(corner, p1, p2):
    # Return whichever of p1, p2 is nearer to the given corner.
    if dist(corner, p1) < dist(corner, p2):
        return p1
    return p2

# For each corner of the paper, keep the nearest neatline vertex.
for n in neatline:
    ul = closer(UpperLeft, ul, n)
    ur = closer(UpperRight, ur, n)
    ll = closer(LowerLeft, ll, n)
    lr = closer(LowerRight, lr, n)

fp = open("/home/bwilson/Downloads/nl.csv", "w")
fp.write("x, y\n")
fp.write("%f, %f\n" % ul)
fp.write("%f, %f\n" % ur)
fp.write("%f, %f\n" % ll)
fp.write("%f, %f\n" % lr)
fp.close()
</pre>
It generates a file containing the neatline corners, which looks like this:
 x, y
 445035.722450, 999360.237514
 614190.235319, 999293.477392
 444864.738058, 884201.712978
 614300.221591, 884383.269831
==== 3 Reproject the raster ====
Conveniently, this command converts it from GeoPDF to GeoTIFF at the same time.
 gdalwarp -t_srs EPSG:4326 1501ANC4810_geo.pdf -of GTiff 1501ANC4810_4326.tif
==== 4 Convert the extracted corners to GCS using gdaltransform ====
Use the [http://gdal.org/gdaltransform.html gdaltransform] command:
 gdaltransform -s_srs EPSG:3148 -t_srs EPSG:4326 < nl.csv
The output columns are longitude, latitude, and height:
 104.496065478207 9.04334959786241 -12.9164893552661
 106.035159775414 9.04163128037419 -24.1458563385531
 104.495875664344 8.00173981131893 -17.4262739224359
 106.033366794208 8.0023968002259 -28.6638822425157
==== 5 Remove the collar using gdal_translate ====
Crop using map coordinates (taken from the gdaltransform output above): -projwin ''ulx uly lrx lry''
 % '''gdal_translate -projwin 104.496065478207 9.04334959786241 106.033366794208 8.0023968002259 -of GTiff -co "TILED=YES" 1501ANC4815_4326.tif 1501ANC4815_nocollar.tif'''
Then put it into the Google Maps projection:
 gdalwarp -t_srs EPSG:900913 1501ANC4815_nocollar.tif -of GTiff 1501ANC4815_google.tif
I can then proof the result by loading it into QGIS... it works. Now I need to automate this; the above is a bit clumsy.
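As a starting point for that automation, here is a rough sketch that chains steps 3 through 5 for one sheet by calling the GDAL utilities from Python. The file naming and the hard-coded EPSG:3148 source SRS are assumptions carried over from the examples above, and the neatline corners still have to be worked out first (e.g. with the script in section 2).
<pre>
#!/usr/bin/env python
# Rough sketch of automating steps 3-5 for one map sheet.
# Assumes the GDAL utilities are on the PATH and that the neatline corners
# (in the sheet's native SRS, EPSG:3148 here) have already been found.
import subprocess
from osgeo import osr

def run(cmd):
    print(" ".join(cmd))
    subprocess.check_call(cmd)

def process_sheet(name, ul, lr, src_epsg=3148):
    pdf, tif4326 = name + "_geo.pdf", name + "_4326.tif"
    nocollar, google = name + "_nocollar.tif", name + "_google.tif"

    # Step 3: reproject (and convert) the GeoPDF to a geographic GeoTIFF.
    run(["gdalwarp", "-t_srs", "EPSG:4326", pdf, "-of", "GTiff", tif4326])

    # Step 4: transform the neatline corners to the same GCS.
    # (With GDAL 3+ you may need to set the axis mapping strategy to keep lon/lat order.)
    src = osr.SpatialReference(); src.ImportFromEPSG(src_epsg)
    dst = osr.SpatialReference(); dst.ImportFromEPSG(4326)
    xform = osr.CoordinateTransformation(src, dst)
    ulx, uly, _ = xform.TransformPoint(ul[0], ul[1])
    lrx, lry, _ = xform.TransformPoint(lr[0], lr[1])

    # Step 5: crop away the collar, then warp to the Google Maps projection.
    run(["gdal_translate", "-projwin", str(ulx), str(uly), str(lrx), str(lry),
         "-of", "GTiff", "-co", "TILED=YES", tif4326, nocollar])
    run(["gdalwarp", "-t_srs", "EPSG:900913", nocollar, "-of", "GTiff", google])

process_sheet("1501ANC4815",
              ul=(445035.72, 999360.24),   # upper-left corner from nl.csv above
              lr=(614300.22, 884383.27))   # lower-right corner from nl.csv above
</pre>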
==== Step three: make tiles ====
Turning one TIFF into a map:
 % '''gdal2tiles.py -t Vietnam -g ABQIAAAAGRHc9gAG3W9UhaqQD0pvExSVvaIHsSWhBu9dnlvZR7nHQNV82hT1HFUt0HcccsqJ2Z2e8femz7wrsw -u http://wildsong.biz/map/ 1501ANC4811_4326.tif vietnam'''
This creates a set of PNG files and HTML files in the output directory "vietnam". I then copy them up to the server, and you can see them at http://wildsong.biz/map/vietnam
But I want to have multiple input files. I can merge them and then tile the merged file. I also want to zoom in more than allowed by default, so I make more zoom levels.
 % '''export GKEY="ABQIAAAAGRHc9gAG3W9UhaqQD0pvExSVvaIHsSWhBu9dnlvZR7nHQNV82hT1HFUt0HcccsqJ2Z2e8femz7wrsw"'''
 % '''export URL="http://wildsong.biz/map/"'''
 % '''gdal_merge.py -o vietnam.tif -of GTiff 1501ANC4810_4326.tif 1501ANC4811_4326.tif 1501ANC4815_4326.tif'''
 % '''gdal2tiles.py -t "Vietnam" -g $GKEY -u $URL -z 7-14 vietnam.tif vietnam'''
=== More data ===
Additional map layers from ESRI:
http://www.arcgis.com/home/group.html?q=tags:ArcMap931_Base&t=group&owner=esri&title=ESRI%20Maps%20and%20Data&sortField=title&sortOrder=asc&content=all
=== Notes ===
GeoPDF is a trademark of TerraGo. The generic term is "geospatial PDF". Please mentally translate all references to GeoPDF in this document to "geospatial PDF". Thank you.
== Reading an Excel spreadsheet ==
# Install python-excelerator
# Look at the examples: cd /usr/share/doc/python-excelerator/examples/python
# Try the example scripts:
Convert to HTML:
 /usr/share/doc/python-excelerator/examples/xls2html.py helicopt.xls >helicopt.html
View the result at: http://hupi.org/map/vietnam/helicopt.html
Convert to CSV:
 /usr/share/doc/python-excelerator/examples/xls2html.py helicopt.xls >helicopt.csv
Create the database.
Load the CSV into the database (a sketch follows below).
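For those last two steps, here is a minimal sketch of loading the generated CSV into the PostgreSQL server mentioned earlier, using psycopg2. The connection string, the table name, and the assumption that every column can be stored as text are mine; adjust them to the real spreadsheet.
<pre>
#!/usr/bin/env python
# Sketch: load helicopt.csv into a PostgreSQL table.
import csv
import psycopg2

conn = psycopg2.connect("dbname=vietnam user=mapper")  # hypothetical connection
cur = conn.cursor()

with open("helicopt.csv") as f:
    reader = csv.reader(f)
    # Use the CSV header row as column names; store everything as text for now.
    header = [h.strip().lower().replace(" ", "_") for h in next(reader)]
    cur.execute("CREATE TABLE helicopters (%s)" %
                ", ".join('"%s" text' % h for h in header))
    insert = "INSERT INTO helicopters VALUES (%s)" % ", ".join(["%s"] * len(header))
    for row in reader:
        cur.execute(insert, row)

conn.commit()
cur.close()
conn.close()
</pre>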
== Project staff ==
Gary Roush, project organizer. Some of the words (the third-person "we") on this page are Gary's.
[[User:Brian Wilson|Brian Wilson]] is providing GIS services as a volunteer -- I am not a vet of ANY war, just helping out!
[[Category:GIS]]
[[Category:OpenLayers]]