Maps mean different things to different people. So what is a map?
My definition is simple: a map is an answer to a question.
There are three basic kinds of maps that answer three basic types of questions:
* The Location map answers the question, “Where am I?”
* The Navigation map answers the question, “How do I get there?”
* The Spatial Relationships map answers the question, “How are these things related?”
It’s this third type of map—a map that helps in our understanding of spatial patterns and relationships—where we as GIS professionals spend most of our time. We work hard making our maps. Our maps can be beautiful works of art, but that’s not why we make them. We make them to answer a question, to solve a problem, and to advance our understanding. And therein lies the power of the map.
Even the best maps have no power by themselves; they just exist, like the maps you hang on your office wall, or the maps in the world atlas sitting on your bookshelf. But depending on how they are created, and how they are used, maps can have tremendous power.
For a map to become truly powerful, two things are required. First, it needs to tell a story. Second, it needs to be put in people’s hands.
Introduction
Sampling design is a critical part of any study involving modeling and estimation based on data sampled from natural resources or other phenomena occurring in the landscape. Statistical considerations related to sampling are part of a larger scenario involving theoretical knowledge, previously detected behavior and patterns of the phenomenon, costs, accessibility of sample sites, politics, and so forth. Thus, the sampling design algorithm should be flexible enough to accommodate external considerations in the design. Currently, ArcGIS offers a number of different methods, including the Create Random Points, Create Random Raster, Create Spatially Balanced Points, and Densify Sampling Network geoprocessing tools. Some of these methods can be used to design a new monitoring network, and others can be used to add or remove monitoring sites from an existing monitoring network.
In this blog we’ll use the Densify Sampling Network geoprocessing tool to identify locations for new rainfall monitoring sites in an area on the east coast of South Africa. We want to find out where to place additional monitoring sites so that the mean annual precipitation surface created by interpolation can be improved. One could randomly suggest new locations in areas that are void of monitoring sites; however, these might not be the locations that yield a more reliable output prediction surface.
The Densify Sampling Network tool requires an existing monitoring network with measurements at known locations. Prior to running the Densify Sampling Network tool one needs to create a kriging geostatistical layer which can be done via the interactive Geostatistical Wizard or the Empirical Bayesian Kriging geoprocessing tool (new in ArcGIS 10.1).
For this example, our new-site selection criterion will be the standard error of prediction surface associated with the kriging layer, which will help determine where to place the new monitoring sites. An optional weight raster can also be used to give additional preference to candidate locations; in our case it is given a value of one in areas where monitoring sites can be placed and zeros in the ocean, where we do not want to place monitoring sites.

Simplistically, the following is done when the tool is executed. The standard error of prediction surface and the weight raster are combined, and the location with the largest value is deemed the location of the new site. A prediction from the kriging layer is made at this location, the value is included in the input feature class of existing monitoring sites, and a new standard error of prediction surface is generated. This surface is then combined with the weight raster again to decide where the next location should be. This sequential process is repeated until the desired number of new monitoring sites has been created. An inhibition distance can also be used to ensure that new monitoring sites are not too close to one another. If a proposed new site falls within this inhibition distance, the location with the next largest value in the combined standard error of prediction surface and weight raster is selected instead…
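The sequential logic described above can be sketched in a few lines of plain Python. This is a toy stand-in, not the tool's actual implementation: the grids are hypothetical, distance is Manhattan for simplicity, and zeroing out the neighbourhood of a chosen site stands in for re-running kriging with the new measurement included.

```python
# Toy sketch of sequential network densification: combine the standard
# error surface with the weight raster, pick the cell with the largest
# value (respecting an inhibition distance), then "update" and repeat.

def densify(std_error, weight, n_sites, inhibition=1):
    """Greedily pick n_sites cells from std_error * weight."""
    rows, cols = len(std_error), len(std_error[0])
    score = [[std_error[r][c] * weight[r][c] for c in range(cols)]
             for r in range(rows)]
    sites = []
    for _ in range(n_sites):
        # Candidates must lie outside the inhibition distance of all
        # previously chosen sites (Manhattan distance as a stand-in).
        best = max(
            (score[r][c], r, c)
            for r in range(rows) for c in range(cols)
            if all(abs(r - sr) + abs(c - sc) > inhibition
                   for sr, sc in sites)
        )
        _, r, c = best
        sites.append((r, c))
        # Stand-in for re-kriging: assume uncertainty drops to zero
        # in the neighbourhood of the newly added site.
        for rr in range(rows):
            for cc in range(cols):
                if abs(rr - r) + abs(cc - c) <= inhibition:
                    score[rr][cc] = 0.0
    return sites

# Hypothetical 3x3 standard-error surface and weight raster
# (weight 0 marks ocean cells where no site may be placed).
std_error = [[0.2, 0.9, 0.3],
             [0.1, 0.4, 0.8],
             [0.5, 0.2, 0.7]]
weight = [[1, 1, 0],
          [1, 1, 1],
          [1, 1, 1]]
print(densify(std_error, weight, 2))  # [(0, 1), (1, 2)]
```

The first site lands on the highest weighted standard error (0.9); its neighbourhood is then suppressed, so the second site goes to the next-best cell outside the inhibition distance. The real tool replaces the crude zeroing step with a fresh kriging run after each site is added.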
Collaboration makes over 225,000 maps available on-demand to help geoscientists quickly and accurately assess the opportunities and risks involved in oil & gas exploration
Elsevier, a world-leading provider of scientific, technical, and medical information products and services, and SEPM (Society for Sedimentary Geology) today announced the integration of more than 18,000 geological maps from SEPM into Elsevier's web-based research tool, Geofacets. First announced in April 2012, and now available to Geofacets users, the integration grants geoscientists access to scientific information that can provide key insight into the potential of regions for oil and gas exploration, allowing geoscientists to make predictions and guide exploration with greater confidence. As a result of adding maps from SEPM, Geofacets now houses over 225,000 maps providing essential information to geoscientists.
"Technology continues to revolutionize how geoscientists access and best use the information provided by bodies such as SEPM to solve the complex puzzles presented in oil and gas exploration," said Dr. Howard Harper, Executive Director, SEPM. "Our collaboration with Elsevier and Geofacets has made our publications even more accessible and applicable for geoscientists in the oil and gas industry and beyond. By providing crucial information on sedimentary rocks and strata in a clear and comprehensive manner, this collaboration provides the best possible outcome for everyone. Our own publications have greater exposure; Geofacets has a wider range of information to draw on; and geoscientists themselves can access and share, on-demand, even more of the information they need from a single source."…
The Minister of Transport and Public Works, Enrique Pintado, presented gvSIG Batoví, the first Uruguayan distribution that has given rise to gvSIG Educa, on Thursday, 6 September 2012.
gvSIG Educa is a customization of the gvSIG Desktop Open Source GIS, adapted as a tool for teaching subjects that have a geographic component. The aim of gvSIG Educa is to provide educators with a tool that helps students analyse and understand space, and that can be adapted to different levels or education systems. gvSIG Educa facilitates learning by letting students interact with the information, by adding a spatial component to the study of the material, and by aiding the assimilation of concepts through visual tools such as thematic maps that help students understand spatial relationships.
In this way, gvSIG Batoví is the beginning of an open source software project that will probably be adapted and used in many countries. gvSIG Batoví is software promoted by the National Survey Department for the CEIBAL Project, through which primary and secondary education students will be able to access educational information represented on maps…
I’ve managed to get my hands on a Garmin fenix, the new backcountry sportswatch, and I’ve got a few pics to share. The fenix is already shipping from GPS City and REI; it looks like Amazon won’t have it for a few weeks yet.
On my first trip out, I just wanted to get a feel for it, but I did verify that it has advanced track navigation (the top field in the image above shows distance to the next waypoint along the track). The image below shows the active route and compass screens…
In last week’s blog posts I talked about several different ways of overlaying ESRI shapefile data onto Bing Maps. In this post, I will walk through how to develop a simple application for loading a locally stored shapefile onto the Bing Maps WPF control using the ESRI Shapefile Reader CodePlex project.
Creating the project
First, you will need to download the ESRI Shapefile Reader project. Once the download completes, you can unzip it and open the Visual Studio project. You will notice there are two libraries in the project. The first is called Catfood.Shapefile; this is the main library that contains the logic for reading and parsing shapefiles. The second is just a basic application for reading some metadata from a shapefile. We are really only interested in the first project.
Open up Visual Studio and create a new WPF application called BingMapsShapefileViewer. Next, right-click on the solution and add an existing project. Locate and add the Catfood.Shapefile project. Next, right-click on the References folder of the BingMapsShapefileViewer project and add a reference to the Catfood.Shapefile project. Your solution should look like this:…
A number of geoprocessing tools including Spatial Join (Analysis), Append (Management), Merge (Management), Feature Class To Feature Class (Conversion), and Table To Table (Conversion), have a parameter for controlling how fields from the input dataset(s) are processed and written, or mapped, to the output dataset – the Field Map parameter. In addition to the simple moving of attributes from input to output, field mapping can also be useful for some common tasks such as field concatenation and calculating statistics like mean, sum, and standard deviation.
If you haven’t used the Field Map before, you should! Understanding and using field mapping will often reduce the number of processing steps in a workflow, and ensure that, in any scenario, attributes are handled in an appropriate way. Yes, the Field Map parameter is a complicated one, but it is well worth the time it takes to figure it out…
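Conceptually, a field map pairs each output field with one or more input fields plus a merge rule, which is what enables the concatenation and statistics mentioned above. Here is a pure-Python sketch of that idea; the field names, rule names, and `apply_field_map` helper are hypothetical illustrations, not the arcpy field-mapping API itself.

```python
# Conceptual sketch of field mapping: each output field draws on one or
# more input fields and applies a merge rule (join, mean, sum, first).

def apply_field_map(record, field_map):
    """Build an output record from an input record and a field map.

    field_map: {out_field: (input_fields, rule, rule_arg)}
    """
    out = {}
    for out_field, (inputs, rule, arg) in field_map.items():
        values = [record[f] for f in inputs]
        if rule == "first":          # take the first input as-is
            out[out_field] = values[0]
        elif rule == "join":         # concatenate with a delimiter
            out[out_field] = arg.join(str(v) for v in values)
        elif rule == "mean":         # statistic across input fields
            out[out_field] = sum(values) / len(values)
        elif rule == "sum":
            out[out_field] = sum(values)
        else:
            raise ValueError("unknown merge rule: " + rule)
    return out

# Hypothetical input record: two name fields and two survey readings.
record = {"FIRST": "Mammoth", "LAST": "Cave", "R1": 10.0, "R2": 14.0}
field_map = {
    "NAME":     (["FIRST", "LAST"], "join", " "),
    "MEAN_OBS": (["R1", "R2"], "mean", None),
}
print(apply_field_map(record, field_map))
# {'NAME': 'Mammoth Cave', 'MEAN_OBS': 12.0}
```

In the real Field Map parameter the same choices appear as an output field's input fields list and its merge rule, applied to every record as it is written to the output dataset.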
When writing a research paper or an article that contains references to GIS data, maps, or other geospatial material, it's important to include a proper citation crediting the author of the GIS work.
Citations vary depending on if the map is a single piece of work, part of a map series, an atlas, or a map that is part of a book or a journal article. There are even specific citations if the map was created using GIS software or you are citing GIS data. There are varying citation guidelines for static web maps versus dynamic online mapping applications.
For each map, first consult the original work in order to extract the necessary information. Scan the map for the necessary information. If some of the needed citation information is not listed directly on the map, access any available background information. If the map is found within a book, article, or atlas, look for any figures or footnotes that provide additional detail. If the map is accessed from a web page, check for any background information on the source web site. Make sure you carefully note within your citation any missing information.......
Python has been more tightly integrated in the new release of ArcGIS 10, allowing scripting to occur directly through a Python process without even opening up ArcMap. Admittedly this was available before, but now everything is more tightly coupled and a lot cleaner in its implementation. However, what has really interested, and indeed confused, me of late is how to use Python in the ‘field calculator’.
Field Calculator is a really useful tool. When you are looking at an attribute table for a shapefile in ArcGIS and want to derive a value for each object in the file based on a function, you can input the function into the Field Calculator and it will work it out for you row by row. Sometimes the value you want to derive is a bit more complicated than simple arithmetic and you need to write a script. Previously you could do this in VBA, but I always found it limited and confusing; now, however, you can do it in Python – much simpler!
There are a few pitfalls to using Python in ArcGIS field calculator, and so I’m going to specify how to write simple field calculator python scripts in ArcGIS from my early experience.
Firstly, the way to use Python in the field calculator seems to be to write a Python function and then call it for each row. Because you are writing a function, you have to give it the relevant parameters (i.e. fields) with which to do the computation. Finally, and annoyingly, you have to write your function in a little box and use a consistent indentation standard (1 space works best for reasons of space), as Python requires.
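To make the pattern concrete, here is a minimal sketch. The function goes in the Pre-Logic Script Code box and the expression box calls it with `!FIELD!` tokens; the field names `FIRST_NAME` and `LAST_NAME` are hypothetical, and the simulated rows below just show what ArcGIS does when it substitutes each row's values into the call.

```python
# Pre-Logic Script Code box: an ordinary Python function, indented
# with a consistent standard (one space per level fits the small box).
def full_name(first, last):
 if not first:
  return last
 return first + " " + last

# Expression box (evaluated once per row by the field calculator):
#   full_name(!FIRST_NAME!, !LAST_NAME!)
# ArcGIS replaces the !FIELD! tokens with each row's values, so the
# effect is equivalent to calling the function row by row:
rows = [("Ada", "Lovelace"), ("", "Hopper")]
results = [full_name(first, last) for first, last in rows]
print(results)  # ['Ada Lovelace', 'Hopper']
```

The same shape works for numeric derivations: any function of the row's field values can sit in the Pre-Logic box, as long as the indentation stays consistent.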
On Thursday afternoon 6 September 2012, the 150 seat Lecture Theatre 2 at Victoria University of Wellington’s Pipitea Campus was full to overflowing with policy analysts, people who work with statistics and spatial data, students, and people interested in cartography. They were there for a presentation by Dr Aileen Buckley which explored the design principles for statistical mapping.
There is a growing awareness of the value of viewing statistical data in map form. Being able to view and analyse statistical data based on location is becoming increasingly popular with decision-makers across central and local government. Promoting the use of such data fits with work under the NZ Geospatial Strategy to make our location-based information easy to find, share and use to deliver economic, social and cultural benefits to New Zealand.
I spoke with Rochelle Morgan (Geospatial Manager from Statistics NZ), who said that Dr Buckley’s presentation was timely and informative for attendees from Statistics NZ, which is embarking on an agency-wide programme of spatial enablement. Statistics NZ will utilise the design principles presented by Dr Buckley as it aims to increasingly provide engaging and relevant maps to present and visualise statistical information.
A video of the presentation and related slides are now available at the links below.
Aligning features and editing coincident geometry is easier with ArcGIS 10.1. In a series of posts, I will be highlighting some of the new tools available to update features that share geometry. I am working with a set of layers representing land uses, hydrologic features, and administrative boundaries near Mammoth Cave National Park in Kentucky. I need to align some of the features to make them coincident and be able to edit several features simultaneously. The alignment tools in ArcGIS 10.1 make it simpler and quicker for me to perform these updates on my data.
I am first going to use the topology editing tools to update the features that currently share boundaries because topology allows me to update multiple features at once and ensure they stay coincident. As in previous releases, there are two kinds of topology in ArcGIS: map topology and geodatabase topology. Creating a map topology is quick and allows me to edit features that are coincident, which is all I need for my Mammoth Cave edits. A geodatabase topology requires more effort to set up and modify, since it provides rules that define complex relationships about how the features share geometry. Once the topology is set up, the editing tasks are similar regardless of whether a map topology or geodatabase topology is being used…
For more years than I can count, accessing water quality data has been a somewhat arduous task. Many different organizations have data. Unfortunately this data is usually in different formats and requires different methods to access. It is tough for a scientist to get the information they need, let alone for a school kid wanting to find out information about the creek down the hill.
It hasn’t received a lot of publicity, but the US Geological Survey and the United States Environmental Protection Agency, through a partnership with the National Water Quality Monitoring Council, have brought the two biggest sources of water quality data, EPA’s Storage and Retrieval (STORET) system and USGS’s National Water Information System (NWIS), together into one place: the Water Quality Portal. In November of last year the system provided access to over 200 million records at over 5 million locations throughout the US…
In 2010, ArcGIS Ideas user eatkinson posted an idea to enhance ArcGIS with the Ability to view and edit LIDAR (*.las) files directly. Within a few months, this idea became one of the most popular ideas on the site. In June 2012, the Esri 3D Product Manager posted a comment with the good news, informing the community that ArcGIS 10.1 would include native support for lidar data and more. User eatkinson asked for viewing and editing capabilities, but we went a little further and added functionality to use lidar data in analyses and publish it as image services.

In previous releases, lidar data in LAS format needed to be converted to a point feature class and then, if necessary, to a raster dataset, TIN, or terrain. At ArcGIS 10.1, lidar data in a LAS file can be managed, viewed, updated, and shared, all while remaining in its native format. There are two container objects that allow you to collect multiple LAS files and treat them as a single data unit: the mosaic dataset and the LAS dataset. In a mosaic dataset, you can add LAS files directly. The LAS dataset data model is new in ArcGIS 10.1. It provides vector-based access to LAS files, letting you interact with a collection of LAS files either as individual points or as a TIN-based surface. LAS datasets allow you to:
* Perform analyses.
* View lidar points and surfaces in 3D.
* Edit LAS files.
* Publish as image services.
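One practical consequence of keeping the data in its native format is that a LAS file can be inspected with ordinary tools. The sketch below parses a few fields from the fixed public header block, using byte offsets from the public ASPRS LAS 1.2 specification (signature at offset 0, version at 24-25, point record count at 107); this is a format illustration, not an Esri API, and the 227-byte header it builds is a fake one for demonstration.

```python
import struct

def read_las_header(data):
    """Parse a few fields from a LAS public header block (LAS 1.x)."""
    if data[0:4] != b"LASF":          # file signature per the LAS spec
        raise ValueError("not a LAS file")
    major, minor = struct.unpack_from("<BB", data, 24)   # version
    point_count, = struct.unpack_from("<I", data, 107)   # point records
    return {"version": (major, minor), "points": point_count}

# Build a minimal fake LAS 1.2 header (227 bytes) for demonstration.
header = bytearray(227)
header[0:4] = b"LASF"
struct.pack_into("<BB", header, 24, 1, 2)     # version 1.2
struct.pack_into("<I", header, 107, 120000)   # 120,000 point records

print(read_las_header(bytes(header)))
# {'version': (1, 2), 'points': 120000}
```

Inside ArcGIS 10.1 you would not parse headers by hand, of course; the point is that the LAS dataset references such files in place rather than requiring a converted copy.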
Please watch the video, “Native Support for Lidar,” in which Esri’s Nathan Shephard introduces the new lidar visualization, analysis, and dissemination capabilities that are a core component of ArcGIS 10.1.
To learn more about lidar support at ArcGIS 10.1, visit the online help documentation: Learn about lidar support in ArcGIS…