PDF - GeoInformatics

Transcription

3D Laser Scanning for Heritage
Esri European Developer Summit
Mapillary and the Power of Community Mapping
The Future of City Asset Mapping
December 2015 – volume 18, issue 8
www.geoinformatics.com
Magazine for Surveying, Mapping & GIS Professionals
NEW RIEGL VMQ-450®
Compact Single Scanner
Mobile Mapping System
360° vertical field of view
in a single pass
multiple swivel positions (0°, 15°, 30°, 45°)
various camera
options available
Compact single-scanner mobile mapping system,
ideally suited for a variety of applications.
The system comprises the measuring head, including one RIEGL VQ-450 laser scanner with integrated
IMU/GNSS unit, and a compact control unit for system operation and data storage. The optional integration
of up to four cameras allows simultaneous acquisition of image data to complement LiDAR data.
RIEGL VMQ-450 Key Features:
compact and rugged design, flexible and rapid installation | 360° FoV (Field of View) | effective
measurement rate up to 550 kHz, 200 lines/sec | online waveform processing, echo digitization | multi-target
capability | optional integration of various cameras | multiple swivel positions of the measuring head |
seamless workflow from MLS data acquisition, through processing and adjustment
Scan this
QR code to
watch the new
VMQ-450 video.
Stay connected with RIEGL
www.riegl.com
RIEGL Laser Measurement Systems GmbH, Austria
RIEGL USA Inc.
RIEGL Japan Ltd.
RIEGL China Ltd.
Editorial
On “Smart Cities”
“Smart cities” is a new term which refers to a concept that is anything but new; science fiction writers have been imagining what kind of life mankind
might have in cities of the future since the 1950s. Thinking about how life in big cities will evolve is a natural and logical thing to do, as more and more
people occupy city spaces and live in them at a faster pace than ever before. This year, the term “smart city” made it big - worldwide. Politicians
and policy makers everywhere seemed intent on turning their cities into a sort of research laboratory where tech companies would be given “carte blanche” and
could do whatever they want. Or so it seems.
Almost every day I read about something new regarding smart cities: new concepts, new technology, new initiatives, awards, etcetera. As this magazine goes to print, the Smart City Expo in Barcelona will be underway and that’s just one example of an increasing number of events arousing great
interest and indicating how popular this concept is. The event web page offers a glimpse of the different stakeholders of the “smart city” concept, as
it brings together “[…] global, national, and regional urban representatives, thought leaders, academic institutions, research centers, incubators,
investors, and top corporations that have the kind of decision-making power that drives smart cities and will empower its citizens.”
All these stakeholders will undoubtedly bring something unique into the mix, which is, of course, a good thing. And although I don't doubt their good
intentions to learn from others and the combined expertise of all parties involved, I foresee a few bumps on the road: firstly, the concept of “smart cities”
has more than one meaning. In fact, it is used more as a marketing term than an academic one. Some emphasize the technological
component, while others emphasize the urban planning concepts that underlie it. Others may point out that smart cities are
about people and the livability of cities and that technology is a means rather than an end. Who is right?
Secondly, there is no umbrella organization or initiative to monitor smart cities. I'm not advocating there should be one, but the danger is that everyone is inventing their own wheel, with the intention of becoming someone else's ideal example, although we are
all aware that no two cities are the same – a general rule in geography. This is pointed out by Mike Barlow in “Smart Cities,
Smarter Citizens”, a recommended literature study on the subject.
Barlow also points out two other dichotomies that exist in smart city initiatives: while policy makers think and act in top-down initiatives, they ignore that bottom-up, citizen-driven initiatives are equally important in creating a livable city (such
as citizens organizing a marathon themselves, instead of a city council which gave little thought to the matter other than providing the infrastructure). Lastly, the emphasis on connected technology in the everyday lives of citizens (through apps, interconnected devices and the cloud) might reduce personal contact and communication between people, thus reducing livability
and resulting in isolated neighborhoods or individuals, which is exactly the opposite of what “smart cities” intend to achieve.
The situation may get to this stage before we stop and consider the implications of what we are doing: after the hype
comes reflection. Some critical minds have already pointed out that tech companies which approach cities hungry
to become the next “smart city” have very different interests than city officials and might not act in the people’s interests. They may, for example, use technology as a means to obtain private data. A smart city initiative in South Korea
showed that city planners can indeed design a city that looks perfect on paper, but the proof of the pudding will be
in the eating. What will the reality turn out to be? It’s the citizens who will have the last word.
Enjoy your reading,
Eric van Rees
GeoInformatics is the leading publication for Geospatial Professionals worldwide. Published in both hardcopy and digital, GeoInformatics provides coverage, analysis and commentary with respect to the international surveying,
mapping and GIS industry. GeoInformatics is published 8 times a year.
Publishing Company:
CMedia BV
Editor-in-chief:
Eric van Rees
[email protected]
Copy Editor:
Elaine Eisma
Editor:
Remco Takken
[email protected]
Photography: www.bestpictures.nl
Contributing Writers:
Martin Schwall, Benjamin Busse, Paul Bryan, Georg Hammerer, Thierry de
Tombeur, Peter Neubauer, Remco Takken, John Stenmark,
Daniel Maurice, Marie-Caroline Rondeau.
Graphic Design:
Sander van der Kolk
[email protected]
ISSN 13870858
Columnist:
Matt Sheehan
Advertising:
Yvonne Groenhof
[email protected]
Finance:
[email protected]
Subscriptions:
GeoInformatics is available against a yearly subscription rate (8 issues) of
€ 89,00. To subscribe, fill in and return the electronic reply card on our
website www.geoinformatics.com
Website:
www.geoinformatics.com
© Copyright 2015. GeoInformatics: no material may be
reproduced without written permission.
P.O. Box 231
8300 AE
Emmeloord
The Netherlands
Tel.: +31 (0) 527 619 000
E-mail: [email protected]
GeoInformatics has a collaboration with the
Council of European Geodetic Surveyors (CLGE)
whereby all individual members of every
national Geodetic association in Europe will
receive the magazine.
Content
On the cover:
Orthophoto of Tiengen (source: IngenieurTeam GEO GmbH).
See article on page 6.
Articles
Building Cities Using UAV – 6
3D Laser Scanning for Heritage – 10
The Hexagon Smart M.App – 14
SlopeManagement via Satellite Navigation – 16
Creating a Central Source of Geospatial Truth – 18
Setting the Standard in City Digitalisation – 20
Pau River Flood Mapping – 22
Mapillary and the Power of Community Mapping – 24
Driving EU Agricultural Payments in Austria – 27
Urgent Action in the Himalaya – 34
Book review
Learning Geospatial Analysis with Python – 45
Column
Our GIS Emphasis Should be Business Outcomes not Maps – 39
Events
Esri European Developer Summit 2015 – 28
Esri European User Conference – 30
Geodesign Summit Salzburg – 32
Interview
Location and Interoperability – 40
Newsletters
CLGE – 42
Calendar / Advertisers Index – 46
December 2015
10
Article
Located in the historic city of York in northern
England the Geospatial Imaging team of Historic
England carry out metric surveys of historic buildings, sites and landscapes using a range of surveying technologies. These include 3D laser scanning that the team have been using since 1999
and applying across a range of heritage sites such
as Stonehenge.
16
Article
Every skiing season, thousands of cubic metres
of snow are produced every day and distributed
by snow groomers. If the drivers have to rely on
their “feeling”, this can result in uneven snow
depths. The Austrian company PowerGIS GmbH
has solved this problem with the ARENA SlopeManagement system: modern data networking
ensures an optimal and even depth of the existing snow.
20
Article
Roads. Building facades. Road signs. These are
just a few of the assets surveying firm Cabinet
Brière needed to map in the towns of Alès and
Gap to provide a 3D digital map of the communes
in southern France, helping ERDF better understand, operate and manage the country’s electricity distribution network.
28 – Event
The Esri European Developer Summit in Berlin offered three days of tech sessions on Esri’s latest geospatial technological developments. Some of the take-aways included the new ArcGIS JavaScript API, cross-platform development using Xamarin, and Quartz, Esri’s next-generation version of ArcGIS Runtime.
32 – Event
In short, Geodesign allows GIS to integrate into the planning process. In order to ensure this is done in a successful way, a framework is needed and, most importantly, the willingness of those involved to collaborate with other disciplines; it is important to discuss and to listen.
The twin cities of Waldshut-Tiengen, Germany, have approximately 22,000 inhabitants and are located
on the beautiful Upper Rhine in Baden-Württemberg near the Swiss border. The local municipal planning
and building control office needed an up-to-date planning framework for the newly planned city
construction projects and urban development in the two districts. On review, the existing documents were simply not accurate enough, even though they were already available in 3D format. This was due to an insufficient level of detail, referred to as LoD (Level of Detail). A block model (LoD 1) and
standardised roof shapes (LoD 2) were insufficient. The municipal planning and building control office
needed a more accurate 3D model as a basis. How could the exact data of complex roof shapes be
obtained without the risk of walking on steep roofs or being forced to hire a company to carry out
aerial photography by airplane or helicopter?
Article
By Martin Schwall and Benjamin Busse
6
A Complement to Classic Measurement Methods
Building Cities Using UAV
December 2015
Point cloud with design (source: IngenieurTeam GEO GmbH)
The IngenieurTeam GEO GmbH has
successfully employed an Aibot X6
from Aibotix for several years and
was commissioned by the city of
Waldshut-Tiengen with UAV flights
for aerial surveying and data processing to
create geo-referenced ortho-photos, coloured
point clouds and 3D models. The project included obtaining the roof geometries as well as
eave and ridge heights. The data obtained
was then further processed by Autodesk programs – right up to 3D modelling.
Providing a better basis for decision-makers with 3D modelling
The municipal planning and building control office sought to create an accurate 3D
model to visualise the striking new building
projects in the centre of Tiengen and in
areas of the city Waldshut prior to construction planning. In contrast to plans presented
on paper, 3D modelling has the power to
truly depict reality and convince the public
and the decision makers.
Convenient and secure data capturing
The flights for the aerial survey of the 70,000 square metres (753,500 square feet) of land comprising Waldshut and Tiengen were carried out in May by employee and certified Aibot pilot Benjamin Busse. Aerial surveying of the inner city requires a special permit and adherence to certain requirements.
The raw data was evaluated with Agisoft
Photoscan Pro software. Prior to the aerial survey flights and for the subsequent calculations
required to ensure accurate results, control
points were measured, marked and signalled
using a Leica TPS1200 total station and a Leica
Viva GNSS system. Adhering to the principle of
surveying “no measurement without control”
and to ensure consistent position and height
data within the range of ± 5 centimetres (± 2
inches), control measurements were also carried out using the total station at individual buildings during establishment of the control points.
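As an aside to the “no measurement without control” principle described above, the following minimal sketch (not IngenieurTeam GEO’s actual workflow; all point names and coordinates are hypothetical) shows how photogrammetric control-point coordinates might be compared against total-station reference values and flagged when a residual exceeds the ± 5 centimetre tolerance.

```python
# Minimal sketch: flag control points whose photogrammetric coordinates deviate
# from the total-station reference by more than the +/- 5 cm tolerance.
# Point names and coordinates below are hypothetical, for illustration only.
TOLERANCE_M = 0.05

def check_control_points(reference, photogrammetric):
    """Return (point_id, worst_residual_m) for points exceeding the tolerance."""
    failures = []
    for point_id, (xr, yr, zr) in reference.items():
        xp, yp, zp = photogrammetric[point_id]
        residual = max(abs(xr - xp), abs(yr - yp), abs(zr - zp))
        if residual > TOLERANCE_M:
            failures.append((point_id, round(residual, 3)))
    return failures

reference = {"CP1": (3421.10, 5287.42, 344.80), "CP2": (3440.55, 5301.07, 346.12)}
photogrammetric = {"CP1": (3421.13, 5287.40, 344.86), "CP2": (3440.57, 5301.05, 346.14)}
print(check_control_points(reference, photogrammetric))  # [('CP1', 0.06)]
```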
IngenieurTeam GEO’s aerial surveying services were already well known. Thus, the idea
for using an ortho-photo to supplement existing LGL (State Agency for Spatial Information)
data with current aerial survey data to increase accuracy was the next logical step.
The Aibot X6 UAV system offered many
advantages during this project. The system’s
quick implementation and high resolution
16.2 megapixel images delivered the accuracy required for the project. An exact recording
of the roof types, ridge heights and eave
heights would not have been possible with
conventional measuring methods due to the
close proximity of the buildings in the affected
areas. The roof ridges would not be visible
due to excessively steep sights.
A picture is worth a thousand words
The real benefit, however, lies in the data gathered. This data provides the client with a
high level of added value, a textured 3D
model, a coloured point cloud and ortho-photos with ground resolution of 1 centimetre
(0.4 inch) for optimal representation of the
planning area.
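For readers unfamiliar with how image resolution relates to ground resolution, the standard ground sample distance (GSD) formula is sketched below; the camera and flight parameters are hypothetical and are not the values used in the Waldshut-Tiengen flights.

```python
# Minimal sketch of the standard photogrammetric ground sample distance (GSD):
# GSD = (pixel size / focal length) * flying height.
# The parameters below are hypothetical, not the actual Aibot X6 flight values.
def ground_sample_distance(pixel_size_mm, focal_length_mm, flying_height_m):
    """Return the ground footprint of one pixel, in metres."""
    return (pixel_size_mm / focal_length_mm) * flying_height_m

# Example: 0.005 mm (5 micron) pixels, 16 mm lens, 30 m flying height
gsd_m = ground_sample_distance(0.005, 16.0, 30.0)
print(f"GSD: {gsd_m * 100:.2f} cm per pixel")  # GSD: 0.94 cm per pixel, roughly 1 cm
```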
Partner company Bytes & Building GmbH,
which advises the town of Waldshut in the
area of AutoCAD systems, were responsible
for the visualisation. Bytes & Building GmbH
provides comprehensive solutions in the fields
of architecture, building and infrastructure
and occupies a leading position in Germany
in the construction industry and building information modelling (BIM). When Bytes &
Building submitted the 3D models and a 3D
animation of city areas there was nothing but
praise and enthusiasm from the head of the
Civil Engineering Office, Uwe Kopf.
“We are thrilled! The data and information
provided have literally given the planning
and control office a whole new perspective,”
explained Kopf. “Since the visualisation goes
beyond 2D floor plans and 2D building facades, the current high-resolution ortho-photos
and 3D modelling have immense value for us.
This greatly simplifies the decision making
process further down the road.”
The perfect complement to classic measurement methods
The use of the UAV system and more than 30
projects implemented by the IngenieurTeam
GEO GmbH prove that the UAV system delivers excellent results for surveying in the area
of engineering services.
Client requirement fulfilled: exact geometries of the roofscapes, captured with the Aibot X6 (source: IngenieurTeam GEO GmbH)
The as-built documentation and the creation of ortho-photos as
seen in this project are merely two possible
areas of application. Other applications are
quite possible in the future. For example, building and property inspections as well as
large-scale monitoring and inspections are
also conceivable. It is also possible to carry
out flights in GNSS-denied areas for aerial
surveys, such as in large halls.
Similar to laser scanning, the processing of the data and the resulting point cloud allow for diverse finishing processes and optimally complement classic methods of surveying.
Due to substantially improved photogrammetry software programs like Agisoft PhotoScan Pro, one can certainly speak of a renaissance in photogrammetry.
Large amounts of data can be collected and
analysed in a short time, quickly delivering 3D
data to the client and simplifying the decision-making process much more effectively than a
large piece of paper with plotted content,
which gives only a crude depiction of what
actually exists on the ground.
Despite upgrades in computing capacity, such
as larger memory and powerful graphic
cards, the processing of very large amounts of
data is currently a problem due to longer processing times. The quantity of data and dimensions submitted to the customer should be discussed in advance and pre-established as
much as possible. Ultimately, the end customer
and user must be able to use their data according to their needs and applications.
In future projects, the use of UAV and photogrammetric data processing for obtaining 3D
data will efficiently and effectively complement
traditional methods. The rapid development of
UAV systems is supported by the principles of
surveying and geoinformatics.
As professionals in the geo-industry and in
order to provide clients with maximum added
value, surveyors are committed to provide
clients with the best advice possible, to perform work with the highest-quality sensors and
methods, and to optimally analyse and to refine the data.
Martin Schwall, [email protected], is a Graduate Surveyor
(FH) and Managing Partner of IngenieurTeam GEO GmbH.
Benjamin Busse, [email protected], is B.Sc. Cartography and
Geomatics, UAV Technical Project Manager,
IngenieurTeam GEO GmbH.
Level of Detail (LoD)
Level of Detail (LoD) refers to the various levels of detail in the presentation of virtual worlds. LoD concepts are also used in 3D landscape and city models. Depending on the application, different levels of detail are required.
The City Geography Markup Language (CityGML) is an application schema for storing and sharing virtual 3D city models. CityGML has been a standard for the Open Geospatial Consortium (OGC) since August 2008 and is the basis for many city models in Germany.
The following levels of detail have been specified for CityGML:
LOD 0: Regional model, 2.5-D footprints
LOD 1: Block model, building block (extruded footprints)
LOD 2: 3D model with standard roof structures and simple textures
LOD 3: Detailed (architectural) building models
LOD 4: LOD 3 building models with interior features
December 2015
This article was reprinted from Leica Reporter 73 with
kind permission.
Experience and Practical Insights
3D Laser Scanning
for Heritage
Located in the historic city of York in northern
England the Geospatial Imaging team of
Historic England carry out metric surveys of
historic buildings, sites and landscapes using a
range of surveying technologies. These include
3D laser scanning that the team have been
using since 1999 and applying across a range
of heritage sites such as Stonehenge.
Article
By Paul Bryan
Historic England
Created in April 2015, Historic England is the English government’s independent expert advisory service for the historic environment providing expert advice about it, helping people protect it
and care for it. These roles were previously undertaken by English
Heritage (EH) but following its split in April 2015 EH is now an
independent charity that looks after the National Heritage
Collection of more than 400 historic properties including world
famous landmarks such as Stonehenge, the best-known prehistoric
monument in Europe and one of the wonders of the world.
Originally established in the 1980s, the Geospatial Imaging team
now forms part of the Remote Sensing Team of Historic England
that is ‘revealing the past, looking to the future’ by undertaking
aerial reconnaissance, mapping and investigation; earth observation using satellite, airborne and UAV platforms; geophysics, photogrammetry and laser scanning across the historic environment.
10
The survey technologies used by Historic England
Figure 1: Laser scanning at Richmond Castle, North Yorkshire using the Leica P40 (Image by Paul
Bryan © Historic England).
December 2015
These days there are numerous survey technologies that can
potentially be applied to any heritage object. However, as there is still no ‘magic bullet’ survey technology or ‘one size fits all’ approach to surveying, the skill is often in knowing which technique is most appropriate for any given project. Photogrammetry has been used across architectural applications since the mid-1850s and has therefore matured into a suitable approach for most heritage applications, and one that English Heritage has been using since its creation in 1983. In comparison, laser scanning is a lot younger, having only appeared in UK terrestrial markets during the 1990s, and hence is still maturing as a technology. However, it already
has a number of advantages that contribute to its
suitability for heritage survey:
• Applicable on all 2D and 3D surfaces;
• Very fast data collection – over
1,000,000 points per second now achievable directly ‘in the field’;
• High resolution data capture – sub-mm
range noise over shorter distances and millimetric point spacing;
• Digital image integration with point data using on-board sensor, external DSLR or HDR
panoramic cameras;
• Laser intensity recorded - enables low-light
data capture and surface characterisation
potential.
That said, achieving the right survey for the right job is very important, so any potential application must also consider the current disadvantages to ensure the resultant data is fit for purpose:
• Scanners are still expensive acquisitions –
typically between £30K (€40K) - £60K
(€80K) for terrestrial laser scanners;
• Scanning generates very large data files (see the rough size sketch after this list) – often difficult for the end-user to view, manage and archive without access to high-end computers, high-capacity storage systems and dedicated software viewers;
• Useable outputs require dedicated post-processing software –
often too expensive and too complex for many end-user heritage
clients to consider acquiring and undertaking themselves;
• Line drawings still require manual digitisation – automated feature
extraction is still not at a useable level across all heritage applications often requiring more post-editing resource than manual digitisation;
• Complementary to or now competing with Structure from Motion
(SfM)? – SfM uses cheaper camera hardware and post-processing
software but can now generate equivalent data densities and
accuracies to laser scanning.
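To give a feel for the file sizes behind the ‘very large data files’ point above, here is a back-of-the-envelope sketch: the per-point storage layout is an assumption for illustration, not a vendor specification, while the one-million-points-per-second capture rate is the figure quoted in the advantages list above.

```python
# Back-of-the-envelope sketch of raw scan data volume.
# Assumed storage layout (illustration only): XYZ as three 8-byte doubles
# plus a 2-byte intensity value per point.
BYTES_PER_POINT = 3 * 8 + 2          # 26 bytes per point (assumption)
POINTS_PER_SECOND = 1_000_000        # capture rate quoted in the article
minutes_scanning = 10

total_points = POINTS_PER_SECOND * minutes_scanning * 60
size_gb = total_points * BYTES_PER_POINT / 1e9
print(f"{total_points:,} points, roughly {size_gb:.1f} GB uncompressed")
# 600,000,000 points, roughly 15.6 GB uncompressed
```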
How is the data obtained and what standards are used?
The Geospatial Imaging team currently employs only three staff which
means the majority of survey requirements for both Historic England and
English Heritage are undertaken through commercial contractors,
managed using an EU-compliant Framework Agreement. The current
agreement for 2015-2018 incorporates four survey groups – image-based, low-level aerial photography using Unmanned Aerial Vehicles
(UAV), topographic and measured building survey – that all require suitable specification to ensure appropriate data and useable outputs are
commercially procured. English Heritage published the first edition of its
“Metric Survey Specifications for Cultural Heritage” in 2000 which
back then provided a guide to the user & supplier of metric survey data
as well as explaining the services expected & performance indicators to
ensure the successful management of metric survey projects. Since the
publication of the second edition in 2009 there have been a number of
significant advances in surveying technologies that demanded further
consideration including the widespread use of laser scanning for measured building survey, the ubiquity of digital cameras, developments in
photogrammetric software that enable the use of non-metric cameras,
the use of UAV’s to capture low-level aerial photography, the adoption
of Building Information Modelling (BIM) and the increased generation of
the required datasets using laser-scanning technology. All of these have
been included within the third edition of the specification published in
September 2015 by Historic England and now available at (1).
The laser scanning section now includes the use of both targeted and
‘cloud to cloud’ registration approaches, the capture of High Dynamic
Range (HDR) imagery alongside the point data, and adoption of E57 as
the preferred non-proprietary format for archiving laser scanned datasets.
Figure 2: Metric Survey Specifications for Cultural Heritage published by Historic England in September 2015.
However, in order to achieve the best records
for digital dissemination a suitable level of metadata must also be collated to enable future re-use
and regeneration of such digital records deposited in the archive. In the case of laser scanning
some current metadata standards are cumbersome and time-consuming to collate, which jointly limits their ‘real-world’ application. The new specification now requires a minimum level of metadata to be provided in digital form; however, it also recognises that many of the parameters that might be
required by the archive industry might already be
contained within the scan data file itself. Therefore
both Historic England and Historic Environment Scotland, with whom
we are working on deriving a suitable metadata standard, would welcome input from all scanner manufacturers and ideally the inclusion of a
metadata extraction tool within their scan processing software.
Laser scanning has multiple heritage applications including:
• recording, documentation and archaeological analysis;
• conservation planning and works specification;
• condition assessment and monitoring programmes;
• illustration & presentation using static and virtual display;
• website and app development;
• Building Information Modelling (BIM).
BIM & Heritage
The UK Government Construction Strategy was published on 31 May
2011 and announced its intention for “collaborative 3D BIM (with all project and asset information, documentation and data being electronic) on its
projects by 2016”. Most focus remains on new-build construction meaning
that adoption of BIM for existing buildings is still unclear. Therefore English
Heritage started consideration of BIM in 2013 by inclusion within its
Heritage Science strategy (2) and establishing its own internal BIM Special
Interest Group (BIMSIG) to assess the relevance and potential adoption of
BIM across its own historic estate and the impact of BIM on its external advice. Although project application of BIM across the UK heritage sector still
remains low such background work has allowed English Heritage and
Historic England to increase their knowledge on BIM and formulate some
research proposals to widen appreciation of its respective benefits across a
heritage context and target some areas of concern, see table 1.
Table 1.
Stonehenge laser scan survey
The Stonehenge laser scan survey undertaken back in 2011 successfully demonstrates the recording, documentation and archaeological
analysis application of laser scanning as well as its latent potential
for deriving new data. Funded by English Heritage and undertaken
on their behalf by the Greenhatch Group, a commercial survey company based near Derby in central England, this new survey aimed to
record both the world famous prehistoric monument and ‘The
Triangle’ landscape immediately surrounding it by applying a range
of laser scanning systems from Leica Geosystems and Zoller und
Fröhlich (Z+F) with varying specifications and data capture capabilities.
Based on the ‘Metric Survey Specifications for Cultural Heritage’ the outputs from this survey included:
• Raw and registered point clouds supplied in both proprietary and
ASCII (XYZ) formats;
• Digital surface models supplied in OBJ and PDF-3D format;
• ‘The Triangle’ landscape @ 100mm;
• Bank and ditch @ 20mm resolution;
• Stone circle @ 1mm resolution;
• All stone surfaces @ 0.5mm resolution.
• Animations in AVI format;
• Truview version of all scan data.
As a means of potentially extracting additional archaeological information for the site a range of analysis & visualisation methods were later
applied to the data in 2012 by ArcHeritage, a Sheffield based archaeological unit located in northern England. These included:
• Visualising the 1mm data within a 3D environment to provide the
base template for the documentation outputs;
• Examining the 1mm mesh data within a virtual environment and
experimenting with different textures, lighting techniques and shadow decay values;
• Examining the interaction of light & shadow within the 0.5mm mesh
data by applying polynomial texture mapping approaches;
• Using a custom developed Luminance Lensing approach to examine
luminance values within the 0.5mm mesh data;
• Using the plane shading function within Pointools that moves a greyscale band through the 0.5mm mesh data to reveal subtle changes
across the stone surface;
This analysis was extremely successful in revealing new archaeological
information on the monument including:
• Significant differences in how the stones were shaped and worked;
• Evidence that Stonehenge was not only aligned with the solstices but that the view from the Avenue, the ancient processional route to the north-east, was also important to its creators;
• The stones on the solstitial axis were most carefully shaped and dressed to provide a ‘dramatic’ passage of light through the circle during
solstices;
• Many new prehistoric carvings discovered including 71 new axeheads, raising the number in Britain from 44 to 115, and doubling
the number of early Bronze-Age axe-head carvings in Britain.
Figure 4: Stand in the Stones visitor experience within the new Stonehenge visitor centre (Image by Paul Bryan © Historic England).
In December 2013 a new visitor centre was opened at Stonehenge containing a number of displays based on the laser scan data. These included interpretation and tactile reconstructions of the henge monument
and a new ‘Stand in the Stones’ virtual display that every visitor now
experiences when entering the new centre.
Such a project therefore demonstrates that laser scanning can successfully record heritage sites and monuments and provides a range of useable
outputs encompassing traditional, modern and virtual requirements. To
justify the resource commitment & financial expense such projects inevitably require, laser scanning should ideally be incorporated within a
research strategy and undertaken to a defined specification standard to
ensure outputs are fit for purpose. However it also showed that laser
scan datasets can contain additional information that may be later
unlocked using new analytical and visualisation tools highlighting the
need to consider the storage, archiving and metadata requirements at
the outset of every project and not just at the end as might usually occur.
The fusion of point and image-based technologies is occurring, with modern laser scanners now incorporating new imaging (HDR) sensors that greatly improve on-board image quality, and with the adoption of photogrammetric-based approaches within scan processing software that improve scan registration procedures. However, with the increasing application of Structure from Motion and multi-image photogrammetric approaches across a variety of application areas, not least heritage, does this pose a threat to the continued dominance of laser scanning, or will the industry eventually see a true fusion of technologies within future hardware developments? Only time will tell!
Figure 3: Laser scanning at Stonehenge using the Leica C10 (Image by James O. Davies © Historic England)
Paul Bryan BSc FRICS, [email protected].
Internet: https://historicengland.org.uk/research/approaches/research-methods/terrestrial-remote-sensing/specialist-survey-techniques
Paul is the Geospatial Imaging Manager within the Remote Sensing Team of Historic England. Based in York he heads up the Geospatial
Imaging team which carries out metric surveys of historic buildings, sites and landscapes using laser scanning, photogrammetry and
multi-image based survey approaches. He also advises the external sector on the heritage application of RPAS/UAV/drone platforms and
BIM (Building Information Modelling).
(1): www.historicengland.org.uk/images-books/publications/metric-survey-specifications-cultural-heritage
(2): https://historicengland.org.uk/research/approaches/research-principles/research-strategies/Heritage-science-strategy
December 2015
Introducing the Map of the Future
The Hexagon Smart M.App
Article
By Georg Hammerer
Life changes quickly and the world changes with it. Traditionally, things
are frozen in time, recorded on static maps. Later, critical business and
governmental decisions are made and carried out based on these maps.
Because the data is a snapshot of the past, these decisions are not
based on current reality. Today’s maps are taking on a different perspective and reflecting a different way of thinking. Certainly there is
still a great need for maps, but there is also an ongoing revolution. We
live in a dynamic era in which decisions—and their repercussions—can
change course in seconds. Shouldn’t the map update as quickly?
UK. These content providers offer current,
reliable data, such as daily satellite imagery
and frequently updated geocoded addresses. This data can be steadily fused into a
Hexagon Smart M.App to build a dynamic
application that can be rapidly refreshed
with present day information.
14
Three Traditional Obstacles
Mapping geography with a high degree of
spatial accuracy has been done well.
However mapping geography with temporal
currency is more difficult. It requires a lot of
monetary and personnel resources to generate new maps with any sort of frequency.
Traditionally, to build a geospatial application requires the developer to surmount three
hurdles.
It begins with the time-consuming chore of
obtaining current, relevant data. The second
requirement is finding the right software for
GIS-trained personnel to fuse the data with
predictive analytics, and any other necessary inputs. Finally, you must generate a coherent, well-designed, and easily-understood
December 2015
presentation. To the geospatial community,
smoothing out the process by synchronizing
its three parts has been the Holy Grail of
mapping. Everyone knows what they are
looking for, but no one has been able to
achieve it.
The Map of the Future
The Hexagon Smart M.App surmounts each
of these obstacles by providing the means to
access a steady stream of data, quickly
build sophisticated applications, and give
meaningful form to the data through intuitive
analytics.
To overcome the first obstacle, Hexagon
Geospatial formed data partnerships with
world-renowned organizations like Airbus
Defence and Space and Ordnance Survey
The second challenge requires finding the
right tools and the people with the right business knowledge to build smarter applications. This can be solved by using the
M.App Portfolio, which offers the cloud-based components, workroom, content and store for building, publishing and distributing a Hexagon Smart M.App. In this web-based platform, developers access a broad set of technology, analytics and workflows through an accessible, intuitive, and easily customized platform. This full integration of
data and software empowers organizations
to generate applications that ensure a constant flow of information, offering the clearest possible picture in real-time.
The third need was to revolutionize the presentation of data. Traditionally, the focus
has been on the map. But business decisions
are made based on statistics, metrics and
key data. Maps are still important for visualization and as a reference point, but the ability to present the map alongside summary
metrics in a graphical, interactive dashboard provides the clearest understanding
and true business insights.
Local Experts, Global Solutions
The value of the Hexagon Smart M.App ecosystem is that it leverages a network of partners to create these applications. By enlisting and partnering with local organizations, we can tap
into people who have the business-savvy
and regional understanding to build the specific Hexagon Smart M.App that is needed
by the industries they serve. This ensures that
the proper solutions are created and that
they answer specific, focused business
needs. Once created, partners can sell the
Hexagon Smart M.App in the M.App
Exchange, offering their ground-breaking
solution to a global audience.
The Hexagon Smart M.App makes it simple
and seamless to add timely data, focused
analytics, business workflows, and intuitive
presentation into a dynamic vehicle that can
analyze action to drive reaction—or predictive action.
Upgrade Decision-Making
All of this—the connection to near-real-time data, sophisticated tools, access to an arsenal of analytics, and utilizing partners for their domain expertise—enables building smart, simple solutions that can foster a new
decision-making paradigm. With the
Hexagon Smart M.App, decisions can be
parameter-based on near-real-time content,
allowing you to react to changing situations
as they are revealed by the map’s data flow.
With a smarter application, leaders can
more easily see when to say yes, when to
delay, and when to adapt.
The Map of the Future is here today. The
geospatial community finally has the tools to
make it happen through the Hexagon Smart
M.App and M.App Portfolio. This is an invitation to create new, game-changing applications that revolutionize the geospatial
industry.
Georg Hammerer is the Chief Sales Officer for Hexagon Geospatial,
leading the sales team and business development efforts worldwide.
Video screenshot
December 2015
Ensuring Optimal Snow Depth
Article
SlopeManagement via Satellite Navigation
16
By Daniel Maurice
Every skiing season, thousands of cubic metres of snow are produced
every day and distributed by snow groomers. If the drivers have to rely
on their “feeling”, this can result in uneven snow depths. The Austrian
company PowerGIS GmbH has solved this problem with the ARENA
SlopeManagement system: modern data networking ensures an optimal
and even depth of the existing snow. This also prevents the excess production of artificial snow, which saves resources and costs. Radio modems
from the Finnish radio data communication expert Satel are essential technological components of the ARENA SlopeManagement system.
A skiing holiday is generally not
inexpensive, which makes it all
the more annoying if there is not
enough snow on the slopes. In
times of global warming, on the
other hand, the production of too much artificial snow should also be avoided, in addition
to the fact that this process is extremely costly
for the slope operators. “For this reason, there
is actually no longer an alternative to
SlopeManagement with snow depth measurement,” explains Christoph Schmuck, Director
of Project Development at PowerGIS GmbH.
“Efficient processing of the artificial snow is
guaranteed only with the right technology.”
The solution from the Salzburg-based company has proven to be quite successful: the
ARENA SlopeManagement system is currently
in use in more than 25 ski regions – from
Kitzbühel to Ischgl, St. Anton, Lenzerheide and
Saas Fee. Altogether, more than 200 snow
December 2015
groomers are in operation with precise snow
depth measurement.
Data exchange in real time
The ARENA snow depth measurement system
is based on the GNSS technologies GPS and
Glonass. The basis for the data is a digital
model of the terrain, which is created by precise measurement of the slopes using 3D scanners during the summer. On the basis of this
precise topographical map of the terrain it is possible to calculate exactly how deep the snow is
at a given location: the difference between the
current height of the snow groomer and the
height of the original terrain is the actual snow
depth. Visibility is no longer a factor – a crucial
advantage, since the slopes are generally prepared at night or during heavy snow flurries.
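The underlying calculation is simple, as the minimal sketch below illustrates; it demonstrates the principle only, is not PowerGIS code, and uses hypothetical heights.

```python
# Minimal sketch of the snow-depth principle (not PowerGIS code):
# depth = DGPS-corrected height of the groomer minus the height of the bare
# terrain at the same spot in the summer terrain model.
def snow_depth(groomer_height_m, terrain_height_m, blade_offset_m=0.0):
    """Return snow depth in metres; blade_offset_m is a hypothetical offset
    between the GNSS antenna reference and the tracks/blade."""
    return (groomer_height_m - blade_offset_m) - terrain_height_m

# Hypothetical example: groomer at 1834.60 m, summer terrain at 1833.95 m
print(f"{snow_depth(1834.60, 1833.95):.2f} m of snow")  # 0.65 m of snow
```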
Precise measurement of the snow depth requires data exchange in real time. To achieve
this, the slopes and valleys are equipped with
a comprehensive network of base stations.
“The base station serves as a central reference
station in a reliable DGPS system,” Christoph
Schmuck explains. “The calculated correction
signal from the reference station allows us to
achieve high precision.” The base station is set
up at a precisely defined location in the ski
region. It continuously sends the correction signal to the snow groomers via radio data transmission.
For remote and hard-to-access slopes, additional repeaters are used that send the DGPS
signals up the steepest slopes and down the
deepest ravines. Depending on the terrain and
the extent of the ski region, more repeaters are
installed to achieve optimal system availability. “The Ischgl ski region is a good example,”
explains Christoph Schmuck. “It has an area
of about 400 hectares, which we can optimally cover with a base station and two repeaters.”
Figure 1: The SlopeManagement system with snow depth measurement prevents the production of too much artificial snow –
which saves costs and resources. (Photo: PowerGIS GmbH)
Figure 2: On the basis of a topographical map of
the terrain it is possible to calculate exactly
how deep the snow is at a given location.
(Photo: PowerGIS GmbH)
The basis: long-range modems
Reliable transmission of the DGPS data is
achieved by radio data modems from the
Finnish manufacturer Satel that are available
in German-speaking countries from the exclusive partner Welotec. The modems were used
in 2006 in a pilot project in Schladming on
the Planai – and have proven their capability
ever since. The joint services of the partners
Satel and Welotec are also an advantage for
PowerGIS: “We were very satisfied, for example, with the special training for our specific
application,” says Christoph Schmuck.
“Welotec and Satel prepared and implemented the training jointly and resolved the open
issues promptly.”
UHF modems of the type Satelline-3AS(d) Epic
are used in the base stations. They feature a
high-power transmitter with a maximum power
output of 10 W and two separate receivers
with antenna diversity for use in many different
types of applications. “The high transmitting
power provides substantially improved coverage and a much larger range of up to 20 km
and more,” says Daniel Maurice, Sales
Director at Welotec. The two receivers –
which operate separately, each with its own
antenna – process the DGPS signals parallel
to each other and the modem then selects the
strongest signal. This substantially improves
the connection quality in mobile applications
and in regions with numerous reflections and
interferences. This allows positioning of the
Satelline-3ASd Epic modems in difficult locations and the high transmitting power makes it
possible to bridge long distances between the
modems even with small antennas.
Figure 3: In addition to Satel modems, Welotec also supplies antennas for the snow groomers: two antennas are installed in each driver's cab.
(Photo: PowerGIS GmbH)
Optimally equipped snow groomers
The snow groomers are equipped with mobile modems of the type Satelline-EASyd. Their
low weight, the ability to display the received field strength and flexible configuration
and conversion options make them the optimal solution for installation in many different
types of snow groomers. “We need a high-quality correction signal at all times throughout the entire ski regions,” says Christoph
Schmuck. “The modems feature all the settings we need to ensure maximum availability in ski regions with many mountains and
valleys.” The high frequency bandwidth of
the modems is also important to PowerGIS:
after all, the company serves the entire
Alpine region and each country uses different frequencies.
In addition to Satel modems, Welotec also
supplies antennas for the snowcats: two
antennas are installed in each driver's cab.
One antenna is responsible for receiving the
correction signal and the other is used for
contact via GSM to the other snowcats. “The
choice of the optimal antenna allows significant advantages with respect to range,”
says Daniel Maurice. “That is especially
relevant in regions with poor accessibility.”
By the way: slope operators who use the system of PowerGIS benefit not only from optimally prepared ski regions – they can also
save up to 25 % in operating costs.
“However, the technology is only the necessary basis for this,” Christoph Schmuck emphasises. “The savings can be achieved only
if everyone involved – from the driver to the
snow machine operator, from the slope boss
to the company manager and the CEO – all
pull together.” In summary: when man and
technology optimally supplement each
other, the results – and the slopes – are
impressive.
Daniel Maurice is Sales Director at Welotec GmbH, Laer.
For more information, have a look at:
www.satel.com
December 2015
17
Marseille and 1Spatial Collaborate
Article
Creating a Central Source of Geospatial Truth
Figure 1: City of Marseille (source: City of Marseille)
18
By Thierry de Tombeur
The world is urbanizing at an astonishing rate. By 2050 it is expected that two
thirds of the world’s population will live in an urban area. These residents are increasingly mobile and tech-savvy, looking for tailored services, greater access to information and value for money from their cities. Spatial big data is underpinning the
response as city administrations, utility companies, transport providers and others
are partnering to explore ways of combining technology trends (such as big data,
social media, and the Internet of Things) to better connect cities and citizens.
Marseille is a leader in creating a Smart City.
Regional governments, local councils and mayors are increasingly coming to the conclusion that their cities need to be
smart to meet the development and improvement needs of
cities in the 21st century. According to a recent UN report (1),
54% of people worldwide live in cities and they “are where
the battle for sustainable development will be won or lost”.
As those working in the GIS space know, everything happens somewhere and only when you know where everything is can you create the connections that make cities smart. However, the challenge for modern cities
like Marseille is that spatial data gets big very quickly. For example,
alongside the location-specific material, spatial data systems may also
need to incorporate 3D information, residential records, citizen knowledge and historical data. Marseille needed to underpin its development
with trusted geospatial information to create one single source of reliable,
location-specific data.
France’s second largest city, Marseille has been inhabited since prehistoric times. The city now has a population of 860,000 and is a major center of trade and industry. Like many large, modern cities, its authorities
are particularly concerned with developing the local economy and man-
December 2015
aging the local environment; creating an attractive place for people and
businesses to locate.
So geospatial data is central to many of the issues the city needs to
manage and it has recognized that efficiently sharing authoritative data
is an important function. Pascal Giansily, the GIS Manager for Marseille,
explained to 1Spatial that the aim of the programme was to provide the
tools and address the specific data needs of the municipalities, creating
business-oriented data for city managers.
Traditionally, Marseille’s geospatial data had been held in isolated silos
within different city departments and across the city’s arrondissements.
Sharing data from many sources with a wide set of users can quickly
expose inconsistencies and the city had experienced some problems with
data quality as a result.
Marseille also shares some management responsibilities (and related
data) with the wider Urban Community of Marseille Provence Métropole
(MPM, a group of 18 neighboring towns, centered on Marseille). For
example, Marseille is responsible for public lighting and parking areas,
but the geospatial data relating to the roads is the responsibility of MPM.
Similarly, responsibility for schools, green spaces or public sports facilities sits with Marseille, while MPM is responsible for all surveying and for short-term events such as Marseille’s role as European Capital of Culture (in 2013). Access to geospatial data and tools was restricted to users of PCs with the necessary desktop applications. Separate applications were required for different functions such as topography, GIS or CAD. IT management of these tools was a time-consuming burden, as each PC had to be visited and updated individually.
Figure 2: Cadastral Base Map (source: City of Marseille)
As a result, Marseille’s GIS team wanted to simplify the administration
and management of its geospatial data and tools. In doing so, the team
saw that they could make consistent, business-oriented data more widely
available and thus speed up decision-making for city managers.
A single source of authoritative data
Marseille asked 1Spatial to help create a single source of authoritative
data, which would help resolve these issues. Together the teams developed a GIS solution that centered on a single, central data warehouse
(an Oracle Spatial database, known to the team as the Unitor) which
was accessed by 1Spatial’s Elyx Web software. The Unitor centralizes
all geospatial data, ensuring it is consistent, properly managed and authoritative. The data is managed using 1Spatial’s Elyx Manager desktop
software. Marseille also has 20 full Elyx Office desktop licenses for
“power-users”. However, 2,600 other users (across the City of Marseille
and MPM) now use Elyx Web, a web-browser based application, to
access the geospatial data they need.
According to Pascal Giansily, this works by using the web solution to centralize IT management, saving time by not updating every workstation individually. It also has big advantages for users accessing different services. Before, everything was done on separate PC applications
and the web was only used for sharing results. Now the Marseille team
can even do the drawing over the web.
The effect of this implementation is that Marseille can now offer city
managers 1,000 layers of geospatial features, sorted by theme. These
include networks and cabling, urban planning, parking, taxi data and
video protection. Updates are fed into the Unitor through Elyx Web so
that the single, central database remains authoritative and there is no
duplication or inconsistency. As Pascal Giansily says, “Everything is centralized and managed within the data warehouse. It’s very powerful!”
Seamless and efficient data-sharing
The single source of authoritative data means that accurate, relevant spatial data can now be accessed by any authorized user without the need
to negotiate with different departments. The improved access and increased recognition of the value of spatial data means that Marseille’s data
is being used to improve decision-making in more and more scenarios.
For example, in 2013 when Marseille was the European Capital of Culture and running many events, a project team used Elyx Web to easily create a list of potential event sites complete with information on safety requirements and capacity.
The city’s Event and Party department manages applications for public events such as cycle racing or markets. The department checks each application with other city departments and emergency services to ensure there are no planned road-works or other potential problems before issuing an authorization. The team now uses Elyx Web to display the event location and all relevant layers of information to anticipate any issues.
It even helps with TV work. The popular television soap opera Plus Belle la Vie is set in Marseille and the producers make regular requests to film around the Vieux Port area of the city. Again, Elyx Web speeds up decision-making by enabling all city departments to quickly share relevant information.
Finally, Marseille’s geospatial data warehouse can now easily feed data
to other applications, enabling the city to share valuable information with
residents and visitors. For example, Marseille now automatically produces public information on urban planning applications, removing what
was a regular administrative bottle-neck. Information on the location of
public facilities such as schools, town halls and car parks is also fed to a
public web portal.
Looking to the future
This isn’t the end of the process of being a smart city. Pascal Giansily
and his colleagues continue to add new data sets to Marseille’s data
warehouse, further increasing the value of this central, authoritative
view. At the same time, departments all across Marseille are finding
new applications for their geospatial data.
Several departments are exploring how they can use data to locate
employees and equipment as they move around the city. Another team
is using the data to assist in the placement of CCTV cameras. This is
city’s first use of three-dimensional geospatial data and it uses Elyx 3D’s
intervisibility capability to check the angle of a new camera for potential
blind-spots, shadowed areas or buildings obstructing the view. Pascal
Giansily is keen to explore further uses for Elyx 3D and the possibilities
that three-dimensional data offers.
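As a rough illustration of what such an intervisibility test involves (a simplified sketch, not Elyx 3D’s algorithm; the grid and heights are hypothetical), the check below samples surface heights along the line between a camera and a target and reports whether anything blocks the view.

```python
# Simplified intervisibility sketch (not Elyx 3D): sample the surface model
# along the straight line between camera and target and look for obstructions.
import math

def line_of_sight(surface, camera, target, samples=100):
    """surface: dict mapping (row, col) -> surface height in metres (hypothetical grid).
    camera, target: (row, col, height_m) tuples. Returns True if the view is clear."""
    (r0, c0, h0), (r1, c1, h1) = camera, target
    for i in range(1, samples):
        t = i / samples
        cell = (round(r0 + t * (r1 - r0)), round(c0 + t * (c1 - c0)))
        ray_height = h0 + t * (h1 - h0)                 # height of the sight line here
        if surface.get(cell, -math.inf) > ray_height:   # building or terrain in the way
            return False
    return True

# Hypothetical profile: a 12 m building between an 8 m camera mast and a target
surface = {(0, 0): 2.0, (0, 1): 12.0, (0, 2): 2.0}
print(line_of_sight(surface, camera=(0, 0, 8.0), target=(0, 2, 2.5)))  # False
```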
For Marseille, the aim was to improve decision-making by providing
access to consistent, authoritative data which could be shared between
city and public applications. As Pascal Giansily notes, the city now has
a solution which works with everything related to data administration.
The challenge for Marseille is no longer to manage the tools, but to collect new data.
Thierry de Tombeur, France Account Manager, 1Spatial.
(1) A New Global Partnership: Eradicate Poverty and Transform Economies through Sustainable
Development (2013).
December 2015
19
New Technologies for City Asset Mapping
Article
Setting the Standard in City Digitalisation
20
Figure 1: Able to capture data indoors or outdoors, the Leica Pegasus:Backpack is an ideal solution for city asset collection. Checking mission status during reality capture is easy via the wireless tablet. © A.Breysse-WMA
By Marie-Caroline Rondeau
Roads. Building facades. Road signs. These are just a few of the assets
surveying firm Cabinet Brière needed to map in the towns of Alès and Gap to
provide a 3D digital map of the communes in southern France. Needing to better
understand, operate and manage the country’s electricity distribution network,
using these two towns as standards, the Électricité Réseau Distribution France
(ERDF) contracted the 58-year-old firm known for revolutionising working
methods, both in the office and in the field, with new technologies.
A unique combination for innovation
In 2015, the French government instated the
Plan Corps de Rue Simplifié (PCRS) as a
means to share plans of infrastructure
between communities. This sharing of information promotes public safety and opens
dialogue between community leaders.
To meet this requirement, ERDF needed a digital map of the entire cities of Alès and Gap to compare the two cities’ historic and more modern infrastructure assets. ERDF turned to a pioneer in 3D surveying. Cabinet Brière President Philippe Jeudy was one of the first users of the Leica Pegasus, a mobile reality capture platform attached to a vehicle for 3D laser scanning of active environments. Jeudy and his partner, Guy Perazio of Perazio Engineering, first tested the Leica Pegasus in 2012. Both seeing the opportunities in the mobile reality capture technology, they decided to invest and develop more business with their new capabilities.
Three years later, when the duo responded to
the ERDF’s solicitation for digitalising the two
cities’ assets, Jeudy and Perazio were by this
time now experts in mobile reality capture.
With their even more versatile Leica
Pegasus:Two and the new Leica Pegasus:Backpack, the wearable reality capture
solution with a LiDAR profiler and five high-dynamic cameras, the team was able to capture more than ever before.
“With the Leica Pegasus, I can complete any
project without any technical limitations,” said
Jeudy. “Combining various forms of innovative technology, like SLAM (Simultaneous
Localisation and Mapping) and LiDAR, Leica
Pegasus provides an integrated yet unique
mapping solution.”
Leica Pegasus relies on the combination of a
point cloud acquired through a 3D laser
scanner, an image achieved through highdensity cameras, a GPS sensor for defining
the absolute position, and an inertial mea-
Figure 2: With the Leica Pegasus: Backpack, every detail in a typical cityscape
is captured. The completed dataset can be reviewed via the Leica Pegasus:
MapFactory for ArcGIS, an extension of Esri’s ArcGIS.
surement unit to record all movements.
When in GNSS-denied areas, such as under
bridges or inside enclosures, frequently
found in mapping city assets, the Leica
Pegasus: Backpack with SLAM technology is
the first position-agnostic solution capable of
orientating itself to capture data. Working
with these images and point clouds together,
data is captured into a single platform and
workflow – from the operator interface to a
single-click post-processing.
The future of city asset mapping
Using Leica Pegasus, the team was able to increase its productivity roughly thirty-fold, collecting a record 600 kilometres in five weeks for the entire city of Alès. Using more traditional methods, such an acquisition would have taken Jeudy more than two years. The cities of Alès and Gap now have a complete digital map of their assets. City leaders can now feed the intelligent information into ERDF’s database and communicate any new developments or construction.

In Alès and Gap, Cabinet Brière realised more than 90 percent of the asset acquisition with the Leica Pegasus:Two, while the rest was performed using the new Leica Pegasus:Backpack. Able to go where vehicles couldn’t, Jeudy collected assets inside homes, in parks with thick overhead foliage and on pedestrian-only streets wearing the Backpack. On his first outing with the new technology, he found the familiarity accommodating.

“With the Backpack, I was comfortable since everyone knows how to wear one, which made the acquisition simple and fun,” said Jeudy. “We need to enjoy our daily work while also providing accurate deliverables, and with Leica Pegasus, surveying becomes even more enjoyable and accurate.”

According to Jeudy, the city of Alès is pleased with the practical implementation of these new and highly innovative technologies. As one of the pilot cities for ERDF, the city is setting the standard for the rest of the country in city digitalisation.

“Thanks to the comprehensive acquisition with the Leica Pegasus, the point clouds we produced give others a basis for creating new reports in 3D city modelling,” said Jeudy. “To be the one to set that standard is a great achievement for us and our client, the city of Alès.”
For more information, have a look at: www.leica-geosystems.com.
Figure 3: The Leica Pegasus:Backpack is the wearable reality capture solution combining five high-dynamic cameras and LiDAR. The Leica Pegasus:MapFactory software for ArcGIS enables the user to access both the point cloud and spherically stitched images within the same interface for faster content extraction. © A.Breysse-WMA
Data Delivery Without Software
Pau River Flood Mapping
By the editors
Enabling the end user to access and utilize UAV data is a challenge currently faced by many
in the industry. This article describes a recent UAV project in France where high resolution,
spatially accurate, 3D data is delivered to the client, enabling them to view in 3D, digitize
and measure features on the model, without any software.
In October 2015, a series of violent
storms swept the South of France. The
intense runoff from the Pyrenees mountains caused severe damage along Le
Gave de Pau, changing the course of the
river significantly.
In order to quickly assess the extent of the
damage, in a real, measurable, geospatial
sense, “The Union” (the local government
body in charge of the embankments) looked
for a UAV solution. Manned aerial or satellite
data would not provide the required resolution, would be affected by cloud cover and be
less economical on a project of this scale. The
Union commissioned Drones Imaging (a geospatial data company) in partnership with AIR
CITY Diagnostic (a French certified UAV operator) to acquire accurate spatial data of the
devastating effects of the flood.
Well equipped to collect and process data of
this kind, the team set to work. Not restricted
by cloud cover as with traditional aerial or
satellite operations, the project was flown the
day after commissioning and the data delivered after two days.
Accessing and utilizing the data
The study focused on eight different parts of
the river for a total length of about 1300m.
From high resolution geo-referenced aerial
images, a 3D digital surface model was produced, rendered with the orthophoto image.
This resulted in a spatially accurate model with
spectacular detail, to be used by the Union for
assessment of the damage and to determine
what future actions can be taken.
The Union, however, are not geospatial
experts and don’t run a GIS system to analyze
data like this. They required some basic
access to the data, to view changes in the river
and measure the extent of those changes.
Enabling the end user to access and utilize this
data is the challenge currently faced by many
in the industry. Drones Imaging was able to
deliver this high resolution, spatially accurate,
3D data to their client, enabling them to view
in 3D, digitize and measure features on the
model, without any software.
"Our principal business is the production and
delivery of geospatial data for all types of end
customers: very small companies to major
accounts. So, we are aware of the problems
of big data transfers and customers who do
not have software solutions (GIS) or computer
systems tailored to geospatial deliverables,” explains Loïc Hussenet, CEO & founder of Drones Imaging.
A turnkey solution
Drones Imaging supplied The Union with a
series of eight orthomosaics (3GB in total), a
digital terrain model and vector digitizing on
the model. Normally this data would have
been sent via ftp or physical flash drive in the
mail, to be loaded onto the client’s computer
and accessed with GIS software. In this case,
the people that needed the data were not GIS
experts but executive, operations, and administration personnel.
“Considering the volume of data to transfer
(orthomosaics, plus digital terrain and markups), we decided to deliver our data using
4Dmapper. This offered a turnkey solution to
the client. They received their high resolution
3D data bundled in a platform for visualization and exploitation, without downloading
the data or operating a local GIS, something
that the customer does not always have.”
Equipped with their local fiber optic broadband, Drones Imaging uploaded the data into
4Dmapper in about ten minutes, where it was
automatically processed for streaming. They
were then able to immediately share the data
with the Union and its colleagues via a simple
email. The 3D data, and platform to access it,
was delivered with just a URL, to be opened in
the client’s browser. No software, no data,
just a browser.
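As a rough check, the quoted figures (about 3GB uploaded in roughly ten minutes) imply the sustained throughput computed below; the byte-to-bit conversion is the only assumption.

gigabytes = 3                        # total size of the delivered orthomosaics
seconds = 10 * 60                    # quoted upload time
mbit_per_s = gigabytes * 8_000 / seconds
print(f"~{mbit_per_s:.0f} Mbit/s sustained")   # about 40 Mbit/s, plausible on fibre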
The customer was at first surprised, and then
quickly embraced the user interface, with the
capacity to view in 3D, blend imagery with
background data, measure and digitize, all
on their browser. In turn they could share the
data with the various internal and external services involved in the project, including the City
of Pau technical services.
“At 4DMapper, our focus is leveraging the
value of geospatial data by enabling access to
the people that need it; executives, stakeholders, operations personnel and so on. We
make it easy to stream big 3D geospatial data.”
explains Rob Klau, Director of 4DMapper.
The boom in acquisition technology with
UAVs, laser scanners and high res satellite
and aerial imagery means more data, in 3D,
at higher resolution, and more often. Enabling
access to this data is essential, with simple
delivery, visualization and measurement tools,
to empower the people who need to use it.
“In this case, dissemination and exploitation of
our data was almost immediate for the client
and its partners. Today we are in talks with the
City of Pau to work on other projects” highlights Loïc Hussenet.
Watch this quick video showing the project delivery:
https://vimeo.com/144936777
For more information, have a look at: www.dronesimaging.com and
http://4dmapper.com.
3D STEREO
MAPPING SOLUTIONS
[email protected] | www.datem.com
+1 907.522.3681 | 800.770.3681
Anchorage, Alaska, USA
Breaking Down Barriers to Data Collection
Mapillary and the Power of Community Mapping
By Peter Neubauer
Primitive cartographers once worked from their own visual analysis of the land, and
some of today’s most advanced mapping software works from the same perspective.
The machine vision experts behind Mapillary are innovating in the ever-expanding field
of mapping applications by utilizing street-level photos. The app collects location data
from crowdsourced images, creating a 3D reconstruction of the world. Once the map is
made, the uses multiply.
An example of a building reconstructed from photos. The positions of the original photos are labeled by the white rectangles. The color of each point, fixed in three dimensions, is recovered from the original image.
Introduction
Jan Erik Solem makes computers see the world in 3D. He began in academia, completing his Ph.D. on facial recognition software. His doctorate served as the foundation for his first startup, Polar Rose, innovating
image search through computer vision, which he sold to Apple in 2010.
Solem wears many hats: associate professor at the Mathematical
Imaging Group at the Lund Institute of Technology; World Economic
Forum Technology Pioneer; founding member of the Lund University
Fund for Open Source Software; author of Programming Computer
Vision with Python. He’s also the co-founder and CEO of Mapillary, a
startup named for sprawling networks of roads that are capillaries for
human travel.
Mapillary’s origins start very close to home for Solem. While Google’s
cameras drove down streets in major cities, his Swedish town, Bjärred,
remained undocumented. It became clear to him that the Google model
of street mapping the world was flawed. Photos, once taken, were rarely updated, and entire countries remain unmapped to this day. The
areas mapped by Google were chosen based on global priorities, not
their relevance to individual users.
Solem’s solution was to put mapping in the hands of the masses. With
two friends, he launched a mobile app and a website that allow anyone
to upload their own street-level photos. Mapillary was created to map
every part of the world, with frequent updates and layered photos that
show a place through the perspective of the mapper. Anyone with a
GPS-enabled camera and an internet connection can contribute to this
street-level view of the world.
The streets of Bjärred are well mapped now, thanks to Solem’s weekend
bike rides. There are several routes through Aizuwakamatsu Castle in
Japan on Mapillary. The technology is finding humanitarian applications as well: the Red cross is documenting the streets of Haiti and the
World Bank is working in Dar es Saalam to identify areas at risk for flooding. These maps are coming from the communities that they belong to,
and being used and shared.
Competitive “landscape”
When it comes to the mapping industry, Google Street View dominates, as it does in many industries. But despite the brand recognition,
the applications of Google Street View’s photos remain exceedingly
limited. Once a user leaves the tech giant’s mapping platform, the
photos are unavailable. Other maps created by tech giants such as Apple and Bing (part of whose mapping assets were acquired by Uber earlier this year) also maintain proprietary photos.
It’s an oligopoly that has created frustrations for the GIS community
and for general consumers for years. For example, this summer
Strava users were frustrated when the switch from Google Maps to
Mapbox eliminated all of the street-view photos on the app, because
they were owned by Google. Mapillary’s cyclist user base has gone
as far as mapping half of Burning Man, and hikers have documented
travels all over the world.
Several new mapping clients emerged as a response to the limited
choice offered by the Googles of the world. Mapbox provides
custom online maps for all kinds of online and mobile platforms, and
OSM offers an open source, royalty-free, editable map of the world.
What sets Mapillary apart is the spirit of open source. Mapillary’s
photos are available under an open license, which allows the photos
to be used as needed, and the data from them to improve other maps
like OpenStreetMap. Additionally, many of its tools — including key
computer vision software — are open sourced and on Github. The
traffic sign font is open sourced. Not only is Mapillary crowdsourcing its photos from the people who know the landscape best, but the company is also crowdsourcing its technology and data to build a people-first mapping client and provide an alternative to Google Street View.
Mapillary currently has almost 40 million photos covering over one million kilometers. With hundreds of thousands of photos added to the
platform every month, the race for a mapped world seems to be
going well. In 2013, Google Maps had around 10 million km of
road, the same year that Mapillary started. But comparing the mile-
age of dedicated vehicles with high end panoramic cameras to a
worldwide user base with mostly personal, hand-held camera phones
misses the true difference.
The photo database on Mapillary is dynamic and evolving in all corners of the world on a daily basis. If a group decides to map a city
in Africa in the morning, their photos could be accessible to the entire
world by teatime. It will take Google years to get significant coverage of developing nations. Google Street View has faced setbacks in
the developed world as well, with lawsuits and government intervention delaying coverage from Germany to Japan and Australia.
Machine vision with crowdsourced photos
Though the community contribution is straightforward, the work isn’t
over when the upload completes. The photos are first placed onto a
map through their GPS coordinates, then computer vision software
stitches them together according to space and time to provide an
immersive virtual view of the world. Computer vision also acts as a privacy mechanism — blurring faces and license plates — and a surveyor, mapping street signs and pavement quality without human input.
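A minimal sketch of the privacy step described above, using OpenCV's stock face detector to blur detected faces; Mapillary's production pipeline is more sophisticated (and also handles license plates), so treat this only as the general idea.

import cv2

def blur_faces(in_path, out_path):
    img = cv2.imread(in_path)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Blur every detected face region in place
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        img[y:y + h, x:x + w] = cv2.GaussianBlur(img[y:y + h, x:x + w], (51, 51), 0)
    cv2.imwrite(out_path, img)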
Mapillary supports any photo with GPS information, including panoramas, 360-degree photos, and photospheres that show the world
from every angle. Even full videos can be uploaded to be automatically broken into stills, which are then included in the map.
One of Mapillary’s most interesting projects is OpenSfM, an open
source method for reconstructing an environment in 3D with data
from multiple images called “Structure from Motion.” By computing
the relative camera positions of each photo, OpenSfM can not only
improve the positioning and connection between Mapillary photos,
but also reconstruct the shape of the landmark photographed and
extrapolate the surface of the structure. The end data is reminiscent of
a Lidar map, with relative depth found at low resolution.
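The core idea behind Structure from Motion can be sketched for just two overlapping photos: match features, recover the relative camera pose from the essential matrix, and triangulate the matches into 3D points. The snippet below uses OpenCV and an assumed intrinsic matrix K; OpenSfM itself handles many images, estimates calibration from EXIF data and refines everything with bundle adjustment, so this is purely an illustration.

import cv2
import numpy as np

def two_view_points(img1_path, img2_path, K):
    img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)

    # 1. Detect and match local features between the two photos
    orb = cv2.ORB_create(4000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # 2. Relative camera motion from the essential matrix
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # 3. Triangulate matched points into a sparse 3D point cloud
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T      # N x 3 points in first-camera coordinates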
Between one photo and the next, this 3D reconstruction makes all the
difference in Mapillary. In an area with a high density of photos, the
transition from one view to the next is continuous. The path of every
point is back-solved between one photo and the next, and the objects
can follow that path instead of relying only on the original photos.
The edge of a house doesn’t snap from the center of the screen to the
edge like a slideshow, but moves smoothly.
The 3D reconstruction is what allows the shapes to transform based
on perspective. Mapillary doesn’t just pin photos on the map; it integrates the photos with their context through advanced computer
vision across time and across photographers.
Cities at work
With over 40 million photos mapping over one million kilometers, the
massive participation in Mapillary is already impressive, but the com-
The capillaries correspond to the density of pictures uploaded by Mapillary’s community. The photos noticeably cluster around population hubs but more remote areas also get coverage, perhaps showing a
favorite hike or beautiful highway. The map is provided by Mapbox.
pany is more than a social network for photo-mapping enthusiasts.
Solem and his team are working to realize the potential of the wealth
of data contained in these street-level photos.
Mapillary isn’t the first to recognize the analytical potential of street
level photography. Municipalities from Sweden to Australia have
paid contractors to take photos of every road in their jurisdictions to
monitor pavement quality. Often, the third party also hosts the photos for a monthly fee and the cities never own the rights to the photos.
These companies, like Cyclomedia and Here Maps, resell access to
the photos to other entities for added income.
Even if a city owns the rights to its photos, it would need a platform from which to explore these images depicting thousands of miles of roads. So even after the data is collected, the frustrations of using it can overwhelm the benefits. The photos end up on a forgotten hard drive instead of being a useful tool.

Mapillary’s benefit to municipal governments is that it can take their photos and host the collection for free on its public platform. If cities want to ramp up their GIS capabilities, Mapillary offers the option to couple the photos to ArcGIS, allowing the cities to securely update and edit their internal GIS data directly.

A city doesn’t need a complete survey to take advantage of Mapillary: its inspection teams can take photos that go straight back to the GIS team, or local citizens can participate in the maintenance of their home town by taking photos as they go about their day. What used to be contingent on a third party is in the hands of the community.

Conclusion
Still a new company, Mapillary is breaking down barriers to data collection. The potential power of photo-documentation of the world’s streets has hardly been realised, and analytical efforts will continue to evolve. Collection methods change too. Humanitarian partnerships, from the Red Cross in Haiti to the World Bank in Dar es Salaam, make distant streets globally accessible. Hikers and cyclists use maps to document their adventures, but new demographics will experiment and find new applications. New technology leads to new ways of looking at the world, in this case literally.

For more information, have a look at: www.mapillary.com.
Using Geographical Data
Driving EU Agricultural Payments in Austria
By Claudio Mingrino
In September 2013, Intergraph was contracted to undertake a major upgrade of the
IACS-GIS system at AMA to modernize and extend the capabilities, allowing farmers to
capture parcel extents and attributes via a Web Portal. In March 2015, the upgrade
project was officially completed with an initial roll-out to a select group of farmers followed by a second phase in May 2015, which brought in more farmers and institutions.
Geomedia Smart Client UI
As part of its commitment to farmers, Austria works with the EU
to help control agricultural subsidies to the nation’s farmers.
AgrarMarkt Austria (AMA)
manages approximately €1.8 billion in subsidies and must ensure that eligible farmers
receive all the payments to which they are
entitled.
With such an important mission, AMA requires highly progressive geospatial data
management solutions. Because of their nature, these solutions must support real interaction with farmers who can submit geographic
and attributive details on their lands and
crops to qualify for their subsidies.
In 2003, AMA first contracted with Intergraph
to develop a system that would support the
management of funds and payments based on
geographic data. Built on ResPublica Intranet
GIS Software, this Integrated Accounting &
Control System (IACS) was the predecessor to
Hexagon Geospatial’s GeoMedia Smart
Client software.
Upgrade
In September 2013, Intergraph was contracted to undertake a major upgrade of the
IACS-GIS system at AMA to modernize and
extend the capabilities, allowing farmers to
capture parcel extents and attributes via a
Web Portal. The AMA selected GeoMedia
Smart Client for its ability to provide optimized, map-based workflows that enable the
capture and edit of farm field boundaries and
associated data. The software also allows the
AMA development staff to simplify the process of maintaining the agricultural data.
Essentially, these highly-honed processes provide farmers with tools for digitizing, splitting, merging, and measuring agricultural
parcels. Orthophoto data is delivered as a
backdrop using ECW streaming from Erdas
Apollo Essentials server from Hexagon
Geospatial. This optimized form of raster
delivery easily serves 5.5TB of imagery covering 100 percent of Austria, enabling
120,000 farmer-users to view highly detailed and accurate imagery to assist in digiti-
zing 4 million agricultural
land use areas.
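The parcel operations mentioned above (merging, splitting and measuring field boundaries) can be illustrated with a few lines of open-source geometry code; the coordinates below are invented and this is not the GeoMedia Smart Client API, only the underlying idea.

from shapely.geometry import LineString, Polygon
from shapely.ops import split, unary_union

# Two adjacent parcels, coordinates in metres
parcel_a = Polygon([(0, 0), (100, 0), (100, 60), (0, 60)])
parcel_b = Polygon([(100, 0), (180, 0), (180, 60), (100, 60)])

merged = unary_union([parcel_a, parcel_b])                  # merge
parts = split(merged, LineString([(90, -10), (90, 70)]))    # split along a new boundary

for i, part in enumerate(parts.geoms):
    print(f"parcel {i}: {part.area / 10_000:.2f} ha")       # measure in hectares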
In March 2015, the upgrade project was officially
completed with an initial
roll-out to a select group of
farmers followed by a second phase in May 2015,
which brought in more farmers and institutions.
In addition, GeoMedia Smart Client facilitates the production of 500,000 A3 color
prints automatically, twice a year. For years,
these paper prints have been utilized by farmers and inspectors on the ground to verify
system accuracy. In the next phase, AMA
will move to printing A3 color prints only on
demand, for inspection purposes.
For the AMA, this new upgrade allows the
organization to more effectively and accurately leverage geospatial data for handling
EU subsidies for farmers in Austria. The
effort also helps reinforce the value and viability of agriculture in a nation where this is
a major element of its economy.
Claudio Mingrino is the Executive Director for Europe, Middle East and
Africa for Hexagon Geospatial. He oversees the management and
development of the Hexagon Geospatial (HGD) portfolio sales through
the global Indirect Channel as well as developing strategic partnerships and alliances in Europe, Middle East and Africa for the growth of
the Geospatial business of HGD in all the market segments.
Building Location into Apps
Esri European Developer Summit 2015
By Eric van Rees
The Esri European Developer Summit in Berlin offered three days of tech sessions on Esri’s latest geospatial technology developments.
Some of the take-aways included the new ArcGIS JavaScript API,
cross-platform development using Xamarin, and Quartz, Esri’s
next-generation version of ArcGIS Runtime.
Introduction
More than 300 geodevelopers gathered for
the Esri European Developer Summit, held in
the Berlin Congress Center from November
10th-12th. The program provided three days
of parallel tracks, offering more than 50
workshops devoted to building apps which
use Esri location technology, including
Python, JavaScript, .NET, OS X, and building
native apps for iOS and Android. Apart from
technical workshops, there were user presentations and a tech startup track. Various social
events completed the conference.
The event proved to be an excellent opportunity to learn all about Esri’s latest technology
updates - in fact, some North American visitors didn't want to wait until the Developer
Summit in Palm Springs in March next year
and decided to join this event. They made a
wise decision in doing so, as the Esri staff
onsite showed the latest technology updates,
which they received just before they boarded
the plane to Berlin. It was good to have the
event back in Europe, as the last event was
two years ago.
Plenary session
A three-hour plenary session opened the
event on Tuesday 10th November. The first
half of the plenary was used to discuss Esri’s
vision on location technology. Not surprisingly, the most important and dominant technology that underlies almost everything at Esri currently is web GIS: one look at the session
matrix from the event showed that the majority of the sessions were focusing on GIS and
the web. What used to be a back-end technology is now distributed over the web through
browsers and devices. The other two initiatives discussed were 3D and real-time GIS.
In the field of 3D, Esri has a number of different software offerings all targeted at different
types of users. ArcGIS Pro is the latest Esri
Desktop application for 3D GIS, but there's
more: Esri just released ArcGIS Earth in beta,
an application that works well for sharing 3D
information with users, for instance in the KML
data format. Real-time GIS is all about the
integration and exploitation of streaming data
into ArcGIS, where continuous processing
and real-time analytics takes place. Updates
and alerts are sent to those who need them and to where they need them. The most popular
use case for this is most probably automotive,
but basically any device that has a physical
location can be monitored in real-time
(Internet of Things, IoT).
The second half of the plenary showed how
the developers work with different APIs to ensure everything works. “Developing for the
enterprise” was the header of Euan Cameron’s presentation showing what Esri has been
developing lately for this field. He introduced
the Server Object Interceptors (SOI), which
are new at ArcGIS for Server 10.3.1. These
extend the capabilities of existing GIS servi-
Yann Cabon discussed what's new for the ArcGIS API for JavaScript 4.0.
Berlin, Germany (source: Wikipedia)
ces and allow users to intercept requests for
existing built-in operations of map or image
services. The use cases for SOIs can be found in security and in data and business integration. Other worthwhile new releases are
ArcGIS Pro 1.2 (planned for Jan/Feb 2016),
an ArcObjects update and integration
between ArcGIS and R, a programming language and software environment for statistics
and data analysis used in the academic
world. During the plenary session, new recent
technological initiatives such as smart mapping and vector tile maps were presented and
a sneak-preview was given of GIS analysis on
devices, making use of GPU power.
ArcGIS API for JavaScript 4.0
The ArcGIS API for JavaScript is a lightweight
way to embed maps and tasks in web applications, bringing ArcGIS to any device. The
new ArcGIS API for JavaScript 4.0 was discussed in the plenary and after that in great
detail during an hour-long session by Yann
Cabon. In a nutshell, the new API has been
simplified; it brings 3D capabilities, making
WebMap and WebScene first class citizens.
It shares lots of common patterns with Quartz
Runtime SDKs (see below) and APIs and widgets have been redesigned. In addition, it
offers a new architecture and also has a new
folder structure. The reason why it has a new
architecture is because the main idea is to
bring 3D into the API. To make this happen, a
WebGL engine will display a globe. There
will be new 3D symbols, support for simple
symbols and z/m support in the API. It will
offer a simpler and more consistent API
design, feature a mobile-first design and offer
better integration with frameworks. It is currently available in beta II version.
ArcGIS Runtime
For native app development, there was a lot
of news about ArcGIS Runtime. A tech session on how to get started with ArcGIS
Runtime discussed its architecture and what
you can do with the different Runtime SDK’s,
such as analysis, pop-ups and graphics.
Quartz is the name of the next generation of
ArcGIS Runtime. Although not released yet,
some of the new capabilities were discussed.
First of all, it is a major release with many
new capabilities with new and changed APIs,
offering an improved internal architecture. Its
goals are to offer support for the ArcGIS platform and ArcGIS Engine developers moving
to ArcGIS Runtime, synchronize APIs across
ArcGIS Runtime platforms and support specific user workflows. Quartz will include working with maps, portals, scenes for 3D and
layers. Working with ArcGIS Runtime has a
number of benefits: the main one being that
it's integrated into the ArcGIS platform so that
it works with services and local content. It runs
natively on modern devices, exploiting the
capabilities of the device. Intuitive APIs make
it accessible to all developers, while SDKs
make developers more productive.
Xamarin
Coming out next year is Esri support for Xamarin, a third-party technology aligned very closely with
Microsoft that offers iOS and Android development. It is a software suite for cross-platform .NET development, offering libraries for
iOS and Android and development tools for
Windows and OS X. Esri’s Euan Cameron, Chief Technology Officer for Applications and Runtime, explained the technology as follows:
“Xamarin offers a way for .NET developers to
take their C# or VB.NET development skills,
use Visual Studio as the development tool, but
compile natively down to Android or iOS.
We have Xamarin pieces that we will ship as
part of our .NET SDK, so as a .NET developer, you can write your own code. Primarily,
you write your code targeting form factors,
and then once you’ve got your code written,
you can very quickly move it from a Windows
phone to an iPhone or to an Android. It’s an
exciting technology, because the vast majority of our native developers today are using
.NET, on Windows.”
For more information, have a look at:
www.esri.com/events/devsummit-europe
The Esri DevSummit in Palm Springs will be held
from March 8-11, 2016.
The Last of its Kind
Esri European User Conference
By Remco Takken
Traditionally, the European edition of Esri’s User Conference abounds with interesting
speakers and other participants from the organising party, including Californian headliners like Esri founder Jack Dangermond, or Esri’s National Security Coordinator Chris
Dorman, who provide the icing on the cake.
Neighbour states can learn from each other, appreciate each other’s differences and take instruction on the things they all have in common: the ArcGIS platform. Esri President Jack Dangermond: “Web GIS connects all our existing systems. It’s a system of systems.”

Austrian Esri Distributor Synergis’ Peter Remesch welcomed his guests to the European EUC 2015 in Salzburg.

Amt der Landeshauptstadt Bregenz
One of the Special Achievement in GIS Awards went to Amt der Landeshauptstadt Bregenz. The office provides highly detailed and quality-proofed geodata of all sorts: cadastral, survey, infrastructure, water, sewer and tree inventory. The district capital provides up-to-date data for its internally and externally shared GIS, which serves more than twenty ArcGIS for Desktop expert users, in excess of a hundred internal ArcGIS for Server users and more than one thousand daily users thanks to ArcGIS for Server. Their winning system enables integrated vertical workflows spanning a document management system, a resident register, property owner and tax registers and CAD, so they can provide and support decision makers with up-to-date geoinformation for fast and accurate administrative decisions (e.g., construction approval, rated tax). They also involve citizens to enhance administrative processes.

Astrid
Astrid, a dispatch application in Belgium, was another winner of a Special Achievement in GIS Award. For operators in the control rooms, it is of utmost importance to locate an emergency call exactly and quickly. The aim of the project was to build an efficient and user-friendly GIS platform in order to provide a wide range of geographic information from a central GIS server to a large number of end users. Maps are never 100 per cent correct and up to date, as streets, addresses etc. are sometimes missing or have incorrect information. With the Astrid Geoportal, operators of emergency control rooms can simultaneously consult several maps from various providers and scan the databases in a single query to retrieve the information required to deal with the intervention. With the Geoportal and the cooperation of a large number of map data providers, Astrid establishes a direct link between users and suppliers in order to provide faster access to map updates with lower maintenance costs. With Geoportal, the control room can also communicate with the units in the field. The solution, implemented by Belgian system integrator Siggis, is based on ArcGIS and on Latitude Geographics’ Geocortex.

Technical Sessions
Technical evangelist Bernie Szukalski put in long days during the User Conference in Salzburg. He led his audiences through thorough, all-encompassing but also entertaining and informative sessions on Portals & Content (the ‘living atlas’ as case in point), the use of Story Maps and numerous sessions on ArcGIS Online. In one of his ‘lab workshops’, Szukalski subtly showed how human communication skills have become prevalent over programming skills in ArcGIS Online.

Every time technical evangelist and product strategist Bernie Szukalski took the stage, he enthusiastically enlightened his audience with practical guidelines, do’s and don’ts and best practices on ArcGIS Online and Story Maps.

Ireland’s Call
The ‘Maps Make Sense’ competition set entrants the challenge of creating a story map which would inform and inspire. The map that ‘met and exceeded’ these criteria and was, therefore, selected as the overall winning map was ‘Ireland's Call - to return its global diaspora home’.
The winning map was created by Rosita
Mahony at SPACEial North West for
Donegal County Council’s Diaspora
Project. The development of the story map
involved considerable research around the
Irish Diaspora community and the hard
economic facts and statistics to understand
why they had to leave. A keen desire to
find out where the Irish born population
now live, led to the discovery of the UN
dataset which allowed the time enabled
layer to be developed with the heat map
depicting the global spread very effectively. Considerable data mining and mapping was involved from many sources to
produce the ‘competitive house prices’ and
‘global job prospects’ web maps and the
research into using the analysis tools to
best visualise and symbolise the ‘healthiest
towns’ and ‘shortest commute’. Now
equipped with relevant information to base
their decision on, there is a call to action to
participate in the Donegal Diaspora
Global Skills Locator to connect with job
opportunities back home. It is hoped this
story map will reach the Irish Diaspora
abroad, making available relevant information via a series of web maps, to those
who are contemplating returning. The
story map brings emigrants on a journey
with hard economic facts to understand
why they had to leave but also, to inform
them that economically, things are now
improving and to visually present some
factors to consider for their return home.
1Spatial
CEO and Esri founder Jack Dangermond during his keynote speech at the European
EUC 2015 in Salzburg.
120 pupils from six schools in Salzburg worked on the development of a youth-centric
web map throughout the 2013/2014 school year. Every class in the school was
responsible for a particular task; project management, specifying requirements, collecting and managing spatial data and implementing the map. During the process, they
were guided and supported by the University of Salzburg / Faculty of Geoinformatics - Z_GIS, the City of Salzburg, and SynerGIS, the Esri Distributor in Austria.
At the Expo, Esri partner and British data quality specialist 1Spatial presented its new product ‘1Integrate for ArcGIS’. It provides
automated data validation and management for the ArcGIS platform. This solution
helps improve data so that business processes are fit for purpose in decision
making, customer engagement and regulatory compliance. 1Integrate for ArcGIS
creates ‘smarter data for a smarter world’
by providing automated data validation
and management for the ArcGIS platform.
It helps to improve data across the enterprise, to ensure that business processes are
fit for their role supporting decision-making, customer engagement and regulatory compliance. True to 1Spatial’s traditional core business, which is data quality,
1Integrate allows the user to assess the
quality of data to ensure it meets defined
specifications and is fit for purpose. It also
performs data re-engineering tasks, such
as cleaning data, transforming data or creating new data from existing data assets.
Key features of 1Integrate for ArcGIS include capturing data management requirements as business rules, automated application of business rules to spatial and non-spatial data, pinpointing the exact location of errors in failing features, rules-based data re-engineering tasks, and automating the creation, update, integration and transformation of datasets. It also creates and manages multiple rule sets for different data products and services.
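As a rough stand-in for what such rules-based validation looks like in practice, the fragment below checks invented feature geometries against two simple business rules; it is not 1Integrate's API, only an illustration of the pattern.

from shapely.geometry import Polygon

rules = [
    ("geometry must be valid", lambda geom: geom.is_valid),
    ("area must exceed 1 m2", lambda geom: geom.area > 1.0),
]

features = {
    "plot-1": Polygon([(0, 0), (10, 0), (10, 10), (0, 10)]),
    "plot-2": Polygon([(0, 0), (2, 2), (2, 0), (0, 2)]),   # self-intersecting bow-tie
}

for fid, geom in features.items():
    for name, check in rules:
        if not check(geom):
            print(f"{fid}: failed rule '{name}'")           # pinpoint the failing feature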
Start-ups Presentations
One of the interesting features of each European User Conference is the inclusion of up-and-coming start-up companies. What3words is a global grid of 57 trillion 3m x 3m squares. Each square has a three-word address that can be communicated quickly, easily and with no ambiguity. It’s a geocoding system for the simple communication of precise locations, and it encodes geographic co-ordinates into three dictionary words. What3words is different from other alphanumeric location systems and GPS coordinates in that it doesn’t display long strings of numbers or random letters or numbers. What3words has an iOS app, an Android app, a website and an API that enables bi-directional conversion between what3words addresses and latitude/longitude co-ordinates. It adds a level of specificity to postcodes and even places that don’t have an address at all. The main claimed advantages of what3words are memorability and the unambiguous nature of words for most everyday and non-technical uses. In addition to English, what3words is available in eight other languages: French, Russian, Spanish, German, Portuguese, Swedish, Turkish and Swahili.

One could view the Red Bull keynote as another ‘fun component’ interspersing a serious conference, but Esri watchers recognised Johnson Kosgei who, before joining Red Bull, applied his expertise in GIS technology within The Greenbelt Movement, which was founded by Nobel Laureate Professor Wangari Maathai. His current role leading the Enterprise Location Analytics center of excellence at Red Bull was recognized with a Special Achievements Award by Esri's president, Jack Dangermond, in 2013.

The last
In the friendly surroundings of Salzburg it became apparent that this was to be the last European User Conference as we know it. No official statements were made, but various sources spread the word that in coming years, the emphasis will be on annual national conferences located in individual European countries. Another possible new direction might be seen in this year’s adjacent Geodesign Summit, also in Salzburg during the same week in October. The same sources that announced the end of the European User Conferences mentioned a future ‘Transportation Summit’ as the first in a series of new initiatives to cater to the ArcGIS community.

For more information, have a look at: www.esri.com/events/euc.
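The figures quoted for what3words above are easy to sanity-check; the word-list size used below is an assumption for illustration, not the company's actual dictionary.

# Earth's surface divided into 3 m x 3 m squares
earth_surface_km2 = 510e6
cells = earth_surface_km2 * 1e6 / (3 * 3)
print(f"{cells:.1e} squares")                  # about 5.7e13, i.e. 57 trillion

# Three words drawn from an assumed 40,000-word dictionary
dictionary_size = 40_000
print(f"{dictionary_size**3:.1e} addresses")   # 6.4e13, enough to cover every square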
A Multidisciplinary Framework
Geodesign Summit Salzburg
Some of the biggest names in the industry gathered for the
European Geodesign Summit in Salzburg. From left to right:
Professor Henk Scholten (VU, Geodan, UNIGIS), Jack
Dangermond, CEO and founder of GIS vendor Esri and Professor
Carl Steinitz, godfather of the Geodesign framework.
By Remco Takken
In short, Geodesign allows GIS to integrate into the planning process. In order to ensure this
is done in a successful way, a framework is needed and, most importantly, the willingness of
those involved to collaborate with other disciplines; it is important to discuss and to listen.
During the Geodesign Summit Europe, which was held in
Salzburg, Austria on October 12th and 13th 2015, the
broad use of Geodesign principles in practice was the
prevailing theme. For instance, Geodesign and geoinformation play a very important role in improving Crisis
Management. Tamara Ivelja, who fortunately remained fully intact
after working in middle European minefields, assessed the destructive
impact of natural disaster on existing minefields. Another interesting
example was presented by GIS Specialist Gijs van den Dool. He
showed how insurance losses in forestry are being predicted by
CoreLogic’s European Wind Storm Model and, combined with land
coverage data, tree types, elevation models, historic storms and
exposure risks, can be used to create a ‘forest module’ for insuring
forests. John Steenbruggen, PhD, of the Dutch Ministry of
Infrastructure, is looking at ‘neural network prediction for traffic incidents’. By linking typical delta water issues with problems related to
traffic, he explained the distinction between ‘blue and black spots’ in
road accidents in Holland. Having listened to these accounts, I was
left wondering if this really is Geodesign, or is it ‘just another cool
GIS application-in-a-new-market’. However different in nature and
however bland or mundane the geographic component might seem,
the broad outline for true Geodesign is that at least some ‘future
looking statement’ should be extractable from an array of (non geo)
data, analysis and scenarios.
A Framework for Geodesign
Carl Steinitz is the godfather of Geodesign principles. His book,
which is entitled ‘A Framework for Geodesign’, although complex
and all-encompassing in nature, addresses just four entities: information technologies, design professions, geographic sciences and ‘the
people of the place’. It’s a complex and dynamic undertaking. “We
are designing change in many systems, which are interacting in
space and time”, says Carl Steinitz. During a ‘real life’ project, Steinitz requests that its participants ‘keep it real’: “It’s about diagrams and simple models. I know I can do this in City Engine, but I
need the design first.” The need for a thorough theoretical basis is
also addressed in a surprisingly down-to-earth manner: “A framework protects you from stupidity twice”.

The Geodesign Summit encompasses multidisciplinary and collaborative efforts in this specialised field. Professor Joseph Ströbl of Salzburg raised the appropriate one-liner ‘No More Silos’.

Keynote speaker Arancha Munoz-Criado successfully addressed metropolitan planning issues versus green policies in the Valencia region in Spain.

Throughout the summit, it was easy to spot Steinitz’s pupils: anyone who put the ‘data, analysis, design, sketch, evaluate: decision’ mantra in his or her PowerPoint presentation had chewed on at least some chunks of his framework.
Mostafa Elfouly, a researcher from München, is one of them. He
namechecks GML and City Modeller, but his presentation was on
‘Smart Growth’. He showed comprehensive scenarios and alternatives, whilst including indicators from heterogeneous domains. It was
apparent that there is a pictorial approach to Geodesign in
Mozambique. ‘Infill Analysis’ is also an aspect of geodesign says
Julia Reisemann, especially the development of vacant land within
existing neighbourhoods. Sebastian Cadus analyzed the interrelation
of mobility and residential costs. A collaboration between a Geo
Data Strategist and a 3D artist resulted in CIM City of Gothenburg.
Esri’s take
Esri CEO Jack Dangermond needed to be in Salzburg for Esri’s
European User Conference, but the fact that he made time to give a
number of the Geodesign Summit presentations his undivided attention is significant. During his own keynote speech, which was reminiscent of his talk in San Diego at the User Conference last summer,
he remarked: “Geography and Geodesign are now more important
than ever, providing the content and context for understanding our
world”.
He also explained how Geodesign applies geography, not only in
geographic knowledge, but also in the design process and in collaboration. Dangermond envisions geo-ICT tools such as GeoPlanner
and City Engine as ‘discipline independent tools and methods’ to be
employed in the phases of sketching, designing and evaluating future
(landscape) plans. A familiar phrase popped up; this time in the context of Geodesign: “GIS evolved from a ‘system of records’ to an
analytic tool and, more recently, became a ‘system of engagement’,
thanks to the web”. Whether through apps, the use of distributed services, a portal or a server or by integrating and simplifying, web GIS
will be ‘changing everything’, according to Dangermond.
Minecraft and citizen engagement
During the very first opening statement of the European Geodesign
Summit, Professor and co-organiser Henk Scholten mentioned the 3D
game Minecraft as ‘a geodesign tool for kids, i.e. our future spatial planners’. His colleague Eduardo Dias, PhD is well aware of the role of
gamification in spatial planning and Geodesign. Dias elaborated on
While estimating visibility of existing landmarks for urban planning, Florian
Albrecht surprisingly took a step back from IT tools and pondered on the
touch and feel of visibility. He sent a self-made ‘visibility study’ of paper
and wire around the audience.
‘The Geocraft Recipe’: getting the real world in 3D into the Minecraft
game. He asked himself why seventy million children all around the
world would be playing the ‘sandbox game’, MineCraft. Could this
game become the new GIS for the next generation? Or, would
Minecraft just be helpful in teaching them GIS and spatial thinking? Dias
explains the nationwide coverage of Holland in Minecraft. This was
achieved by data collection (elevation, land use and buildings), the preparation of this data (FME, ArcGIS, PostGIS, Clip, Mosaic, Interpolate)
and its conversion (using Python, pymclevel). The process resulted in
286,502 MineCraft tiles, equalling 1.7 terabytes. Using 240 cores, the computing was finished in two days and 21 hours. And then something happened… Within Minecraft, the geo developers had disabled potentially harmful functionalities, but nevertheless, they were immediately hacked by a group of whippersnappers who rendered the dataset completely useless within fifteen minutes of the official launch. The heavy
emphasis on Dutch 3D and Minecraft developments during the
European Geodesign Summit resulted in some mildly ventilated irritation
by Swedish and Irish speakers, who believed that similar milestones had
been reached in their countries. Ulf Mansson of Swedish engineering
firm SWECO showed how the Minecraft game skyrocketed in Sweden
after mainstream media announced the inclusion of Swedish geodata.
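The conversion step of the 'Geocraft recipe' boils down to turning a gridded elevation model into integer block heights. The toy function below shows only that resampling idea with invented numbers; the real pipeline used FME, ArcGIS and PostGIS for preparation and pymclevel to write the Minecraft world files.

import numpy as np

def dem_to_block_heights(dem, sea_level=0.0, max_height=255):
    """Map metres above sea level onto a limited vertical block range."""
    return np.clip(np.round(dem - sea_level), 0, max_height).astype(np.uint8)

dem = np.array([[1.2, 2.7, 3.1],
                [0.4, 1.9, 2.5],
                [0.0, 0.8, 1.6]])      # a tiny 3 x 3 elevation patch in metres
print(dem_to_block_heights(dem))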
Future of Geodesign Summit
In Salzburg, many GIS specialists who visited the city for the other
Esri event found their way to the European Geodesign Summit.
Unfortunately the turnout for the latter was meagre, with only about a
hundred attendees in total. A drastic change will be needed in order
to maintain the innovative promise which the Geodesign principle
still holds. Architect Áine Ryan made an ironic but significant remark
just before she started her presentation. She pointed at a caricature
of a designer, shown earlier at the conference, portrayed in full glory
with artistic accessories, including a tie and glasses. Ryan quipped:
“Remember that person? Well, I am a designer just like that”.
Geodesign now needs to create a true link with other disciplines. This
needn’t mean a huge change for the organisation behind the summit.
There are plenty of important conferences on land use, landscape
architecture and spatial planning for fertile collaborations. The stories
are there, the theories are there, the technology is there and the core
community is there.
For more information, have a look at: http://geodesignsummit.com
Studying Earthquake Behavior in Nepal
Urgent Action in the Himalaya
By John Stenmark
Following April’s major earthquake in Nepal, massive efforts focused on providing aid to
victims and survivors. Working behind the scenes, a small team of scientists scrambled
to secure perishable data that could both explain how the quake occurred and help people
prepare for the next one.
Figure 1: A team working on a GNSS campaign station in Nepal.
Steep mountain terrain and narrow valleys provided challenges in
selecting GNSS observation sites.
The trouble began 180 million years
ago. In southern Asia, relentless
movement of the Earth’s crustal plates caused the Oceanic Plate to
subduct beneath southern Tibet to
the north. The collision of the two plates,
which continues today, has buckled the crust
and pushed the rock up to create one of the
world’s great mountain ranges, the Himalaya. Today the Himalaya dominate the
landscape in northwest India, Nepal, Kashmir, Bhutan and southwestern China, including the former nation of Tibet.
It’s not a gentle collision. Riding on the
Oceanic Plate, the Indian subcontinent moves northward roughly 4 cm (1.6 in) each
year. Half of the motion is absorbed by the
Himalaya, pushing the mountains up. The
rest of the energy goes into squeezing the
rock along the boundary, or fault, between
the two plates. From time to time, the rock
ruptures to release the accumulated strain,
resulting in an earthquake. A major rupture
occurred on April 25, 2015, when the fault
broke in central Nepal, 15 km (9 mi) below
the surface and roughly 80 km (50 mi) northwest of Kathmandu. The resulting magnitude
7.8 earthquake killed more than 9,000
people, injured over 23,000 and damaged
or destroyed countless buildings and houses.
The April event was not the first quake to strike Nepal, and it unfortunately won’t be the
last. The behavior of the quake—and the
ways in which it could be studied—opens
the door for new understanding of future
earthquakes in the region.
The Well-measured Fault
Earthquakes in Nepal are neither predictable nor unexpected, says University of
Colorado geophysicist Dr. Roger Bilham.
The earliest earthquake in Nepal recorded
by humans occurred in 1255. Since then,
there have been at least ten quakes of magnitude 6.3 or more. Scientists note that the
region’s earthquakes appear to occur at
roughly consistent intervals with similar locations and behavior. Bilham explained that a
magnitude 8.4 quake in 1934 appears to
be a repeat of the 1255 event—the two
quakes occurred in the same location and
produced significant surface ruptures and
extensive damage. Likewise, the 2015 event
(known as the Gorkha quake) is very similar
to one that occurred in 1833. That quake,
estimated as magnitude 7.7, originated at
the same location as the Gorkha epicenter and produced 3.5 m (11.5 ft) of slip, matching the slip of April’s magnitude 7.8 Gorkha event.

Figure 2: A GNSS station in Nepal. GNSS captured slow tectonic motion over years as well as rapid motion during the Gorkha quake.
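A back-of-the-envelope calculation shows why those numbers hang together. Taking the figures quoted in this article (4 cm of convergence per year, half of it stored as strain on the fault, and 3.5 m of slip per event) gives a recurrence interval close to the 182 years that separate the 1833 and 2015 quakes. This is only an illustration, not the researchers' actual analysis.

convergence_cm_per_yr = 4.0      # northward motion of the Indian subcontinent
strain_fraction = 0.5            # half the motion accumulates as strain on the fault
slip_m = 3.5                     # slip released in the 1833 and 2015 events

loading_rate_m_per_yr = convergence_cm_per_yr * strain_fraction / 100
recurrence_years = slip_m / loading_rate_m_per_yr
print(f"~{recurrence_years:.0f} years to re-accumulate 3.5 m of slip")   # ~175 years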
To make their analyses, Bilham and other
scientists rely on arrays of sensors to capture
data on the motion of the crust. Since the
1990s, a network of more than two dozen
GPS continuously operating reference stations (CORS) has collected data on plate
motion in Nepal. “Everything we know
about historical quakes comes from damaged buildings or evidence of surface ruptures,” Bilham said. “Today we have GPS and
other ways to measure vibration and displacement.” Seismic sensors can detect subtle motion and are very good at capturing
relatively small movements at high frequencies, but deriving accurate displacement
from acceleration data is an inexact science. Additionally, seismic sensors can become saturated by larger movement such as
experienced in a great earthquake. By using
GPS to directly measure displacement of
centimeters and more, researchers have
complementary sensors that provide a more
complete picture of plate motion and the
effects of earthquakes. In ideal cases, seismic sensors are collocated with GPS stations.
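Why deriving displacement from acceleration alone is "an inexact science" can be shown with a toy calculation: a tiny constant bias in an accelerometer, integrated twice, grows quadratically into metres of apparent displacement. The numbers below are illustrative assumptions, not data from the Nepal network.

import numpy as np

dt = 0.01                                  # 100 Hz accelerometer
t = np.arange(0, 60, dt)                   # one minute of record
bias = 0.001                               # 1 mm/s^2 constant sensor bias
measured = np.zeros_like(t) + bias         # station is actually stationary

velocity = np.cumsum(measured) * dt        # first integration
displacement = np.cumsum(velocity) * dt    # second integration
print(f"Apparent drift after 60 s: {displacement[-1]:.1f} m")   # roughly 1.8 m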
While it’s common to have GPS and seismic
networks in earthquake-prone areas, the
GPS network in Nepal provides unique
advantages in measuring the effects of earthquakes along subduction faults. Bilham pointed to recent strong quakes in Japan, Chile
and Sumatra. The fault ruptures in those
areas occurred along coastlines where it’s
not possible to use GPS to measure motion
on both sides of the fault. But in landlocked
Nepal, GPS sensors on the Indian and
Asian plates could precisely measure the
motion of the quake.
To provide a complete picture of the displacement, the GPS receivers in Nepal capture and store data at multiple recording
rates. Data collected at 15-second intervals
provides information on the normal, slow
plate motion over months and years. The
receivers in Nepal also captured high-rate
data five times per second (5Hz), which
could provide a detailed picture of the shaking during the quake itself. But when the
Gorkha quake struck and the data was
urgently needed, landslides and damage
made retrieving the data nearly impossible.
The person who could do it was on the other
side of the planet.
High-Pressure Data Recovery
If you are looking for someone to visit
remote sites to work with sophisticated electronics under difficult conditions, then John
Galetzka is your guy. Trained as a U.S.
Army Ranger and equipped with a bachelor’s degree in geosciences, Galetzka has
set up GPS networks around the world.
Figure 3: A multinational team sets up a campaign station following the Gorkha quake. Sites were selected based on visibility to the sky, security and accessibility.
During a stint with the U.S. Geological
Survey (USGS), Galetzka played a key role
in building the Southern California
Integrated GPS Network (SCIGN). He then
joined the California Institute of Technology
(Caltech) Tectonics Observatory, where he
worked on GPS networks in Sumatra,
Nepal, Chile and Peru. Working in Nepal
over a 10-year period, Galetzka had set up
28 GPS stations for Caltech and a 29th station shared with a French research agency.
(Roughly 20 additional GPS CORS in Nepal
are managed by other agencies. All the
CORS are in collaboration with the Nepal
Department of Mines and Geology.)
In 2013, Galetzka knew that funding for his job with Caltech would soon end. He spent the year installing fresh batteries and modernizing Caltech’s GPS stations in Nepal. (The GPS equipment at the stations consisted of Trimble NetRS, NetR8 and NetR9 reference station receivers.) With permission from the Nepalese government, Galetzka installed cellular modems to push GPS data to FTP servers in the U.S. The 15-second data could be sent via cellular modem, but the volume of 5Hz data was simply too much for the limited bandwidth of cellular connections. So Galetzka configured the built-in storage of the Trimble receivers to store several weeks of 5Hz data. The remainder of the receiver’s memory would store the 15-second observations in case the cellular connection went down. “In the event of an earthquake, we could use the 15-second data to look at the motion over several weeks or months,” Galetzka explained. “But the 5Hz data accumulates very rapidly. If not downloaded soon after an earthquake, the data captured during the quake can be overwritten by newer data and the really important data is lost.”
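A quick order-of-magnitude calculation explains why the 5Hz data was so perishable: the record size and reserved memory below are assumptions for illustration (actual sizes depend on the receiver and the number of tracked satellites), but they show how fast a high-rate buffer wraps around.

BYTES_PER_EPOCH = 600          # assumed size of one high-rate observation record
RATE_HZ = 5
memory_gb = 4                  # assumed storage reserved for 5Hz data

seconds_until_full = memory_gb * 1e9 / (BYTES_PER_EPOCH * RATE_HZ)
print(f"Buffer wraps after ~{seconds_until_full / 86400:.0f} days")   # roughly two weeks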
In the two years since Galetzka had last visited the CORS sites, the cellular connections
to many stations had gone silent. At the time
of the quake, only nine stations were sending data. The status of the others was unknown. In addition to saving the 5Hz data
from being overwritten, scientists needed to
retrieve the 15-second observations to provide a complete picture of the fault’s behavior.
When the Gorkha earthquake struck,
Galetzka was in Mexico on assignment from
his current employer, UNAVCO, a non-profit
consortium that facilitates geoscience
research and education using geodesy. “I
looked at my phone and saw all these small
earthquakes in Nepal,” Galetzka recalled.
“I scrolled down and finally found the big,
main shock. At that time I think the USGS
called it a 7.9. I woke my colleague, Luis
Salazar, and we were in shock at the size of
the earthquake. After thinking about it for a
few moments I told Luis, ‘I’ve got to go to Nepal.’” Four days later, Galetzka arrived
in Kathmandu.
While Galetzka worked his way to Nepal,
global relief efforts got underway. Nations
around the world sent rescue crews, medical
supplies, food and shelter to the stricken
region. In addition to humanitarian needs,
the scientific community began to organize
people and equipment to assist in securing
important geophysical data. One of the most
valuable responses came from Trimble,
which provided funding for helicopter time
needed to access remote GPS stations. The
company also donated seven Trimble NetR9
GNSS reference station receivers to replace
old or damaged equipment and perform
post-seismic monitoring. Trimble’s Mike
O’Grady, who had extensive experience in
Asia, hand carried the equipment to
Kathmandu and assisted in the effort.
In the days immediately after the quake,
science took a back seat to human needs.
Private and military helicopters were kept
busy on humanitarian missions. Thousands
of people were living under tents and tarps,
not because their homes collapsed, but out
of fear of another quake. The fears were
exacerbated with an aftershock on May 12,
a magnitude 7.3 tremor northeast of
Kathmandu that killed more than 200 people and set off additional smaller shocks.
Galetzka was pummeled by questions from
friends and people on the street: "What’s
going to happen today? Are we going to
get more earthquakes, more aftershocks? Is
the big one coming, what does the GPS
data show?"
Galetzka’s first few days in Nepal were a
blur. A typical day started around two or
three in the morning. “I remember waking
up because of jet lag,” he said, “but then
just because of sheer excitement. An aftershock might wake you up and then it’s just
impossible to get back to sleep. You’re thinking about what needs to be done.”
The rest of the day would be spent planning
for the next day or the next week, getting
people lined up to do vehicle missions to
download data where possible or helping
other projects. For example, Galetzka was
one of the few people familiar with a USGS
strong-motion accelerometer installed at the
American Club operated by the U.S. embassy. He was able to retrieve the data from
that instrument, which proved important in
analyzing shaking in Kathmandu. Others,
including Bilham, conducted damage assessments, looked for surface evidence related to the quake and helped to recover the GPS data.
The view from the helicopters was striking.
“In the rural areas, the damage to the villages was incredible,” O’Grady said. “Clay
and mud houses had collapsed. Most casualties occurred in the mountain villages.”
Helicopter missions to GPS stations often
included delivering food, medical supplies
and tents. “The pilot knew the area and
would land in places that needed help,”
O’Grady said. “We would unload the relief
aid and then go on to the GPS points.”
When the teams reached a GPS site, they
found differing degrees of damage, but the
integrity of the GPS data was consistently
good. “The receivers got knocked around a
bit, but none went down due to the quake,”
said Galetzka. “There’s no evidence that the
earthquake knocked out a receiver or somehow damaged an antenna. The short-braced
monuments that we used are really solid and
worked well in accurately measuring how the
earth is moving. Overall, the network produced excellent results. There were stations
directly over the fault rupture. We’ve never
before seen or captured data like this.”
As soon as a receiver’s data was recovered, it was processed for initial analysis.
Bilham said the quake started in the north,
where deep underground the fault slipped
as much as 5 to 6 meters (16 to 20 feet). As
the rock released the accumulated strain, the
quake ran out of steam. By the time it
reached Kathmandu, the slip had decreased
to centimeter levels. Galetzka used the information to refine his strategy for recovering
the GPS data. Because the quake had minimal motion in the western part of the country, he could give those GPS stations a lower
priority.
Unexpected Results
The data from GPS and seismic sensors were put to work examining the quake behavior and effects. Analysis by Gavin Hayes at USGS used strong motion accelerometer and 5Hz GPS data to determine that in less than 5 seconds the Kathmandu valley heaved upwards by 60 cm (2 ft) and moved southwest by 1.5 m (5 ft) at velocities of up to 50 cm/s (1.6 ft/sec). In the following 60 seconds valley sediments oscillated laterally at 4-second periods with 20-50 cm amplitude (0.6 to 1.6 ft). The shaking created fissured ground near the airport. Video captured during the quake shows pedestrians struggling to remain standing. Hayes’ analysis revealed that surfaces horizontal prior to the quake are now tilted down to the southwest, but by less than 1 degree. Bilham noted the runway at Kathmandu’s airport lifted roughly 50 cm (1.6 ft) and tilted by 12 cm (0.4 ft).
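Figures like these come from differencing high-rate GPS position estimates over time. The following minimal sketch shows the principle on a synthetic 5Hz displacement series; the numbers are made up to mimic a smooth 1.5 m offset and are not the Gorkha observations.

import numpy as np

dt = 0.2                                 # 5Hz sampling interval in seconds
t = np.arange(0, 60, dt)                 # one minute of epochs
# Synthetic, smooth 1.5 m co-seismic offset; illustrative only.
displacement = 1.5 * (1 - np.exp(-t / 5.0))

# Ground velocity is the first difference of displacement over time.
velocity = np.diff(displacement) / dt

print("Final offset:  %.2f m" % displacement[-1])
print("Peak velocity: %.0f cm/s" % (velocity.max() * 100))

With real 5Hz positions, the same kind of differencing applied station by station underlies velocity estimates of this sort.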
Early each day, Galetzka and his colleagues would go to the airport to check aircraft status. When a helicopter was not occupied with humanitarian work, they could use it to go out for a few hours to visit a station and download data. If the station was not telemetering data, they would troubleshoot and get it back online. If they couldn’t get the receiver to respond, they simply swapped it for a new one and took it back to Kathmandu where O’Grady, working in a space provided by the Toyota dealer, could recover the data and update the receiver to make it available for the next mission.
Figure 4: A building in Kathmandu damaged by the Gorkha quake. Damage from shaking in the city was less severe than expected.
Scientists were surprised by one aspect of
the Gorkha quake.
According to USGS geophysicist Dr. Ken
Hudnut, the shaking in Kathmandu was not
as violent and severe as what would be
expected based on the large amount of
strain released in this fault rupture. Given the
energy released in the quake and typical
construction practices, damage to most buildings in the city was surprisingly light.
Hudnut said that more work is needed to
understand the surface motion associated
with the quake and how the movement of
smooth flat fault systems can translate to
motions at the surface. He said it’s also
important to know if the Gorkha quake put
additional stress on other faults in the area,
which could influence the occurrence of future
earthquakes.
Efforts on the ground by Galetzka and others
will help provide the information Hudnut
described. Because the GPS equipment at
the existing stations was largely undamaged,
teams could use the receivers donated by
Trimble to establish several new monitoring
sites. Galetzka said that the new GNSS-capable equipment allowed stations to be
located in places previously difficult for GPS
alone. “We couldn’t get on mountain tops,”
he explained. “So we were forced to put stations in some very deep valleys. With GNSS
capability, tracking not just GPS but also
GLONASS, Galileo and BeiDou, we can
track as many satellites as possible while still
being in a deep valley. It should increase the
data quality coming out of those stations.”
Galetzka added that a team from
Cambridge University is now working to
install seismic sensors at many CORS sites.
The GPS and GNSS stations are also providing benefits for Nepal’s surveying and engineering communities. Prior to the quake, the
nation’s geodetic framework was made up
of control points based on conventional surveying. The surface displacements of the
quake rendered all of the existing marks useless. Surveyors can use data from the CORS
to remeasure the marks and establish new
coordinates tied directly to the global reference frame.
Nepal’s GPS network continues to monitor
tectonic motion. The GPS data enables
researchers to model the strain accumulating
along the plate boundaries and estimate the
strength of upcoming quakes. Emphasizing
that the timing of quakes can’t be predicted,
Bilham focused on increasing understanding
of the accumulating strain. “This was a good
dry run for future, larger quakes,” he said. “It
affected a small part of the Himalaya and
drove home the need for residents to strengthen their
homes and buildings. It was not the worst
that could have happened, but it is the worst
that will likely happen for a couple of decades.” Geophysicists can use information from
the Gorkha quake to advise local authorities
on the need for good building practices to
mitigate future damage and loss of life.
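In practice, much of this monitoring reduces to fitting long-term trends to daily CORS position time series. The sketch below is a generic illustration with made-up numbers, not data from the Nepal network.

import numpy as np

rng = np.random.default_rng(0)
days = np.arange(3 * 365)                        # three years of daily solutions
# Synthetic north-component series: steady drift plus 3 mm daily scatter.
north_m = (0.04 / 365.0) * days + rng.normal(0.0, 0.003, days.size)

# A least-squares linear fit gives the station velocity.
slope_m_per_day, intercept = np.polyfit(days, north_m, 1)
print("Estimated velocity: %.1f mm/yr" % (slope_m_per_day * 365.0 * 1000.0))

Station velocities of a few centimetres per year, accumulated over decades and compared across a network, are the kind of input the strain models rely on.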
Galetzka agrees. “Even in the face of this
tragedy, I think Kathmandu dodged a huge
bullet,” he said, “I believe people realize
that. There’s a lot of tectonic energy still
remaining in that part of Nepal; it wasn’t
completely released in this earthquake. So
for me it was urgent to understand what the
earth did and what this means for the future of the earthquake hazard in Nepal.”
For more information, have a look at www.trimble.com.
Figure 5: The Himalaya Mountains loom behind a CORS site north of Kathmandu. Fencing protects the GNSS equipment from damage by livestock.
COLUMN
Our GIS Emphasis Should be
Business Outcomes not Maps
It’s time again to revisit maps. In the surprisingly popular blog post I recently wrote, called “Please stop calling me the mapping guy”, I took exception to being pigeon-holed. I (we) are far more valuable to organizations than simply being seen as producers of maps. We are solution providers. Maps are simply a key output from our work: intuitive and easy to understand. In this column I will broaden the conversation.
What are business outcomes? There are five key
components:
1. Profitable growth
2. Customer engagement
3. Business sustainability
4. Productivity
5. Business agility
Five words: growth, engagement, sustainability, productivity, agility. Technology is a key driver behind business outcomes. Innovative organizations are looking for competitive advantages: how to do things better, more efficiently, faster. Business outcomes are not applicable just to private companies; all organizations are focused on improvement.
GIS should be one of the core technology drivers behind business outcomes. It should be a mission-critical business system. And yet adoption remains slow. Why?
GIS: Still a Misunderstood Technology
Fundamentally, GIS remains misunderstood. And in large part it is our fault.
Think about how successful any business analytics organization or individual might be if conversations were led with: “I (we) provide charts and spreadsheets to improve your business”. Charts and spreadsheets!
No business professional leads conversations like this. It makes no sense. But that is just what we do with GIS. We lead our conversations emphasizing maps. The terms map and location mean little to business leaders looking to solve problems. Maps remain linked with discovery and routing: how to get from A to B. Similarly, terms like location intelligence and location analytics are poorly understood. At WebMapSolutions, in so many words, we often hear: what do maps or location have to do with our business outcomes?
Matt Sheehan is Principal and Senior Developer at WebMapSolutions. The company builds location-focused mobile applications for GIS, mapping and location-based services (LBS). Matt can be reached at [email protected].
Start with the Problem
The bottom line is that we need to start with the problem. We need to be focused on business outcomes, or on linking a problem with an outcome solved by GIS technology. Let me say that again loudly:
We need to link business problems with successful outcomes solved by GIS
Let’s provide an example here. In commercial real estate we have a client whose process for collecting new properties for sale and listing these properties on their website took days. Time is money, as they say, and our client wanted to automate this process. Using Collector for ArcGIS, Geoforms and some clever scripting, this process now takes hours. Rich, accurate property data is now collected by agents quickly. Sure, maps are part of the listing output, but our client conversations were centred on their vision: allowing agents to spend more time selling!
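As an illustration of the kind of glue scripting such an automation might involve, the sketch below pulls newly collected features from a hosted feature layer’s standard REST query endpoint and writes a simple HTML listing. The URL, field names and status filter are hypothetical placeholders, not the client’s actual system.

import requests

# Hypothetical hosted feature layer fed by field data collection;
# the URL and field names are placeholders, not a real customer system.
LAYER_URL = "https://services.example.com/arcgis/rest/services/Listings/FeatureServer/0"
params = {"where": "Status = 'New'", "outFields": "Address,Price,Bedrooms", "f": "json"}

response = requests.get(LAYER_URL + "/query", params=params, timeout=30)
features = response.json()["features"]

# Turn the freshly collected properties into a minimal HTML fragment for the website.
rows = []
for feature in features:
    attrs = feature["attributes"]
    rows.append("<li>%s - %s bed - $%s</li>" % (attrs["Address"], attrs["Bedrooms"], attrs["Price"]))

with open("new_listings.html", "w") as out:
    out.write("<ul>\n" + "\n".join(rows) + "\n</ul>\n")

The point is not the particular API but the shape of the solution: field collection feeds a service, a small script turns the service into web content, and agents never have to touch a desktop GIS.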
GIS is changing, yet the language we use to describe the power of GIS remains the same. We have to get away from using terms like map and location. Sure, they are at the core of the technology, but they are poorly understood and carry past associations. We need to be looking for problems which are inherently location-based, and be focused in our conversations on the higher-level solution using GIS, thinking carefully about how we express that solution. What are an organization’s pain points and vision? That will demand new thinking and a new approach on our part.
To be able to demonstrate how important GIS can be to any organization, we need to be able to show value. That means moving conversations away from the technology and towards solving real problems.
An Interview with Mapbox CEO Eric Gundersen
Location and Interoperability
Interview
By Eric van Rees
After a large capital investment of 52.55 million dollars earlier this year, open source API company Mapbox announced its intention to scale up in a big way. Its technology has been adopted by many companies, such as Esri, which recently embraced the Mapbox vector tile spec, enabling faster web data visualization. Mapbox CEO Eric Gundersen comments on the recent developments and explains what the company has in store for the mapping world.
Introduction
IT company Mapbox is named after its mapping platform, which offers developers the building blocks to integrate location into any mobile or online application. The platform is the foundation for other platforms, which allows businesses to analyze their data, whether it’s drone data, satellite imagery or real estate sites visualizing their properties. They are an open source company, meaning that their products are built with open source parts.
Gundersen explains the idea behind Mapbox: “We’re an API and a data company, with a focus on developers, so what we do is business-to-developers. We make it easy to grab the exact chunks of geo and put them inside your application. We want to give people the Lego blocks which will enable them to tack together the exact apps they want. You can compare us to Stripe, which is a pure API company for enterprise.”
The location component that Mapbox offers to enterprises is a full map of the world that has been compiled from dozens of different sources, including imagery, terrain, streets and road data. In just a few years’ time, the company has become one of the biggest providers of custom online maps for major websites. Gundersen acknowledged that it’s necessary to invest in a good base map as part of a platform for other people to layer their own data on top: “We’ve got the ability to create a level of customization for mapping, search and directions that developers can put right into their website. So when you go to Foursquare and look for a coffee shop, that’s been created by putting that info on top of our map. I think it’s important to let people customize a base map so it fits their brand and is presenting the data the exact way they want to.”
Mapbox CEO Eric Gundersen
Vector tile specification
Mapbox is also the creator of, or at the very least a significant contributor to, many popular open source mapping libraries and applications, including the MBTiles specification, the TileMill cartography IDE, the Leaflet JavaScript library, the CartoCSS map styling language and parser, and the mapbox.js JavaScript library.
The company gained a lot of attention recently when GIS software company Esri announced that it intended to adopt the Mapbox vector tile spec, instead of building a new interface specification. Vector tiles are the vector data equivalent of image tiles for web mapping and make for a highly performant format that provides greater flexibility in terms of styling, output format and interactivity. The reason that Esri chose to adopt the Mapbox vector tile spec is that it has become a community-supported standard and makes use of the latest technology to meet increased user expectations for web mapping and mapping on mobile devices.
Gundersen is enthusiastic about what’s going to happen next: “this is taking video game technology in order to be able to render out a map at 60 frames per second. This is the future face of how we’re going to look at real-time data. It’s about being able to visualize massive amounts of data, whilst being highly efficient on mobile and in real-time. What’s cool about this is that different technologies are going to be able to talk to each other, creating real interoperability.”
Scaling up
Last July, Mapbox announced its plans for the future after receiving a large capital investment. The idea is to scale things up, not only the company itself but also its product range, extending it to the whole mapping stack. Gundersen explains how the company is working hard on improving its own product range: “firstly, you’re going to see us radically improving our geocoding. Secondly, we will continue to make improvements on the Directions API, which provides worldwide driving, walking and cycling routes with turn-by-turn directions. We’ll also be expanding that team. And thirdly, our data processing pipeline for imagery is much better than before. Lastly, there’s Mapbox Studio that will be a game-changer for working with vector data.”
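For readers who want to see what sits underneath both image and vector tiles, the sketch below shows the standard Web Mercator z/x/y addressing used by tiled web maps. It is generic, illustrative code, not Mapbox’s implementation of the vector tile spec.

import math

def lonlat_to_tile(lon, lat, zoom):
    """Return the x/y tile indices containing a WGS84 point at a given
    zoom level in the standard Web Mercator (slippy map) tiling scheme."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# Which zoom-12 tile covers Kathmandu?
print(lonlat_to_tile(85.32, 27.71, 12))

A vector tile simply packs the geometry and attributes falling inside such a tile into a compact binary payload, which is what lets a client restyle and re-render the data on the fly.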
Since this interview, the company has made improvements to geocoding, directions, and now has better
imagery and powers DigitalGlobe’s new Vivid offering, speaking to the robust processing pipeline.
Finally, Mapbox Studio has been released in private beta.
For more information, have a look at www.mapbox.com.
CLGE newsletter
Henning Elmstrøm looks back!
Henning Elmstrøm, CLGE President from 2005 till 2010, retired from our Council on 26th September during the
Moscow General Assembly. He received a standing ovation for his work with the Council. This interview is a welcome opportunity to allow him to outline some of his achievements whilst in office.
Muiris de Buitléir
Henning, you can look back over a long career with CLGE.
Please tell us what were the highlights for you?
For many observers the history of CLGE goes back to 1962, when a group was formed to follow up the construction of what would become the European Union. This group was a special commission within FIG. However, the history of the Council really began in 1972 concomitantly with an extension of the European Community. The challenge for CLGE cooperation with Europe was obvious, but the process proceeded at a very slow pace for more than 25 years.
In 1999 the Danish Surveying Associations proposed me as their representative to CLGE. Three years later I was invited to join the executive board and become vice president together with the late Klaus Rürup, who, at that time, took over the presidency from Paddy Prendergast. After four years of vice-presidency, I became the successor to Klaus Rürup for two consecutive terms of office. My second term was somewhat longer than the usual two years, because of the work involved in facilitating the integration of CLGE and Geometer Europas, the association representing publicly appointed private geodetic surveyors in Europe.
Handover ceremony between Klaus Rürup and Henning Elmstrøm
What progress did CLGE make during your terms as vice-president and president?
Before accepting the invitation to join the CLGE executive board, I asked that some changes in the governance of CLGE be considered. It was important to encourage new growth.
A working plan was drafted during an inaugural meeting on “Das Feuerschiff” in Hamburg. At that time CLGE was a rather tiny organisation. It was difficult to see its possibilities and to recognise what benefits it might hold for its members. The future of the Council was discussed, together with the late Volkmar Teetzmann and Hagen Graeff, respectively presidents of BDVI and DVW, as well as with some other German experts.
Henning Elmstrøm with the late Volkmar Teetzmann, a pillar of CLGE, in Bucharest
It was decided to expand the organisation, based on preparations put in
place by the immediate past-president, Paddy Prendergast. The working
plan was written and submitted for the approval of the General
Assembly. The goal was to pave the way for the long term actions to
come. The first plan was written for the election period 2003 – 2005. It
was partly based on insights gained from the CLGE Role Paper (2001),
the Strategy Paper (November 2002) and the new statutes, as adopted
in October 2003, during the London General Assembly.
At that time the General Assemblies lasted for two days, as they do now,
but there was no real reporting to the national associations. This had to
change. Following our talks, it became clear that the accent must be
placed on Professional Education, Professional Practice and European
Union relations.
Henning Elmstrøm, in the centre of the photograph, vice-president in Bratislava, 2004. To the left, president
Klaus Rürup (DE) and secretary general Gerda Schennach (AT). To the right vice-presidents Marc Wijngaarde
(NL) and Vaclav Slaboch (CZ)
In the period from 2001 to 2005 the role of CLGE was profoundly
changed to allow it to become an active professional organisation. The
General Assemblies were changed to include local presentations; lists of
seminars had to be planned. The idea was to represent us, not only in the
surveying arena, but also in associated areas, at the European level and
indeed worldwide.
A vice-presidency for European affairs was established. The relationship
with Geometer Europas was discussed and a possible merger was considered at that time.
In 2005, I was elected as president, while Gerda Schennach and René
Sonney were respectively secretary general and treasurer.
In December 2005 we organised a landmark one-day conference in
Brussels to show how our events should be structured in the future. The
title of the event was “Professional Qualifications for Geodetic
Surveyors”; a precursor of work to follow in the field of the European
Directive on the Recognition of Professional Qualifications.
In 2007, Jean-Yves Pirlot took over from Gerda Schennach as secretary
general during the Luxemburg General Assembly.
From then on we organised major conferences every two years with
smaller seminars in between. It started in 2008 in Strasbourg with the first
CLGE Conference of the European Surveyor; two years later we held our
CLGE Conference in Bucharest, which focussed on the publicly appointed cadastral surveyor, with a milestone intervention by the French
philosopher André Jacquard. In between we organised the Bergen
Seminar about boundaries. All these, and other events, led to a number
of interesting results that can be seen on the website in the ‘basics’ section
www.clge.eu/about_us/basics.
Due to the initiative of our Secretary General we experienced our first
involvement with sponsoring. This was only possible after considerable
efforts to expand our Council to become a credible partner.
Led by vice-president Rob Mahoney, we also modernised the CLGE
website, our window to the outside world.
How do you see the evolution since then?
Jean-Yves Pirlot, who supported me during my two terms of office, followed me as CLGE President. He had to consolidate the association after
its rapid growth. During my terms of office CLGE acquired eight new
members. This level of growth would have been difficult to maintain and
from 2010 until 2014, we got a further four new members and three
observing members (Bosnia Herzegovina, Montenegro and Ukraine).
During this period, the increased visibility of CLGE became apparent.
Our monthly participation in GeoInformatics, for instance, stems from this
time. The contacts with the European Commission increased too. A highlight of this cooperation was the organisation of the European Space
Expo, which was organised in Budapest during our general assembly,
with the help of the European GNSS Agency.
Up to 2005, CLGE was a rather introverted organisation. From 2005 to
2010 there was a period of enhancement, which allowed CLGE to
expand sufficiently to become a credible partner in the context of relations with the European Union.
My second presidency was from 2007 until 2010. As I said before, it
was a bit longer than the usual two years because we had to integrate
Geometer Europas. After a revision of the statutes, this independent
organisation became an essential and integrated part of CLGE. From
then on it would be known as the Interest Group of Publicly Appointed
and Regulated Liberal Surveyors or IG-PARLS. René Sonney, the long-time
and very trusted CLGE treasurer, would soon retire to be replaced by the
Geometer Europas’ treasurer, Dieter Seitz.
During this period we also started to take an active role in INTERGEO.
FIG had recently established the Young Surveyors Network, I think that
was in Stockholm in 2008. This network got a boost in Europe with the
first CLGE Students’ Meeting that we organised in 2009 in Karlsruhe. The
same year we had a landmark event with the solemn signature of the
European Code of Conduct for Surveyors in Rome, in September 2009.
Henning Elmstrøm signing the Code of Conduct, together with Fausto Savoldi, president of the Italian CNG
and Alain Gaudet (FR), president of Geometer Europas
The European Space Expo during the CLGE General Assembly in Budapest, 2013
CLGE also acquired an advisory role for impact studies launched by
the GSA. The annual budget was rapidly more than doubled, based
on an increased level of partnership with the private sector.
Workshops were introduced during the general assemblies and the
delegates became more and more conscious of the importance of
being part of these efforts.
The House of the European Surveyor and GeoInformation was inaugurated in October 2010 and has been improving every year since
then. The Day of the European Surveyor and the European Surveyor
of the Year became a reality. With the help of the American National
Society of Professional Surveyors it has now become an inspiration on
a worldwide level within FIG.
Other initiatives under my presidency should not be forgotten. The
matter of education, for instance, resulted in us becoming an active
partner in the European Leonardo Project: GeoSkills Plus. The goal of
this project was to bridge the gap between the needs of the labour
market and the young professionals produced by our Universities and
Technical High Schools. This was something I had anticipated during
my presidency.
In this period the tradition of CLGE conferences and seminars was
maintained. During the Hanover Conference in 2012, for instance,
the European real estate area label euREAL was solemnly launched. It
soon became part of European legislation via INSPIRE. Now it’s the
basis for the International Property Measurement Standard “IPMS”.
I am very proud of all these achievements.
What’s the future outlook, as you see it?
The future is in the hands of Maurice Barbieri and his team. They face
the challenge of preparing CLGE for a new expansion. New members are knocking at the door and hopefully they will join soon.
Maurice Barbieri must find a balance between our ambitions and the
available manpower. His proposal is to increase professionalization
with more responsibilities for individual delegates and I think that this
is a very wise and also a very necessary move. Another development
that I foresee is an increased cooperation with other European and
international organisations such as EuroGeographics, PCC, Eulis,
FIG, WPLA, IPMS, etc. I also think that in the long run a permanent
office will be required, but these would be normal developments as a
result of what we’ve built since 2000.
CLGE takes an active part in FIG
During the FIG Working Week in Sofia, Maurice Barbieri, CLGE President, was asked to give a keynote on “Global
and Regional Professional and Institutional Reforms, some actions of the CLGE”. He found his time in Sofia very
worthwhile and confirmed CLGE’s willingness and commitment to a close cooperation with FIG.
Jean-Yves Pirlot
After a general presentation from CLGE, including our strategy and some of our actions, Maurice Barbieri commented on the FIG statement as follows:
• In Europe, there is a strong focus on the mutual recognition of professional qualifications.
• The goal is to favour a strong internal market with sound competition based on the free movement of citizens, including professionals.
• CLGE agrees with this philosophy but we accept that the European
and National legislators must proceed with caution.
• Over-regulation is undesirable, but under-regulation can be harmful too.
• For cadastral surveying we are in favour of strong national regulation leading, in the best case, to the public appointment of surveyors. The achievement of this goal is being sought by CLGE’s interest group of publicly appointed and regulated liberal surveyors.
CLGE’s approach was then explained in more detail. Firstly, the new
Code of Professional Qualifications was presented. Together with our
Code of Conduct these provide us with a solid professional basis.
Reference was then made to the CLGE workshop on self-regulation
and co-regulation mechanisms. This was possible thanks to the cooperation of Jorge Pegado Liz from the European Economic and Social
Committee in Limassol, in March 2015, and the subsequent approval of
the position paper presented by Mr. Pegado Liz, adopted by the
EESC on 22nd April.
As a conclusion Maurice Barbieri mentioned possible common goals
of CLGE and FIG and confirmed earlier proposals for future actions:
Maurice Barbieri, second from the right.
• Influence:
Strong cooperation in the field for these stepping stones (ethics,
education, evolution)
• Try to implement the approach: ‘think globally, act locally’
• Support bilateral MoUs
• Create a council of regional bodies at FIG level
• Visibility:
Join the National Surveyors Week of the USA and the Day of the
European Surveyor to make it a global surveyors’ week.
• Renewal:
Confirm continued support to the FIG Young Surveyors’ Network
and to its European branch the FIG YSEN
To conclude, President Barbieri emphasised that state regulation or self-regulation are the strongest ways to ensure a high quality of work in order
to protect the real estate market…and to keep our profession vibrant!
Book review
Combining Theory and Practice
Learning Geospatial Analysis with Python
By Eric van Rees
The number of books about GIS and Python keeps on growing. Packt Publishing offers a number of Python books that address both open and closed source GIS. This particular offering combines a number of tutorials on GIS and remote sensing using Python with a theoretical framework on the topics discussed.
Introduction
This book is suitable for anyone wishing to understand digital mapping and analysis and who uses Python or another scripting language for automation or crunching data manually. The book primarily targets Python developers, researchers and analysts who want to perform spatial modeling and GIS analysis with Python. The book has been written by Joel Lawhead, a PMI-certified Project Management Professional (PMP) and the Chief Information Officer (CIO) for NVisionSolutions.com, an award-winning firm specializing in geospatial technology integration and sensor engineering. Lawhead has been published in two editions of the Python Cookbook by O’Reilly and has developed the open source Python Shapefile Library (PyShp), which is also discussed in this book.
The book is available as a paperback and in a number of digital formats. Code examples can be downloaded from the publisher’s website. The third-party software and Python libraries that are discussed can be downloaded from the internet. The book tells you where to find everything, but experience with installation procedures and modifying system settings manually comes in handy.
Title: Learning Geospatial Analysis with Python
Author: Joel Lawhead
Number of pages: 364
Language: English
Publisher: Packt
Year published: 2013
ISBN: 9781783281138
Contents
The book is divided into ten chapters. The first three chapters contain a deep investigation of what geospatial analysis, geospatial data and the geospatial technology landscape actually are. After this, the focus shifts to Python. Firstly, the author elaborates on the role of Python in the geospatial industry (GIS scripting language, mash-up glue language and full-blown programming language). Chapter five of the book focuses on applying Python to functions typically performed by a GIS, such as QGIS and Esri’s ArcGIS, and incorporates measuring distance, converting coordinates and editing shapefiles. The last five chapters are about Python and remote sensing, with its large and complex datasets. The author states that Python is quite capable in this field and shows how to perform tasks such as swapping image bands, change detection and classifying images. Python can also be used for working with elevation data and creating a Normalized Difference Vegetation Index (NDVI). The last two chapters are about creating mash-ups and combining different techniques from earlier chapters, resulting in a 500-line Python script that is explained bit by bit.
Verdict
Before going into the qualities of the book, it might make sense to state that at the time it first came out, it was written to supplement two other books about Python and geospatial from the same publisher. This might explain the choices made by the author regarding what to include in the book and what to omit. The author includes both QGIS and ArcGIS in his book, but without any apparent preference for either package. So, those searching for an application-specific book or tutorial might be puzzled by the contents of this book (arcpy, for instance, is not covered).
The fact that application-specific Python books are already available explains why the author chose a different approach here. The author’s intent when writing this book was to stick to “pure Python” as much as possible and to show how much is possible with available libraries. And that’s quite a lot, although it can never replace the extras of high-end software packages. Nonetheless, as the author states, with (open source) GIS software, data and Python libraries you can go a long way.
The author admits that some examples are simplified in the book, which might give you an idea of what to expect. Another notable thing is that a lot of different topics are discussed, but they aren’t explored in any great depth. LiDAR, for example, is mentioned, but there are no data samples or scripts for dealing with it. The last two chapters deal with web mapping, and since this area has taken a big leap in the last two years, I guess that the information presented is now somewhat outdated.
The people who are likely to get the most out of this book are experienced Python programmers who want to know about geospatial packages, data formats, workflows and tools. Additionally, GIS analysts (or remote sensing analysts for that matter) might be interested in checking out the coding examples offered in this book or investigating the Python libraries and modules discussed here. The code examples can be adapted to one’s own needs or wishes and the author encourages the reader to do this.
For more information, have a look at: www.packtpub.com.
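To give a flavour of the “pure Python” approach the book favours, here is a minimal sketch using the author’s PyShp library to inspect a shapefile. The file name is a placeholder; any polygon shapefile will do.

import shapefile  # PyShp, the open source Python Shapefile Library discussed in the book

reader = shapefile.Reader("districts.shp")                     # placeholder file name
print("Feature count:", len(reader.shapes()))
print("Attribute fields:", [f[0] for f in reader.fields[1:]])  # entry 0 is the deletion flag
for shape_rec in reader.shapeRecords()[:3]:                    # look at the first few features
    print(shape_rec.record, shape_rec.shape.bbox)

Small scripts of this sort, chained together, are essentially what the book’s longer examples build up to.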
Calendar 2015 - 2016 / Advertisers Index

December
9-11 December 9th International Symposium on Mobile Mapping Technology (MMT2015)
Sydney, Australia
Internet: www.mmt2015.org
9-11 December SPATIAL the un-conference - Spatial Information for Human Health
University of California, Santa Barbara, CA, U.S.A.
Internet: http://spatial.ucsb.edu/spatial2015
10-11 December GeoBIM
Amsterdam, The Netherlands
Internet: www.geo-bim.org/Europe
14-16 December USI 2015 - Unmanned Systems Institute
Sheraton Hotel & Marina, San Diego, CA, U.S.A.
Internet: www.unmannedsystemsinstitute.com

January 2016
21 January China Commercial UAV Summit ‘16
Shanghai, China
27-28 January SkyTech 2016
Business Design Centre, London, U.K.
Internet: www.skytechevent.com
27-28 January Geodesign Summit
Redlands, CA, U.S.A.
Internet: www.geodesignsummit.com

February
2-4 February The Unmanned Systems Expo (TUSExpo)
The World Forum, The Hague, The Netherlands
Internet: http://tusexpo.com
18-19 February 7th International Conference “Geodesy, Mine Survey and Aerial Photography. At the turn of the centuries”
Moscow, Russia
E-mail: info@con-fig.com
Internet: www.con-fig.com
22-24 February International LiDAR Mapping Forum (ILMF)
Hyatt Regency Denver, Denver, CO, U.S.A.
Internet: www.lidarmap.org/international/
26 February Esri DevSummit DC
Washington, DC, U.S.A.
Internet: www.esri.com/events/devsummit-dc

March
1-2 March map.apps Days
Münster, Germany
Internet: www.conterra.de
8-11 March Esri Developer Summit
Palm Springs, CA, U.S.A.
Internet: www.esri.com/events/devsummit
16-18 March GIS Ostrava 2016 - The Rise of Big Spatial Data
Ostrava, Czech Republic
Internet: http://gis.vsb.cz/gisostrava
23-24 March World Water Works
Antwerp Expo, Antwerp, Belgium
E-mail: [email protected]
Internet: www.worldwaterworks.nl

April
11-15 April ASPRS 2016 Annual Conference
Fort Worth Convention Center, Fort Worth, TX, U.S.A.
Internet: http://conferences.asprs.org/Fort-Worth-2016/blog
20-22 April Interexpo GEO-Siberia 2016
Novosibirsk Expo Centre, Novosibirsk
26-27 April 2nd International Conference on Geographical Information Systems Theory, Applications and Management - GISTAM 2016
Rome, Italy
Internet: www.gistam.org

May
2-5 May FOSS4G North America 2016
Raleigh, NC, U.S.A.
Internet: https://2016.foss4g-na.org
10-12 May Geospatial Conference in Tunis GCT2016
Hotel Le Palace, Gammarth, Tunis, Tunisia
E-mail: [email protected]
Internet: http://gct-tunisia.com
15-18 May GEOINT 2016
Gaylord Palms Resort, Orlando, FL, U.S.A.
Internet: http://geoint2016.com
24-25 May GEO Business 2016
Business Design Centre, London, U.K.
E-mail: [email protected]
Internet: www.GeoBusinessShow.com
31 May - 2 June Hexagon Geospatial Defence Summit Western Europe
Vaalserberg, The Netherlands
Internet: http://2016.hexdefsummit.eu

June
13-15 June GNSS and Network RTK
Newcastle University, School of Civil Engineering and Geosciences, U.K.
Internet: http://www.ncl.ac.uk/cegs.cpd/cpd/gnss.php
13-16 June HxGN LIVE
Anaheim, CA, U.S.A.
Internet: http://hxgnlive.com/en/anaheim
13-17 June FME Days
Zeche Zollverein, Essen, Germany
Internet: www.fme-days.com
16-17 June High Precision GNSS using Post-Processing
Newcastle University, School of Civil Engineering and Geosciences, U.K.
Internet: http://www.ncl.ac.uk/cegs.cpd/cpd/gnsspostprocess.php
22-24 June GeoPython 2016
Basel, Switzerland
Internet: www.geopython.net
27-29 June International Workshop on Risk Information Management, Risk Models, and Applications
Berlin, Germany
Internet: http://RIMMA2016.net
27 June - 1 July Esri User Conference 2016
San Diego Convention Center, San Diego, CA, U.S.A.
Internet: www.esri.com/events/user-conference
28 June - 7 July 16th International Multidisciplinary Scientific GeoConference & EXPO SGEM2016
Flamingo Grand Congress Center, Albena Resort & SPA, Bulgaria
E-mail: [email protected]
Internet: www.sgem.org

July
12-19 July ISPRS Prague 2016
Prague, Czech Republic
Internet: www.isprs2016-prague.com

August
24-26 August FOSS4G 2016
Bonn, Germany
Internet: http://2016.foss4g.org

September
1-2 September The Commercial UAV Show Asia 2016
Suntec Convention Centre, Singapore
Internet: www.terrapinn.com/exhibition/commercial-uavasia/index.stm

October
11-13 October INTERGEO 2016
Hamburg, Germany
Internet: www.intergeo.de
18-20 October International Conference & Exhibition Advanced Geospatial Science & Technology (TeanGeo 2016)
Tunis, Tunisia
Internet: www.teangeo.org
Please feel free to e-mail your calendar notices to: [email protected]
Advertisers Index
Bentley - www.bentley.com - 41
Leica Geosystems - www.leicageosystems.com
DAT/EM Systems - www.datem.com - 23
RIEGL - www.riegl.com - 2
KCS TraceME - www.trace.me - 9
Topcon - www.topcon.eu - 47
Leica ScanStation P30/40
Because every detail matters
The right choice
Whether you want to digitally explore an archaeological
excavation or research historic monuments in 3D, when
recording and analysing heritage and archeology projects for
future generations, it is imperative to collect data with the
cleanest and most accurate results. The new ScanStation laser
scanners from Leica Geosystems are the right choice, because
every detail matters.
High performance under harsh conditions
The Leica ScanStations deliver the highest quality 3D data and HDR imaging at an extremely fast scan rate of 1 million points per second at ranges of up to 270 m. Unsurpassed range and angular
accuracy paired with low range noise and survey-grade dual-axis
compensation form the foundation for highly detailed 3D colour
point clouds mapped in realistic clarity.
Leica Geosystems AG
Heerbrugg, Switzerland
scanstation.leica-geosystems.com
Reduced downtime
The extremely durable new laser scanners perform even under
the toughest environmental conditions, such as extreme
temperatures ranging from –20°C to +50°C, and comply with the
IP54 rating for dust and water resistance.
Complete scanning solution
Leica Geosystems offers the new Leica ScanStation portfolio as
an integrated part of a complete scanning solution including
hardware, software, service, training and support. 3D laser
scanner data can be processed in the industry’s leading 3D
point cloud software suite, which consists of Leica Cyclone
stand-alone software, Leica CloudWorx plug-in tools for CAD
systems and the free Leica TruView.