
Can I plot points on a map using JSON data from a Node.js API?


I am new to the ArcGIS JavaScript API. I have a Node API that provides geographical information as JSON. I have Googled for examples and searched for solutions on this website. Can anyone point me towards an example where JSON data is used to plot points on a map?


This is the sample you should start by taking a look at.

In general, you need to use esriRequest to fetch your data, and then parse it as appropriate before passing it to a FeatureLayer or creating your own graphics and adding them to the map.
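For instance, a minimal sketch with the ArcGIS API for JavaScript 4.x might look like the following; the endpoint URL and the JSON field names (latitude, longitude, name) are assumptions about what your Node API returns, so adjust them to your data:

require([
  "esri/Map",
  "esri/views/MapView",
  "esri/request",
  "esri/Graphic"
], function(Map, MapView, esriRequest, Graphic) {
  var map = new Map({ basemap: "streets-vector" });
  var view = new MapView({ container: "viewDiv", map: map, zoom: 3, center: [-98, 39] });

  // Fetch the JSON from the Node API, then turn each record into a point graphic.
  esriRequest("https://your-node-api.example.com/points", { responseType: "json" })
    .then(function(response) {
      var graphics = response.data.map(function(item) {
        return new Graphic({
          geometry: { type: "point", longitude: item.longitude, latitude: item.latitude },
          symbol: { type: "simple-marker", color: "red", size: 8 },
          attributes: item,
          popupTemplate: { title: "{name}" }
        });
      });
      view.graphics.addMany(graphics);
    });
});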


Create Maps that Show Paths Between Origins and Destinations in Tableau

You can create maps in Tableau Desktop that show paths between origins and destinations, similar to the examples below. These types of maps are called spider maps, or origin-destination maps.

Spider maps are great for when you’re working with hubs that connect to many surrounding points. They are an excellent way to show the path between an origin and one or more destination locations.

There are several ways to create spider maps in Tableau. This topic illustrates how to create a spider map using two examples. Follow the examples below to learn how to set up your data source, and build the view for two different spider maps.

For other examples that might fit closer to your data, see the following workbooks on Tableau Public:


The most essential APIs for any analytics system, for any metric or entity, are:

  1. Ingestion API - An API to ingest new data points for a particular entity. In the server for this blog post, we will build an API to ingest new temperature data at a particular time for the city of London. This API could be called by any global weather system or any IoT sensor.
  2. Historical Data API - An API that returns all the data within a given date/time range. For our server, we will create a simple API that returns some static historical data, with a limited number of data points, for London's temperature values on any day.

Node.js Express Server Skeleton

We will create a basic Express server and instantiate the Pusher library's server instance. Create a new folder for the project and a new file, server.js. Add the following code to this file:
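The post's exact code is not reproduced here; a minimal sketch of the skeleton, assuming the express and pusher npm packages, with placeholder credentials and an arbitrary port, looks like this:

// server.js - minimal Express skeleton with a Pusher server instance
// npm install express pusher
const express = require('express');
const Pusher = require('pusher');

const app = express();

// Placeholder credentials: use the values from your own Pusher dashboard
const pusher = new Pusher({
  appId: 'YOUR_APP_ID',
  key: 'YOUR_APP_KEY',
  secret: 'YOUR_APP_SECRET',
  cluster: 'YOUR_CLUSTER'
});

// The port number here is illustrative
app.listen(9000, () => {
  console.log('Server listening on port 9000');
});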

API to get historical temperature data

Now we will add some static data for London's temperature at certain times during a day and store it in a JavaScript variable. We will also expose a route that returns this data whenever someone invokes it with a GET HTTP call.
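A hedged sketch of this step follows; the route path /getTemperature and the sample values are illustrative, while the variable name londonTempData is the one used later in this post:

// Static historical data for London's temperature (illustrative values)
var londonTempData = {
  city: 'London',
  unit: 'celsius',
  dataPoints: [
    { time: '00:00', temperature: 9 },
    { time: '04:00', temperature: 8 },
    { time: '08:00', temperature: 11 },
    { time: '12:00', temperature: 15 },
    { time: '16:00', temperature: 14 },
    { time: '20:00', temperature: 10 }
  ]
};

// Historical Data API: return the stored data points
app.get('/getTemperature', (req, res) => {
  res.json(londonTempData);
});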

API to ingest temperature data point

Now we will add the code for exposing an API to ingest the temperature at a particular time. We will expose a GET HTTP API with temperature and time as query parameters, validate that they are not empty, and store them by pushing into the dataPoints array of our static JavaScript variable londonTempData. Please add the following code to the server.js file:
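A hedged sketch of this route follows; the path /addTemperature is an assumption, while the channel and event names come from the description below:

// Ingestion API: accept temperature and time as query parameters
app.get('/addTemperature', (req, res) => {
  const temperature = req.query.temperature;
  const time = req.query.time;

  // Validate that both parameters are present
  if (!temperature || !time) {
    return res.status(400).json({ error: 'temperature and time are required' });
  }

  // Store the new reading in our static data source
  const newDataPoint = { time: time, temperature: Number(temperature) };
  londonTempData.dataPoints.push(newDataPoint);

  // Notify subscribed clients so charts can update in realtime
  pusher.trigger('london-temp-chart', 'new-temperature', newDataPoint);

  res.json({ success: true, dataPoint: newDataPoint });
});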

In the above code, apart from storing the new reading in the data source, we also trigger an event ‘new-temperature’ on a new channel ‘london-temp-chart’. For every unique data source or chart, you can create a new channel.

The event triggered by our server will be processed by the front end to update the chart/graph in realtime. The event can carry all the data the chart needs to display the new point correctly; in our case, we send the temperature at the new time to our front end.
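For completeness, here is a minimal sketch of how a front end might consume this event with the pusher-js client library; updateChart is a hypothetical placeholder for your charting code:

// Front-end: subscribe to the channel and event used by the server
const pusher = new Pusher('YOUR_APP_KEY', { cluster: 'YOUR_CLUSTER' });

const channel = pusher.subscribe('london-temp-chart');
channel.bind('new-temperature', function(dataPoint) {
  // dataPoint carries the time and temperature sent by the server
  updateChart(dataPoint.time, dataPoint.temperature);
});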


Updates

The information on this page is updated regularly.

Feature Flags

There are currently no feature flags.

Known Issues

Before contacting us, please review the following list of issues that have been identified for a future update.

Delayed observations

Processing of observation station data has a performance constraint. Observations may be delayed by up to five minutes. The resolution does not yet have a deploy date.

Upstream Issues

The following issues are related to upstream sources of the API and are not API bugs.

Delayed observations

Observations may be delayed up to 20 minutes from MADIS, the upstream source.

Resolutions

The following issues have been recently resolved.

HCE now provides Alaska Region marine products

With a recent upgrade that went into operations in mid-November, HCE now provides Alaska Marine products to the API.

Upcoming Items

The National Weather Service is seeking comment on stability measures across all web services including api.weather.gov. More information can be found by reviewing PNS 20-85.


Azure Subscription and Azure Maps

Azure Maps can be provisioned within any Azure subscription, just like all the other services within Azure. This enables you to create and manage the service just as you would any other Azure service that makes up your solution's or application's hosting environment. To get started, simply search the Azure Marketplace within the Azure Portal for "Azure Maps" and create an instance of the service for your application.

Provisioning an instance of Azure Maps is pretty simple: you just need to specify the name and pricing tier you wish to use. You can then provision the Azure Maps account and start coding against the APIs using the keys for your Azure Maps account.

Once you provision a new Azure Maps account within your Azure subscription, you can start using the Azure Maps JavaScript control and the Azure Maps REST APIs in your applications. The Azure Maps account itself doesn't have much configuration in the Azure Portal. Most notably, the portal gives you access to the keys you'll need so that custom code can authenticate against the Azure Maps APIs; these are located within the Authentication tab for the Azure Maps account. You can also use Azure AD authentication when interacting with the Azure Maps APIs.
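As a rough illustration, initializing the Azure Maps web control (azure-maps-control) with one of those keys might look like the following sketch; the div id, center, zoom and key value are placeholders:

// Create a map instance authenticated with a subscription key from the Authentication tab
var map = new atlas.Map('mapDiv', {
  center: [-122.33, 47.6],
  zoom: 10,
  authOptions: {
    authType: 'subscriptionKey',
    subscriptionKey: '<your-azure-maps-key>'
  }
});

map.events.add('ready', function () {
  // The map control is ready; data sources and layers can be added here
});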


The Directions API overview

The Directions API is a web service that uses an HTTP request to return JSON or XML-formatted directions between locations. You can receive directions for several modes of transportation, such as transit, driving, walking, or cycling.

For direction calculations that respond in real time to user input (for example, within a user interface element), you can use the Directions API or, if you're using the Maps JavaScript API, use the Directions service. For server-side use, you can use Java Client, Python Client, Go Client and Node.js Client for Google Maps Services.
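For example, a server-side call from Node.js might look like the following sketch; the origin, destination and API key values are placeholders:

// Request driving directions as JSON from the Directions API
const https = require('https');

const params = new URLSearchParams({
  origin: 'Toronto',
  destination: 'Montreal',
  mode: 'driving',
  key: 'YOUR_API_KEY'
});

https.get('https://maps.googleapis.com/maps/api/directions/json?' + params.toString(), (res) => {
  let body = '';
  res.on('data', (chunk) => { body += chunk; });
  res.on('end', () => {
    const directions = JSON.parse(body);
    // Each route contains legs with distance, duration and step-by-step instructions
    console.log(directions.routes[0].legs[0].duration.text);
  });
});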


Interactive Data Visualization using D3.js, DC.js, Nodejs and MongoDB

Hello All,
The aim of this blog post is to introduce open source business intelligence technologies and to explore data using open source technologies like D3.js, DC.js, Node.js and MongoDB.
Over the span of this post we will see the importance of the various components we are using, and we will do some code-based customization as well.

The Need for Visualization:

Visualization is the so-called front end of modern business intelligence systems. I have been in quite a few big data architecture discussions, and to my surprise I found that most of them focus on the backend components: the repository, the ingestion framework, the data mart, the ETL engine, the data pipelines, and then some visualization.

I might be biased in favor of visualization technologies, as I have been working on them for a long time. Needless to say, visualization is as important as any other component of a system; I hope most of you will agree with me on that. Visualization is instrumental in inferring trends from the data, spotting outliers and making sense of the data points.
What they say is right: a picture is indeed worth a thousand words.

The components of our analysis and their function:

D3.js: A JavaScript-based visualization engine which renders interactive charts and graphs based on the data.
Dc.js: A JavaScript-based wrapper library for D3.js which makes plotting the charts a lot easier.
Crossfilter.js: A JavaScript-based data manipulation library. Works splendidly with dc.js and enables two-way data binding.
Node.js: Our powerful server, which serves data to the visualization engine and also hosts the webpages and JavaScript libraries.
MongoDB: The resident NoSQL database, which will serve as a fantastic data repository for our project.

The Steps to Success:

  1. Identifying what our analysis will do
  2. Fetching the data and storing it in MongoDB
  3. Creating a Node.js server to get data from MongoDB and host it as an API
  4. Building our frontend using D3.js, Dc.js and some good old JavaScript

Here’s what the end result will look like (might take a couple seconds to load):

Step 1: Identifying our Analysis

We will be analyzing data from DonorsChoose.org, a US-based nonprofit organization that allows individuals to donate money directly to public school classroom projects. We will take the dataset and try to create an informative analysis on the basis of its attributes; we will have more clarity on this once we actually see the data. I have taken a subset of the original dataset for our analysis purposes, containing nearly 9,000 rows.

Step 2: Fetching the data and storing it in MongoDB

The original dataset from DonorsChoose.org is available here. For our project we will be using a portion of the "Project" dataset. I have chosen this data as I am somewhat familiar with it. The "Project" dataset contains data points for the classroom projects available with DonorsChoose.org. I have intentionally used a small subset of the data so that we can focus on getting the charts done quickly rather than waiting for the data to be fetched each time we refresh. That said, you can always ingest the original dataset once you have mastered this post. To download the required dataset, click here.

Unzip the rar file to your desktop or a suitable directory.

After unzipping the downloaded file, we get a file named sampledata.csv with a size of around 3 megabytes. The data is in CSV (comma-separated values) format. CSV files are a popular option for storing data in a tabular format and are easy to interpret and parse. The file contains nearly 9,000 records and 44 attributes which we can use for our analysis.

Let's install MongoDB. I am using a Windows environment, but the process is similar on all platforms. The installation manuals can be found here. On Windows, open your command prompt, go to the folder where you installed MongoDB and locate the bin folder. In my case the path is:

Fire up MongoDB by running mongod.exe. Leave that prompt running as it is, open another command prompt, navigate to the bin directory again, and enter this command:

C:\Program Files\MongoDB\Server\3.0\bin>mongoimport -d donorschoose -c projects --type csv --headerline --file C:\Users\Anmol\Desktop\sampledata\sampledata.csv

You will see MongoDB importing the data points from our dataset into the database. This process might take some time.

While we are at it, I would strongly recommend using Robomongo if you are going to work with MongoDB on a regular basis. Robomongo is a GUI-based MongoDB management tool. Install Robomongo and open it. Create a new connection, enter localhost as the address and 27017 as the port, and click save. Connect to this instance in the next screen that you get. The Robomongo shell will look something like this:

Navigate to the projects collection and double click on it to see a list of the datapoints. Click on the tiny expand arrow on the datapoints list to see the full list of attributes for the document. These are the attributes (columns) that we just imported from our dataset. We will be using the following attributes for our analysis:

  1. school_state
  2. resource_type
  3. poverty_level
  4. date_posted
  5. total_donations
  6. funding_status
  7. grade_level

Step 3: Setting Up the Node.js Server

Now on to one of the happening server platforms: Node.js, or Node JS, or Nodejs; the point is, you can't ignore it!
I am quoting their website here:

Node.js® is a platform built on Chrome’s JavaScript runtime for easily building fast, scalable network applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices.

I find Node.js to be very fast and it works really well as far as connecting to mongodb is concerned.

To begin with, we need to install Node.js and npm. npm is a package manager for Node and helps us easily install modules for added functionality. Here is a cool guide on installing Node and npm for Windows.

Our Node.js platform is broadly classified into the following folders:

  1. App Folder: Contains the models for data connections and API serving. Includes the files:
    1. SubjectViews.js: holds the data model. Specifies queries (if any) and the collection name to fetch data from
    2. routes.js: fetches data from the collection and serves it as an API (a sketch follows this list)
    3. DB.js: contains the database information, i.e. the address and the port to connect to
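    As a rough illustration (assuming Express and Mongoose; this is not the author's exact code), routes.js could look something like the sketch below. The collection name and the /api/data path are taken from this post; everything else is illustrative:

    var express = require('express');
    var mongoose = require('mongoose');

    // DB.js would normally supply this connection string
    mongoose.connect('mongodb://localhost:27017/donorschoose');

    // A schema-less model so every attribute imported from the CSV comes back as-is
    var Project = mongoose.model('Project', new mongoose.Schema({}, { strict: false }), 'projects');

    var router = express.Router();

    // Fetch all documents from the projects collection and serve them as JSON
    router.get('/api/data', function(req, res) {
      Project.find({}).lean()
        .then(function(docs) { res.json(docs); })
        .catch(function(err) { res.status(500).send(err); });
    });

    module.exports = router;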

    The folder structure for our project will look like this:

    You can copy the full code and folder structure from the github repository here:

    Navigate to the home folder and run the npm install command:
    C:\Users\Anmol\Desktop\log>npm install

    NPM will read the dependencies from the package.json file and install them.

    Tip for Ubuntu users installing Node.js:

    To make Node.js apps that reference "node" work, create a symbolic link:
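    On Ubuntu releases where the binary is installed as nodejs, the usual command is the following (paths may differ on your system):

    sudo ln -s /usr/bin/nodejs /usr/bin/node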

    We now have our folder structure ready. Run this command:
    npm start
    You will see a message from node saying that the magic is happening on port 8080. (You can change the port number from the server.js file)

    Open your browser and go to localhost:8080/api/data (Defined in the routes.js file)

    Awesome, now our API is all set! On to the final part: creating visualizations for the data.

    Step 4: Building our frontend using D3.js, Dc.js and some good old JavaScript

    A lot of great Business Intelligence (BI) tools exist in the current landscape: Qlik, Spotfire, Tableau and MicroStrategy, to name a few. Along came D3.js, an awesome open source visualization library which utilizes the power of the omnipresent JavaScript to make charting cool and puts control of the visualization design in the user's hands.

    We will be using a responsive html template for our design needs as our main aim is to get the charts up and running rather than code the responsiveness and the style of the various divs. I have used a nice synchronous template for this project.

    If you take a look at the Node.js code, you will observe that static content is served from the public folder. Hence we place our HTML there.

    We will be utilizing the following libraries for visualization:

    1. D3.js: Renders our charts. D3 creates SVG-based charts which are easily placed into our HTML blocks
    2. Dc.js: Used as a wrapper for D3.js, meaning we don't need to code each and every detail of the charts, just the basic parameters
    3. Crossfilter.js: Used for exploring large multivariate datasets in the browser. Really great for slicing and dicing data; enables drill-down based analysis
    4. queue.js: An asynchronous helper library for data ingestion involving multiple APIs
    5. Dc.css: Contains the styling directives for our dc charts
    6. Dashboard.js: Will contain the code for our charts and graphs

    You can always refer to the code repository for the placement of these libraries. We need to include these libraries in our HTML page (index.html). Now, to the main task at hand: coding the charts!

    In our Dashboard.js file we have the following:

    A queue() function which utilizes the queue library for asynchronous loading. It is helpful when you are trying to get data from multiple APIs for a single analysis. In our current project we don't need the queue functionality, but it's good to have code that can be reused as needed. The queue function processes the data hosted at the API and passes it into the apiData variable.

    queue()
    .defer(d3.json, "/api/data")
    .await(makeGraphs)
    function makeGraphs(error, apiData) {

    Then we do some basic transformations on our data using d3 functions. We pass the data inside the apiData variable into our dataSet variable. We then parse the date field to suit our charting needs and set the data type of total_donations to a number using the + operator.

    var dataSet = apiData
    var dateFormat = d3.time.format("%m/%d/%Y")
    dataSet.forEach(function(d) {
    d.date_posted = dateFormat.parse(d.date_posted)
    d.date_posted.setDate(1)
    d.total_donations = +d.total_donations
    })

    The next steps are ingesting the data into a crossfilter instance and creating dimensions based on that instance. Crossfilter acts as a two-way data binding pipeline: whenever you make a selection on the data, it is automatically applied to the other charts as well, enabling our drill-down functionality.

    var ndx = crossfilter(dataSet)

    var datePosted = ndx.dimension(function(d) { return d.date_posted })
    var gradeLevel = ndx.dimension(function(d) { return d.grade_level })
    var resourceType = ndx.dimension(function(d) { return d.resource_type })
    var fundingStatus = ndx.dimension(function(d) { return d.funding_status })
    var povertyLevel = ndx.dimension(function(d) { return d.poverty_level })
    var state = ndx.dimension(function(d) { return d.school_state })
    var totalDonations = ndx.dimension(function(d) { return d.total_donations })

    Now we calculate metrics and groups for grouping and counting our data.

    var projectsByDate = datePosted.group()
    var projectsByGrade = gradeLevel.group()
    var projectsByResourceType = resourceType.group()
    var projectsByFundingStatus = fundingStatus.group()
    var projectsByPovertyLevel = povertyLevel.group()
    var stateGroup = state.group()
    var all = ndx.groupAll()


    //Calculate Groups
    var totalDonationsState = state.group().reduceSum(function(d) {
    return d.total_donations
    })
    var totalDonationsGrade = gradeLevel.group().reduceSum(function(d) {
    return d.total_donations // reduceSum needs a numeric field; the original accessor returned d.grade_level, which is not numeric
    })
    var totalDonationsFundingStatus = fundingStatus.group().reduceSum(function(d) {
    return d.total_donations // reduceSum needs a numeric field; the original accessor returned d.funding_status, which is not numeric
    })
    var netTotalDonations = ndx.groupAll().reduceSum(function(d) {
    return d.total_donations // accessor body was lost in the original; summing total_donations matches the variable name
    })

    Now we define the charts using the DC.js library. Dc.js makes it easy to code good-looking charts, and the dc library has a lot of chart types to suit the majority of analyses. Check out the GitHub page for dc here.

    var dateChart = dc.lineChart("#date-chart")
    var gradeLevelChart = dc.rowChart("#grade-chart")
    var resourceTypeChart = dc.rowChart("#resource-chart")
    var fundingStatusChart = dc.pieChart("#funding-chart")
    var povertyLevelChart = dc.rowChart("#poverty-chart")
    var totalProjects = dc.numberDisplay("#total-projects")
    var netDonations = dc.numberDisplay("#net-donations")
    var stateDonations = dc.barChart("#state-donations")

    And now the final part, where we define our charts. We are using a combination of charts and widgets here. You may notice that we are essentially supplying basic information to the chart definitions, like the dimension, group, axis properties, etc.

    // A dropdown widget
    selectField = dc.selectMenu('#menuselect')
    .dimension(state)
    .group(stateGroup)
    // Widget for seeing the rows selected and rows available in the dataset
    dc.dataCount("#row-selection")
    .dimension(ndx)
    .group(all)
    //A number chart
    totalProjects
    .formatNumber(d3.format("d"))
    .valueAccessor(function(d) { return d; }) // accessor body was lost in the original; the groupAll value is used directly
    .group(all)
    //Another number chart
    netDonations
    .formatNumber(d3.format("d"))
    .valueAccessor(function(d) { return d; }) // accessor body was lost in the original; the groupAll sum is used directly
    .group(netTotalDonations)
    .formatNumber(d3.format(".3s"))
    //A line chart
    // minDate and maxDate are not defined in the snippets above; derive them from the date dimension
    var minDate = datePosted.bottom(1)[0].date_posted
    var maxDate = datePosted.top(1)[0].date_posted
    dateChart
    //.width(600)
    .height(220)
    .margins({top: 10, right: 50, bottom: 30, left: 50}) // margin values were lost in the original; these are illustrative
    .dimension(datePosted)
    .group(projectsByDate)
    .renderArea(true)
    .transitionDuration(500)
    .x(d3.time.scale().domain([minDate, maxDate]))
    .elasticY(true)
    .renderHorizontalGridLines(true)
    .renderVerticalGridLines(true)
    .xAxisLabel("Year")
    .yAxis().ticks(6)
    //A row chart
    resourceTypeChart
    //.width(300)
    .height(220)
    .dimension(resourceType)
    .group(projectsByResourceType)
    .elasticX(true)
    .xAxis().ticks(5)
    //Another row chart
    povertyLevelChart
    //.width(300)
    .height(220)
    .dimension(povertyLevel)
    .group(projectsByPovertyLevel)
    .xAxis().ticks(4)
    //Another row chart
    gradeLevelChart
    //.width(300)
    .height(220)
    .dimension(gradeLevel)
    .group(projectsByGrade)
    .xAxis().ticks(4)
    //A pie chart
    fundingStatusChart
    .height(220)
    //.width(350)
    .radius(90)
    .innerRadius(40)
    .transitionDuration(1000)
    .dimension(fundingStatus)
    .group(projectsByFundingStatus)
    //A bar chart
    stateDonations
    //.width(800)
    .height(220)
    .transitionDuration(1000)
    .dimension(state)
    .group(totalDonationsState)
    .margins({top: 10, right: 50, bottom: 30, left: 50}) // margin values were lost in the original; these are illustrative
    .centerBar(false)
    .gap(5)
    .elasticY(true)
    .x(d3.scale.ordinal().domain(state))
    .xUnits(dc.units.ordinal)
    .renderHorizontalGridLines(true)
    .renderVerticalGridLines(true)
    .ordering(function(d) { return d.value; }) // accessor body was lost in the original; ordering by group value assumed
    .yAxis().tickFormat(d3.format("s"))

    And finally we call the dc render function which renders our charts.

    dc.renderAll()

    Mission Accomplished!

    Open your browser and go to localhost:8080/index.html to see your dashboard in action.

    There is a lot of customization that can be done to the charts; I did not delve into it at this stage. We can format the axes, the colors, the labels, the titles and a whole lot of other things using dc.js, d3.js and CSS. Moving on, I will take up one chart at a time and provide additional examples of what we can customize.

    In the end, we now have some working knowledge of MongoDB, Node.js and D3. You can use this project as a boilerplate for exploring and analysing new data sets. All the source code can be found in this GitHub repository.

    I will be most happy to answer your questions and queries. Please leave them in the comments.

    Do share the post and spread the good word. It may help folks out there get up to speed on this open source visualization stack.



    Map mobility data in a NodeJS + React front-end application with data served by magicbox-open-api

    magicbox-maps is a React front-end application that serves data from the magicbox-open-api. magicbox-maps works with different types of data sets, so you can show relationships between them on a geographic map. These data sets include school locations and other key attributes, as well as information on school Internet connectivity, both in terms of speed (Mbps) and type (2G and 3G).

    magicbox-maps uses WebGL and a React component to render points on OpenStreetMap Leaflet maps. The countries displayed are organized by geospatial shapefiles, provided by various sources like GADM.

    The UNICEF Office of Innovation uses magicbox-maps for two purposes:

    Mapping schools helps us understand…

    Mapping mobility of people helps us understand movement patterns. In the case of disease outbreak (e.g. Zika, Dengue fever, cholera, etc.), mobility data helps countries develop deeper insights to disease prevention and response with vaccination campaigns or moving response resources into place. For sudden, mass movement of people (e.g. refugee crisis), mobility data helps local governments anticipate an influx of people in advance and to respond appropriately with relief resources.

    This section documents a development environment, not a production instance. Please reach out to @UNICEFinnovate on Twitter for more information about using MagicBox in production.

    Clone repo, copy sample config:

    Install dependencies for server back-end:

    Install dependencies for React front-end:

    The configuration file goes in react-app/src/config.js. A sample config is included (see below).

    magicbox-maps only works if a valid magicbox-open-api instance is running. See how to set it up in the API README.

    The magicbox-maps back-end server and front-end React application must be running at the same time.

    Run the server:

    Run the front-end:
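    The individual commands for the steps above are not reproduced here. Under the usual npm workflow, and assuming the repository URL, sample config name and npm scripts shown below (all of which should be checked against the project README), the sequence looks roughly like this:

    # clone the repo and copy the sample config (file names assumed)
    git clone https://github.com/unicef/magicbox-maps.git
    cd magicbox-maps
    cp react-app/src/config-sample.js react-app/src/config.js

    # install back-end dependencies
    npm install

    # install front-end dependencies
    cd react-app && npm install && cd ..

    # run the server in one terminal
    npm start

    # run the React front-end in another terminal
    cd react-app && npm start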

    Interested in contributing? Read our contribution guidelines for help on getting started.

    Our team tries to review new contributions and issues on a weekly basis. Expect a response on new pull requests within five business days (Mon-Fri). If you don't receive any feedback, please follow up with a new comment!

    This project is licensed under the BSD 3-Clause License.


    Exceptions Related to JSON Library in Python:

    • The class json.JSONDecodeError handles exceptions related to the decoding operation, and it is a subclass of ValueError.
    • Exception - json.JSONDecodeError(msg, doc, pos)
    • The parameters of the exception are:
      • msg – the unformatted error message
      • doc – the JSON document being parsed
      • pos – the start index in doc where parsing failed
      • lineno – the line number corresponding to pos
      • colno – the column number corresponding to pos

      Python load JSON from file Example:


      Conclusion

      I have listed the best JavaScript charting libraries out there, at least those I consider the top ones. It would be hard to compare all of them comprehensively. Each one of them has its own pros and cons depending on who is going to use it and for exactly what purpose.

      Of course, there are some features that make one library faster, more beautiful or flexible than the other. But in the end, no matter what libraries this list contains, the overall winner is always the one that meets your specific requirements. For different people and companies, the choice of the best JS chart library can also be different.

      My advice is to check out these top libraries whenever you need JS charts, for whatever project; chances are extremely high that you will find one or several of them to be the best fit. For a longer list, look at the comparison on Wikipedia.

      If you need to visualize data in interactive maps, focusing on geographic trends, relationships, connections, flows, and so on — you are welcome to read my earlier article about the best JavaScript libraries for creating map charts.