Webapps with Python
For a long time, the tradition in providing technical solutions was for experts to produce the designs and implement them for the benefit of society. At a later stage, we started practising 'awareness raising'. While this was a step in the right direction, attempting to explain the whys, whats and hows of a technical solution to the societal stakeholders, awareness-raising often worked as an afterthought. A large body of evidence has shown that the best outcomes are achieved by involving the community from day one of a technical solution, that is, from the planning and design stage. Whether it is money management in a family, a piece of policy in a government or an institution, or any type of technical solution, the best stakeholder support is obtained when people co-own the product. The journey to co-ownership starts with co-discovery (of knowledge), continues with co-design, and ends with implementing together.
This applies to any type of technical solution. However, it becomes a non-negotiable requirement for success in climate and nature-based solutions, where, by their very nature, both the problems and their solutions are distributed. Delivering electricity to a customer from a thermal power plant also involves dealing with the 'users'; we call this customer management. But with household-level, grid-connected solar electricity generation, that customer must become a business partner! That is the transformation we are witnessing in many sectors addressing problems with climate and nature-based solutions: this is how people are empowered to take part in design. The pandemic is a portal, an opportunity to make the wrongs right and to build back better and greener.
One of the surefire ways of creating co-ownership is to encourage co-discovery and co-design. For this, we face the challenge of bringing modern technological knowledge to the stakeholders, including communities, in an understandable way, while still allowing them to interact and contribute in a meaningful sense. One of the modern tools that contribute to this mission is the interactive web application. Web apps allow water managers and scientists to bring complex data-analysis solutions, big-data technologies and dynamic water models closer to non-specialist stakeholders in an appealing, simple-to-use fashion.
Demonstrations
Here are some prototype web applications that were created for the water management, agriculture and asset management sectors. They are written in Python, using libraries like Plotly Dash and web2py for the frontend. I use Docker containers based on dokku -- a PaaS (Platform as a Service) -- to host these apps.
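To illustrate the stack, here is a minimal sketch of the kind of Plotly Dash app used for these demos (not the actual source of any of them; the keyword options and coordinates are placeholders). Exposing the underlying Flask instance as server gives a WSGI entry point, so the app can be served by gunicorn inside a Docker/dokku container.

<syntaxhighlight lang="python">
from dash import Dash, dcc, html, Input, Output
import plotly.express as px

app = Dash(__name__)
server = app.server  # WSGI entry point, e.g. "gunicorn app:server" in a container

app.layout = html.Div([
    dcc.Dropdown(id="keyword", options=["flooding", "drought"], value="flooding"),
    dcc.Graph(id="map"),
])

@app.callback(Output("map", "figure"), Input("keyword", "value"))
def update_map(keyword):
    # Placeholder data; the real apps pull from a database or a model run.
    return px.scatter_geo(lat=[6.93], lon=[79.85], title=f"Results for '{keyword}'")

if __name__ == "__main__":
    app.run(debug=True)  # use app.run_server(...) on Dash versions before 2.7
</syntaxhighlight>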
Geohacking publications with Python Natural Language Processing and Friends
What this is and how it works
This app uses Natural Language Processing to 'geoparse' Google Scholar search results. Select one or more keywords and see which places the publications refer to. Click on a country's bubble to see its publications listed below the map.
Background
Python provides all the tools needed to do Natural Language Processing, including
- Web scraping, e.g. BeautifulSoup
- Parsing and identifying entities, e.g. the NLTK toolkit
- Flagging geographical locations mentioned in the text and geolocating them (geoparsing), e.g. Mordecai, geograpy3 and the (simpler) geotext (see the sketch below)
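For instance, a minimal geoparsing sketch with geotext; the outputs shown in the comments are indicative only:

<syntaxhighlight lang="python">
from geotext import GeoText  # pip install geotext

places = GeoText("Flood risk mapping for Jakarta, Indonesia and Manila")
print(places.cities)            # e.g. ['Jakarta', 'Manila']
print(places.country_mentions)  # ordered mapping of ISO country codes to counts
</syntaxhighlight>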
What it does
- Downloads Google Scholar search hits for each keyword (in this demo, each keyword is limited to the top 500 hits, to keep things simple)
- Stores them in a NoSQL database (MongoDB)
- Runs a geoparser (geotext in this case) to locate mentions of countries in the title or the abstract (see the sketch after this list)
- Feeds the data to this app, so that the user can interactively explore them
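A hedged sketch of the storage and geoparsing steps, assuming a local MongoDB instance; the database, collection and field names are illustrative, not the demo's actual schema:

<syntaxhighlight lang="python">
from pymongo import MongoClient  # pip install pymongo
from geotext import GeoText

client = MongoClient("mongodb://localhost:27017")  # assumes a local MongoDB
collection = client["scholar"]["publications"]     # hypothetical db/collection names

def ingest(records):
    """records: iterable of dicts like {'keyword': ..., 'title': ..., 'abstract': ..., 'year': ...}."""
    for rec in records:
        text = f"{rec.get('title', '')} {rec.get('abstract', '')}"
        # country_mentions maps ISO country codes to how often they occur
        rec["countries"] = list(GeoText(text).country_mentions)
        collection.insert_one(rec)
</syntaxhighlight>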
How to use
- (After closing these instructions) Select a keyword. The locations of the publications will be shown on the map, and a list of all the publications will appear below it.
- Click on a bubble on the map to filter by country. The list will then be updated to cover only that country.
- It is possible to select more than one keyword (simply pick several from the dropdown list).
- It is also possible to select several countries: either SHIFT+Click on the map or use the selection tools (top right).
- Click on the link below each record to see it on Google Scholar.
What is missing
- Many publications concerning the United States of America typically do not spell out the country name (e.g. 'A statewide assessment of mercury dynamics in North Carolina water bodies and fish'). NLP tools are usually smart enough to resolve these (North Carolina is in the USA, so tag it as 'USA'), but the current (demo) implementation misses some obscure names.
- 'The United Kingdom vs. England' tagging is complicated. This issue has yet to be fixed (that is why no articles are tagged for England). A possible workaround for the USA case is sketched below.
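One possible workaround for the USA case, sketched here purely as an illustration (the state list and function are hypothetical, not part of the demo; a production fix would use a fuller gazetteer such as Mordecai's geonames backend):

<syntaxhighlight lang="python">
# Hypothetical pre-tagger: any mention of a US state implies the USA.
US_STATES = {"North Carolina", "California", "Texas"}  # extend to all 50 states

def country_codes(text, geoparsed_codes):
    codes = set(geoparsed_codes)
    if any(state in text for state in US_STATES):
        codes.add("US")
    return codes
</syntaxhighlight>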
Next step?
This demo provides a framework for Natural Language Processing of online material to make sense of information (e.g. geoparsing). It combines several big-data constructs (unstructured data, NoSQL (JSON) data lakes, NLP tricks). While a web app is not the right place to scale these up to the big-data level, the framework presented here can easily be implemented to do large-scale processing on a decent cluster-computing system.
With large-scale applications, some of the possibilities are:
- Identifying temporal trends in publications (sketched below).
- Locating 'hotspots' as well as locations with few (or no) studies (geographical gaps).
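For example, yearly publication counts per keyword can be computed directly inside MongoDB with an aggregation pipeline; this sketch reuses the hypothetical schema from the ingest example above:

<syntaxhighlight lang="python">
from pymongo import MongoClient

collection = MongoClient("mongodb://localhost:27017")["scholar"]["publications"]

# Yearly publication counts for one keyword, grouped and sorted by year.
pipeline = [
    {"$match": {"keyword": "urban flooding"}},
    {"$group": {"_id": "$year", "count": {"$sum": 1}}},
    {"$sort": {"_id": 1}},
]
for row in collection.aggregate(pipeline):
    print(row["_id"], row["count"])
</syntaxhighlight>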
Seasonal levels in waterbodies using Sentinel-2 data
An end-to-end automated system for calculating seasonal water availability in practically any waterbody.
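The app's exact method is not described here, but a common ingredient of such systems is the Normalised Difference Water Index, NDWI = (Green - NIR) / (Green + NIR), computed from Sentinel-2 bands B03 and B08; a minimal sketch:

<syntaxhighlight lang="python">
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """green, nir: 2-D reflectance arrays (Sentinel-2 bands B03 and B08)."""
    ndwi = (green - nir) / np.clip(green + nir, 1e-6, None)  # avoid division by zero
    return ndwi > threshold  # True where a pixel is likely open water
</syntaxhighlight>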