Prem Ramaswami: So my name is Prem Ramaswami. I'm a product manager with Google.org, which is Google's technology-driven philanthropy. And I'm here today with my colleague, Steve Hakusa, who's a software engineer. Steve and I together work on Google.org's Crisis Response team, which looks to make crucial information available in the aftermath of a natural disaster. Now, we're here today to talk about how developers like yourselves can feel empowered in the aftermath of a crisis. We're always interested in your feedback, which we'd love for you to share, either on the goo.gl links or any of the Twitter hashtags behind me. And with that, I'd like to get started with our story. As a team, our story starts on January 12 of last year. Now on this day, you might remember, there was a 7.0 magnitude earthquake that hit the island nation of Haiti outside of their capital city of Port-au-Prince.


Now this earthquake ended up killing over a quarter million people. It left another million homeless, and two million in need of food and water aid. As images started escaping from Haiti, a group of us Googlers got together and wanted to see what we could do to possibly help out. Now Google has been responding to crises since about Hurricane Katrina. And the way we do that is by collecting updated satellite imagery and publishing it to our properties like Google Maps and Google Earth. Aid agencies then use that imagery. So in the aftermath of the Haiti earthquake, we were actually able to collect a cloud-free shot over Haiti with our partner GeoEye. And we were able to publish this imagery to Google Maps and Google Earth within the first 24 hours.


Now this imagery really gave us an understanding of the level of devastation and destruction that occurred over the country. Now I said that aid organizations make use of this imagery. It's not just there for the shock value of it, so I want to give some key examples of how aid organizations take advantage of it. The World Bank, for example, used this to conduct wide area damage assessments. Other organizations used it to identify the location of refugee camps. It was also a nice way to visualize on a map the locations of all the clinics that were popping up in the aftermath of this disaster. Now you can imagine why having this information geographically placed is so important. For example, you want to make sure these clinics are set up


next to your refugee camps so you're properly servicing the right populations. And I want to give another specific example of where satellite imagery is super helpful. This is the Pétionville golf course, which is located about an hour south of the city of Port-au-Prince. And this is a satellite image of the golf course about six months before the earthquake. A day after the earthquake, you see little white specks popping up on the golf course. And these are actually tents. Individuals who live in the surrounding areas, whose houses were either destroyed or who feel unsafe with a concrete roof over their head, have now moved onto this golf course. And 13 days later the entire golf course has become a refugee camp, one of the largest in Haiti. Now camps like this are easily identifiable by viewing


time-lapse satellite imagery. Not just that: by knowing where these refugee camps are, aid organizations can make sure they're appropriately allocating their resources. But as I said, we've been using this satellite imagery since about Hurricane Katrina. So when this group of Googlers got together, we asked ourselves, what else could we do here? Now, Google's mission, as a company, is to organize the world's information and make it universally accessible and useful. And so we decided we'd organize all of the information related to this disaster in one place, on a web page. We call this a landing page. And so we started organizing this information, which included things such as where to donate, along with actual ways that individuals could help, news and updates that


were coming out of the region, relevant maps, as well as user-generated content such as YouTube videos. This page was launched in 12 different languages, and it was linked to from the google.com main page as well as from what we call enhanced search results. So if you searched for something related to the Haiti earthquake from google.com, you'd be linked to this page. I wanted to spend the next few minutes going through some of the specific features that were on this page. And one of the products that we actually launched was a tool called Person Finder. See, after the Haiti earthquake, there were more than 14 separate sites that popped up to handle missing persons information online. Now all of these sites were referring to missing people with different standards. They were using different infrastructure. None of them were integrated.


Which meant that, if I were missing a relative after the earthquake, I would have to go to each one of the 14 sites, search for them, enter their information, and keep continuously looking at all 14 to try to find updates about them. Now this was a place where Googlers felt we could really help out. See, we wanted to build a simple web application that was super easy to use. We wanted to use an open standard, specifically the PFIF standard, or the Person Finder Interchange Format. And we wanted to make the tool open, so we'd give open access via an API so all of these different sites could push and pull their data to and from it. And so we had this 72-hour hackathon that occurred globally, in Mountain View, New York, Zurich, and Israel. And engineers around the globe worked and launched the Person Finder web application on App Engine.


And we noticed something interesting happen: sites like CNN, NPR, and the New York Times integrated with Person Finder. And as a victim, I no longer needed to go to all 14 separate sites. I could go to any one of them and now see that same information mirrored across. Now another area where we thought we could help out was with relevant mapping information. See, there were a lot of maps being created in the aftermath of the earthquake. And this included things like the World Bank's damage assessment I showed earlier, or even things as simple as earthquake locations from the USGS. Now the problem was, this information was strewn across the web. And it was in different formats, often not machine-readable formats. And so our engineers crawled the web, found a lot of this information, scraped it, converted it to a common format like KML, and then placed it on a common map that we then linked to from our landing page. Now you can just click on and off any of the layers here, and as an aid organization, or someone interested in understanding what's happening on the ground, you could get more of a complete picture because you could layer on data from different organizations.
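That conversion step can be sketched in a few lines. This is a minimal sketch, assuming scraped records have already been reduced to (name, latitude, longitude) tuples; the clinic names and coordinates below are invented for illustration.

```python
from xml.sax.saxutils import escape

def rows_to_kml(rows):
    """Render (name, lat, lon) tuples as a minimal KML document."""
    placemarks = []
    for name, lat, lon in rows:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{escape(name)}</name>\n"
            # KML lists coordinates in longitude,latitude order.
            f"    <Point><coordinates>{lon},{lat}</coordinates></Point>\n"
            "  </Placemark>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "<Document>\n" + "\n".join(placemarks) + "\n</Document>\n</kml>"
    )

# Hypothetical clinic locations near Port-au-Prince:
clinics = [("Field clinic A", 18.539, -72.335),
           ("Field clinic B", 18.512, -72.290)]
kml = rows_to_kml(clinics)
```

A file like this can be loaded as a layer in Google Earth or overlaid on a shared map alongside layers from other organizations.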


And it wasn't just Google that was helping out in the tech space. There were organizations like OpenStreetMap. So OpenStreetMap is a collaborative mapping platform very similar to Google's own Map Maker. Now what OpenStreetMap did was take advantage of the Haitian diaspora around the world, people with relevant knowledge about Haiti's roads, to actually digitally draw those roads out on a map. Now why is this actually useful? As an aid organization, when I come to Haiti and I want to deliver my aid from point A to point B, I need to know how to get there. And the fact is, there were no good maps to help people out. The work done by the Haitian diaspora community and OpenStreetMap ended up being cited as the most complete digital map of Haiti's roads ever created. And it was hugely important.


Ushahidi also emerged as a very important platform after the earthquake. Ushahidi was initially created to report post-election violence in Kenya, but it's a simple incident reporting platform. What that means is, individuals come to it and they say something has happened. That gets categorized by a group of volunteers and then placed on a map. Now why this is helpful is that you can start seeing clusters of information appear on a map like this. For example, as an aid organization, I can come to a site like Ushahidi and start seeing areas that are in need of immediate food aid. Or I can see areas with outbreaks of violence occurring, so I can keep my staff away from them. Now it's important to know that Ushahidi required you to be able to submit these reports via the web, but not every


Haitian had access to a computer, especially after the earthquake. But everyone did have mobile phones. So a large coalition of nonprofits, including Ushahidi, FrontlineSMS, and InSTEDD, worked to launch the 4636 project. Now 4636 was a free SMS short code. Individuals on the ground could SMS a message to the short code, and it got picked up by a group of volunteers on the back end, who would then categorize it and place it on a map. Now this not only made maps like Ushahidi's richer, it actually helped save lives. There's an example from Haiti where individuals stuck in rubble would SMS messages with their location. These would then get picked up by the volunteers, who would place them on a map and relay the message to the US Coast Guard or US Southern Command.


Using this method, over a hundred lives were saved. Now, this isn't to understate the importance of technologies like radio and TV, but the fact is, the internet was playing a key role in Haiti. And we have some metrics to prove this. Google's landing page had over four million page views. There were over 80,000 text messages handled by project 4636, with 2,500 volunteers around the world managing those messages, and over a hundred lives saved. There were over 55,000 missing persons records stored in Person Finder by the end of the Haiti earthquake response. In addition, the American Red Cross collected $32 million with an SMS text campaign in the US. This is the largest amount ever collected by one organization for a single campaign. Finally, thousands of contributions were made to


both OpenStreetMap and Google Map Maker, and as we said, OpenStreetMap ended up being the most complete map of Haiti's roads ever. And so a group of us decided to travel to Haiti, to actually understand the situation on the ground and see where else technology could play a key role. I wanted to give two small examples from our trip down there. This is a doctor writing patient care instructions on the floor outside of a patient's room. And this is a map of a field hospital's resources. Now, these were areas where we believed that technology really could help out. So upon returning to Google, we actually spoke to Larry and Sergey, Google's founders, and we pitched this idea of a team: a Google.org Crisis Response team.


And it would be modeled very similarly to other Google.com project teams, with core engineering and product support. Our mission as a team would be to make this crucial information more accessible during natural disasters and humanitarian crises. Now Larry and Sergey were not only supportive, they were extremely enthusiastic about the possibilities that technology could have in this space. We knew that we'd never be domain experts here. But what we did know is that we understood web technologies, and we wanted to bring that understanding to assist aid organizations and victims after a disaster. So before I go into any more detail, I wanted to invite my colleague Steve up to the stage to talk about how our team has evolved over the last year and a half, and how the crisis response ecosystem has developed. Steve.


Steve Hakusa: Thanks, Prem. So Prem just showed you a number of internet-based tools that had a measurable impact on the response in Haiti. We think this is just the beginning. At Google, we're excited that a number of organizations see the Haiti earthquake as the start of crisis response 2.0, as a turning point in using widely accepted consumer technology to mitigate crises. Google is looking to do much more in this space, working closely with a number of other organizations that are doing similarly great things. But this isn't enough. We also need your help. Because crises aren't going away. We've responded at Google to more crises in the first quarter of this year than in all of last year combined. Now is the time to jump in and get started. I want to give you an idea of some of the problems and


challenges that we see in this area, both for us and for developers like yourselves, and then show you how we're using openly available Google technology and web standards to help. The first question you might have, though, is: does the internet even work after a disaster? And it might be surprising, but in every single major crisis that we've seen since Haiti, the internet has stayed up. And how do we know this? In three different ways. The first is local ISPs, which continue to broadcast routes from their edge routers. We pick these up, and after a disaster, we still see them. The second is traffic to Google sites from the affected areas. And third is the stories that we hear of people on the


ground using the internet as a critical tool for communication in the immediate aftermath of a disaster. The internet was designed to be resilient, to take advantage of robust routing and send packets along the path of least resistance. We see people taking advantage of that. Haiti was a worst case scenario: an impoverished island with poor infrastructure and low internet penetration. The earthquake actually severed the only fiber cable coming into the country, leaving just a microwave radio link to the Dominican Republic. And with the scope of the disaster, we couldn't hope for much. But fortunately, internet and telecom companies were better prepared, and they survived pretty much intact. This is a graph of traffic to Google from Haiti on January 12, 2010.


You can see that when the earthquake hit, there was a large power outage and traffic to Google dropped. But you'll notice, importantly, it didn't drop to zero. The internet stayed up. We heard numerous stories of people on the ground using smartphones, getting on Facebook and Twitter, telling friends and family that they were OK, and providing some of the first on-the-ground news reports of the level of destruction there. Because of the situation in Haiti, it took four months for traffic to Google to return to pre-earthquake levels. But the internet was always up. Just a couple of weeks after the disaster, we heard of people in the Pétionville golf course camp, which Prem showed you pictures of before, watching YouTube videos. The internet was there.


This is a similar graph from Chile on February 27, 2010. This graph is a little different. Each peak and valley represents a single day: traffic rises during the day and then drops at night. The red line represents an 8.8 magnitude earthquake that struck there in the middle of the night. There was a resulting tsunami that hit a major network hub in Santiago. We actually did stop seeing routes being broadcast from the ISPs, but only for 15 minutes. After that, the internet was back up, and you see that within a week, traffic had returned to pre-earthquake levels. Here's the situation from New Zealand, after the Christchurch earthquake just this past February. You can see that when the earthquake hit, there was barely an effect on internet traffic. We had conducted numerous interviews with people on the ground in Christchurch.


And they said that after the earthquake, phones were saturated. They would not work. SMS wouldn't be delivered at all, or would be delivered hours late. But the internet, the internet was fine. People said email was the only way to communicate after the disaster in Christchurch. Vint Cerf, one of the founders of the internet, says, "The internet was designed with a significant degree of resilience built in. And I am pleased to know that it has often served as an important platform for communication during emergencies." Now Vint also wanted me to point out that this resilience shouldn't be confused with the ability to resist determined attacks, right? But I think the point here is clear: the immense value that the internet brings in communication and data sharing


also applies to disaster situations. So given that, the next question to ask is, what problems are there? We look at that and say, who are our users and what are their needs? We see three main groups of users. First are those directly affected by the crisis. Then their friends, families, neighbors: those indirectly affected. And finally, the crisis responders and aid workers who arrive to help. We're looking at the needs of each of these groups, and particularly where they overlap. For those directly affected by the crisis, the most important thing is the status of their friends and family. Getting access to information in a crisis is challenging, specifically trustworthy information. They also want to know: how do they tell other people that they're OK? Where can they go for help? And what's the status of resources like medical facilities, power, roads, public transportation, shelters? And finally, those directly affected by the crisis are often the first line of crisis responders. They want to be able to help, but I think technology can do a better job of telling them where they should be to help.


For those indirectly affected by the crisis, again, the most important thing is: what is the status of my friends and family who are affected? They also want more detailed information from the ground, reports from victims. What is happening? These people also want to be able to help. They often have unique knowledge of the language, the culture, the geography of the area. You saw that OpenStreetMap example, tapping into the Haitian diaspora to improve the base maps in Haiti. And we think there are a lot of other ways to leverage this highly motivated group of individuals using technology. And the needs of crisis responders are actually where developer tools might help the most. Most crisis responders don't have technology backgrounds. But at the same time, they have to be able to collect and analyze a lot of data to figure out how best to distribute aid.


They need better tools to visualize survey data, determine what other responders in the area are doing and how they can share data with them, figure out where their own staff and supplies are, and make sense of the torrents of crowdsourced and social media data coming in. I think there are better tools that we can design to help with all of these. You might already be thinking: what could someone with your background do? How could you build some tools to help these people and help meet these needs? I want to walk you through some of the standards and some of the tools that our team is working on. You'll notice three common themes in the standards that I'm talking about. First, simple.


Crises obviously are stressful situations. People aren't thinking straight. As much as possible, the tools that you work on and that get used in a crisis have to be simple. And even better if they're familiar, if they use things that people already use in their everyday lives. Second is standard. As much as possible, we want to use and promote common standards to enable different organizations to interoperate. Finally, open: open data licenses, open APIs, open source code. Open systems really win in this space. Person Finder is a great example of these three themes. It also meets one of the key needs that we saw from multiple user groups: finding the status of friends and family. Now Person Finder is an App Engine application, but more


important is the data standard that underlies it, the Person Finder Interchange Format, or PFIF. PFIF was created after Hurricane Katrina by Ka-Ping Yee, one of Google.org's first engineers. And again, as Prem mentioned, the major design principle behind PFIF was convergence, bringing data together from multiple different sources and systems. For something like this to work, data has to be traceable back to its original source, because data might have unknown reliability or accountability. PFIF also assumes that there's no central authority, so if you're an aggregator of this data, you can decide which data sources to trust. But in a system like this, duplicates are inevitable. You're going to have multiple missing person records for the same person. So the format has to be able to handle that and resolve duplicates.


The data model for Person Finder is very simple. There are just two types of records. First is a person record. This is entered by someone who's looking for a missing person, or you can actually enter a person record for yourself if you want other people to know that you're OK. Then there's a note record. This is entered by anyone else who has a status update about that person. There can be multiple note records for every person record, and note records are also used to resolve duplicates between different person records. Person Finder has a data API. It's built on XML, REST, and Atom: simple, powerful standards I'm sure many of you are familiar with, and which are used in many Google tools. The search API allows you to enter all or part of a


person's name and get back the person and note records for that person. There's a read API where you can enter the identifier for a person, or get an Atom feed of all the person records or all the note records. And then there's a write API: you send person records and note records as XML in HTTPS POSTs, along with an authorization token.
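As a sketch of what a write-API client might look like: the elements below follow the PFIF 1.2 namespace, but the host name, API path, and key parameter are placeholders rather than the documented endpoint, so treat them as assumptions and check the instance you're targeting.

```python
import urllib.request
from xml.sax.saxutils import escape

PFIF_NS = "http://zesty.ca/pfif/1.2"

def build_pfif_person(record_id, first_name, last_name, source_name):
    """Build a minimal PFIF 1.2 person record as an XML string."""
    # source_name keeps the record traceable to the site it came from,
    # which is central to PFIF's convergence design.
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<pfif:pfif xmlns:pfif="{PFIF_NS}">\n'
        "  <pfif:person>\n"
        f"    <pfif:person_record_id>{escape(record_id)}</pfif:person_record_id>\n"
        f"    <pfif:source_name>{escape(source_name)}</pfif:source_name>\n"
        f"    <pfif:first_name>{escape(first_name)}</pfif:first_name>\n"
        f"    <pfif:last_name>{escape(last_name)}</pfif:last_name>\n"
        "  </pfif:person>\n"
        "</pfif:pfif>"
    )

def write_request(xml_body, auth_token, host="example.appspot.com"):
    """Prepare the HTTPS POST carrying the record (URL shape is assumed)."""
    url = f"https://{host}/api/write?key={auth_token}"
    return urllib.request.Request(
        url,
        data=xml_body.encode("utf-8"),
        headers={"Content-Type": "application/xml"},
        method="POST",
    )

record = build_pfif_person("example.org/person.1", "Jane", "Doe", "example.org")
req = write_request(record, "YOUR-AUTH-TOKEN")
# urllib.request.urlopen(req) would submit it to a live instance.
```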


Any of you can try out this API right now by going to our test instance at googlepersonfinder.appspot.com. The user interface to Person Finder is fast and simple. If you're searching for someone, just say "I'm searching for someone" and enter their name. You'll get all the records that match that person's name. Find the correct record. On the left-hand side you'll see the person record. On the right-hand side you'll see all the note records for that person. You can also enter a new note if you have a status update for that person. Person Finder is an open source project hosted at code.google.com under googlepersonfinder. If you'd like to help, you can go to the site, check out the code, play with the API, or maybe pick up an issue from our issues list. There's also a mailing list available if you have any questions.


The next set of user needs that we're looking at is the status of resources. In Haiti, we saw a number of different organizations trying to collect this information. The US Army was surveying the status of roads. The World Health Organization was trying to figure out which hospitals were open. And the Red Cross was tracking which shelters were being built and where. This information was useful to crisis responders, and we found that they were sharing it by emailing each other spreadsheets. With this method, crisis responders spent a lot of time merging rows together, copying and pasting between spreadsheets, doing a lot of things that a database could do. Not only is this inefficient, it doesn't keep up with the rapid pace of change in a crisis situation. So what if we could automate this?


Both the merging of data in spreadsheets and also the exchange of data between organizations. We could make it a lot faster and simpler for people to understand the current status of resources in a disaster. We envisioned a project called Resource Finder. The goal: to enable structured updates of information about resources in near real-time from multiple sources. So what's the simplest possible way to do this? We start with multiple sources, each with information about one specific type of resource. If anyone wants to add something, remove something, or change something, they would publish that change. The change could then be broadcast to anyone who's interested in receiving it. Now for this to work, everyone needs to have the same identifier for a resource; call that a record. And users will also want to know which source made the change, right? So you also have to tag each change with an author. Finally, these changes may come out of order, so they also need a timestamp. [unintelligible] For each change, you just know: what changed, who changed it, and when did it change? Imagine a stream of these changes. If you have that, you should be able to just merge them together to create the current status of all the resources.
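That merge can be sketched very simply, assuming each change has already been reduced to a (record, field, value, author, timestamp) tuple; the hospital records and authors below are invented for illustration. Merging is just last-write-wins per record and field:

```python
def merge_changes(changes):
    """changes: iterable of (record_id, field, value, author, timestamp).

    Returns {record_id: {field: value}} reflecting the newest change to
    each field, regardless of the order the changes arrived in.
    """
    latest = {}  # (record_id, field) -> (timestamp, value)
    for record_id, field, value, author, ts in changes:
        key = (record_id, field)
        # Changes can arrive out of order, so compare timestamps.
        # ISO 8601 strings in the same form compare correctly as text.
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, value)
    status = {}
    for (record_id, field), (ts, value) in latest.items():
        status.setdefault(record_id, {})[field] = value
    return status

changes = [
    ("hospital.7", "beds_available", 40, "red_cross", "2010-02-01T09:00Z"),
    ("hospital.7", "road_access", "open", "us_army", "2010-01-20T12:00Z"),
    # Arrives last in the stream but is newest, so it wins:
    ("hospital.7", "beds_available", 12, "dr_jean", "2010-02-03T08:00Z"),
]
current = merge_changes(changes)
# current == {"hospital.7": {"beds_available": 12, "road_access": "open"}}
```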


Take a specific example of hospital facilities. Say the Army, again, knows the status of road access from a survey they did in March. The Red Cross knows which hospitals are open and how many beds are available from a survey they did last month. And let's say a specific doctor knows a road was washed out yesterday, and now there are more patients arriving. Using the author information and the timestamp information, you can merge all the columns together and merge all the rows together to get the current status of all the resources. And imagine multiple applications publishing and subscribing to this information, using an open protocol like PubSubHubbub, which was announced at Google I/O last year. Some of these subscribing applications could send email or SMS when a hospital is out of beds.


Or maybe it's just an online spreadsheet that your organization wants to keep up-to-date. Maybe it's an instance of Ushahidi that I want to keep up-to-date. We built a project called Resource Finder on App Engine, and we focused initially on hospital facilities in Haiti. It turns out this actually wasn't that well adopted. Organizations liked receiving this information in near real-time, but they didn't want to have to learn a new tool. So we're rethinking our work here. We're trying to extract the core ideas from Resource Finder into a protocol that we can integrate into tools that are already being used on the ground. We're calling this TableCast. The name comes from the fact that most data is in tables, and we want to be able to broadcast it. It's an Atom extension. It represents a stream of changes to a data set, and since it's Atom, it integrates very well with a pubsub protocol like PubSubHubbub. And just like before, it represents what changed in what record, who changed it, and when did it change. We're still actively developing this specification, and we're looking to build libraries and tools to integrate it into what's already being used on the ground.
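To make the shape of this concrete, here is an illustrative Atom entry carrying one row change in the spirit of TableCast (what changed, who, when). The `tc:` element names and namespace are invented for illustration, not taken from the published spec at tablecast.org, so treat them purely as a sketch.

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
TC = "http://example.org/tablecast-sketch"  # placeholder namespace

# One change: dr_jean set beds_available=12 on record hospital.7.
entry_xml = f"""<entry xmlns="{ATOM}" xmlns:tc="{TC}">
  <id>urn:example:hospital.7:2010-02-03</id>
  <updated>2010-02-03T08:00:00Z</updated>
  <author><name>dr_jean</name></author>
  <tc:record>hospital.7</tc:record>
  <tc:field name="beds_available">12</tc:field>
</entry>"""

entry = ET.fromstring(entry_xml)
author = entry.find(f"{{{ATOM}}}author/{{{ATOM}}}name").text
record = entry.find(f"{{{TC}}}record").text
```

Because the envelope is plain Atom, a feed of entries like this can be pushed through a PubSubHubbub hub unchanged.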


If you're interested in this specification, or want to participate in the working group, you can check out tablecast.org. The next group of user needs we're looking at is access to public alerts and warnings. The problem we see here is that many organizations that produce these alerts use their own format and their own distribution mechanisms. Some publish alerts to their website. Some have a Twitter feed, or an RSS feed. Some offer email subscriptions. Some don't have this information online at all. It would obviously be much better if all this information was online in a standard format. It turns out, there is such a format, called the Common Alerting Protocol. It's an XML specification that normalizes formats across many different types of alert messages.
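To make the format concrete, here is a minimal CAP 1.2 message and how the standard library can read it. The alert content itself is invented for illustration; real feeds carry many more fields.

```python
import xml.etree.ElementTree as ET

CAP = "urn:oasis:names:tc:emergency:cap:1.2"

cap_xml = f"""<alert xmlns="{CAP}">
  <identifier>example.org-2011-0001</identifier>
  <sender>alerts@example.org</sender>
  <sent>2011-03-11T06:15:00+00:00</sent>
  <status>Actual</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <language>en-US</language>
    <category>Geo</category>
    <event>Tsunami Warning</event>
    <urgency>Immediate</urgency>
    <severity>Extreme</severity>
    <certainty>Likely</certainty>
    <area><areaDesc>Pacific coastal areas</areaDesc></area>
  </info>
</alert>"""

alert = ET.fromstring(cap_xml)
event = alert.find(f"{{{CAP}}}info/{{{CAP}}}event").text
severity = alert.find(f"{{{CAP}}}info/{{{CAP}}}severity").text
```

Because every producer uses the same element names, a consumer like this works unchanged across alert sources.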


It offers flexible targeting: things like language, category, geographic area. CAP was developed in conjunction with over a hundred emergency managers, and it's been adopted as an international standard by OASIS and the ITU. Organizations like the National Weather Service and the US Geological Survey already produce feeds of alerts in the CAP format. We're working with many other organizations to standardize their alerts as CAP, and we've open sourced a Java library and some tools to help get alerts into the CAP format. We're also trying to standardize how alerts are distributed across the web. We're encouraging providers of alerts to produce Atom feeds, and we're going to aggregate them at a common instance of PubSubHubbub that we're calling Alert Hub. The final standard I want to mention is KML. I'm sure many of you are familiar with it. Our goal here is to get information related to a disaster on the web in open formats before the next disaster strikes. One tool that we found to be particularly helpful with this is Google's Fusion Tables. There's a session in this room at 1:15 about Fusion Tables


that I would encourage you all to attend. Fusion Tables has some awesome features, and there are two things that we like about it the most. The first is that it's really simple. If you have a spreadsheet with a column for latitude and a column for longitude, or just one column for addresses, you upload that spreadsheet to Fusion Tables, and you immediately have a map that you can share across the web. The next thing we like about Fusion Tables is that it scales. It scales to tens of thousands, hundreds of thousands of rows in your data. And it also scales in terms of queries per second for people accessing your data. We see this kind of traffic on our landing pages. And we've had situations where we haven't been able to link


to data on somebody else's website, even if it was really good, because we were afraid of bringing that site down. If the data is on Fusion Tables, we don't have to worry about that. Fusion Tables also works really well with the Google Maps API. This code is actually from one of the pages that we launched responding to a crisis, and you can see it's really easy. In Fusion Tables, the URL has an ID for a fusion table. And inside the Maps API, there's something called a Fusion Tables layer; it's a first-class citizen there. You just plug in the ID to create a Fusion Tables layer, set that layer on the map, and you can automatically render thousands of points, server side, just like Google Maps does.


This also works really well for polygon data, so you can display heat maps and things like that. Fusion Tables is really powerful and really well-integrated into our tools. So to sum up some of the standards that we've been talking about: the Person Finder Interchange Format for missing persons data. TableCast for enabling real-time updates to tabular data. The Common Alerting Protocol for public alerts and warnings. And KML for geographic data. We want to keep things simple. We want to promote these standards. And we want to promote openness: open APIs, open data, open source code. Open systems win. [applause]


Prem Ramaswami: Thanks, Steve. So Steve just gave us a deep dive into a lot of the tools we've been working on over the last year and a half. And what I wanted to talk about was how these were all very useful during a recent crisis response. On March 11 of this year, there was an 8.9 magnitude earthquake that struck northern Japan, off the coast of the Sendai region. Now this earthquake was so large that the earth literally moved four inches on its axis, and the island of Japan moved six feet closer to California. We immediately sprang into action and wanted to collect imagery over the affected region. In the video you're seeing playing, the top part is imagery from 2009. The imagery at the bottom is aerial imagery we collected with a fleet of planes. Now this imagery had enough resolution to


see individual cars. But more importantly, it showed us the level of damage and destruction that had been created. Entire cities were completely wiped off the map. Now seeing this level of devastation, we had a real question: would the internet be able to withstand a disaster of this magnitude? And so we turned to our traffic graphs again. Now you can see a slight dip in traffic occur at the immediate time of the earthquake. But this rebounds back to pre-earthquake levels almost the same day. And it wasn't just graphs; there were stories from the ground that told us this. This is an open letter of thanks we received from Ruth Shiraishi, a Google user in Tokyo, Japan. And Ruth told us that on March 11, our technology helped


change her life. In her words: at 2:46 PM, when the earthquake struck Japan, our phones would not connect. Our desktop email systems would not go through. Our phones' SMS would not work. But Gmail connected. Gmail connected us with each other in the company, to our loved ones around Tokyo, and to worried family and friends around the world. And it wasn't just Gmail. It was other internet sites. It was Twitter. It was Facebook. It was the power of the internet to connect people in the aftermath. Now we knew people were going online to find information, and we knew there was also a tsunami alert that had been issued for most of the Pacific Rim countries. And so Google


decided to use google.com-- which gets billions of eyeballs every day-- as an alerting platform. so we launched this line of text that was a tsunami alert in four different languages-- spanish, english, japanese, and russian-- to warn users that a tsunami was approaching their areas. now i said that we knew people come online to find this information.


i wanted to prove that with data. so these are our search queries for the specific term tsunami coming from the island of hawaii over the last year and a half. and you see two very distinct peaks in this graph. the first one occurs on february 27, after the chile earthquake and subsequent tsunami warning. and the second one occurs on march 11, after the japan earthquake. let's zoom in on march 11.


so again, these are specific queries related to tsunami coming from the island of hawaii. at around 7:56 pm, the pacific tsunami warning center issues a tsunami warning. and you start seeing a slight uptick in queries, which turns into an extremely large peak. now this dies down as people fall asleep, but you see a secondary peak occur around 3:00 am. and the reason this is important is because at 3:11


am the first tsunami wave hits the westernmost island of kauai. so people are coming online to find information and see how this affects them. now this peak dies down again at 4:08 am after the tsunami wave leaves hawaii and starts heading towards the west coast of the us. now this peak represents over 20% of queries from the island of hawaii, which is an extremely large number.


and it's not just graphs that tell the story. this is a reddit post that said i don't know if this is new, but this is a google feature i can really get behind. and there's a comment on the bottom from a user whose mother lives in hawaii. and they note that, because they opened their browser in the morning, they were aware that there was a tsunami alert issued.


now steve pointed out one of the key needs from people after a disaster is getting in touch with friends and relatives. and so we once again launched google person finder. now over the course of the last year and a half, the tool has been pre-translated into 40-plus languages. so we were able to launch it in japanese within hours. over the course of the first few weeks, person finder ended up aggregating over 600,000 missing persons records.


we partnered with major organizations like nhk, the national broadcaster in japan, as well as government bodies like the japanese national police. in the first seven days alone, we were able to push 22 different releases to person finder. and this included features such as the ability to search in local japanese character sets, the ability to be compatible with the local tier two phones in japan, as well as features like being able to subscribe


to a specific record. and this is because app engine gives you this great ability to rapidly prototype. and web applications also allow you to quickly make these changes on live tools. this is actually our app engine dashboard a few hours after the tool was launched. and what you're seeing is a peak in users coming to the site.


this actually peaks at a little below 1,500 queries per second. in terms of analytics, that's about 36 million page views in the first two days alone. now what's important here is app engine can elastically grow to support this peak. our engineers didn't have to build a tool that could scale like this. and the fact is, crisis tools require this sort


of elastic capability. the fact is the tools often lie dormant for a large amount of time until the crisis occurs. we again launched the landing page to aggregate information like person finder, donations, relevant maps, as well as real-time user generated content. but we also had alerting information on this page, such as rolling power outages that were occurring throughout japan as well as transit outages.


now, i wanted to point out that one of the sites we linked to from this page-- again, it's not just google-- was sinsai.info. and this is the local ushahidi platform. again, ushahidi was that incident reporting platform. what's interesting, though, is this instance was launched by a group of local volunteer developers. because the tool was simple to use and open source, local developers could pick it up, translate it, and launch it


for the japanese public. and we also wanted to make other mapping information available. and so we decided we wanted to put shelter information on a common map, as one example. now shelter information was strewn across numerous government and state government websites, in formats such as pdf. so we scraped this information, pushed it to a


fusion table, and then made it available in one location. now it's really important to see information like shelter information on a map so you know which shelter is geographically closest to you. there was something else interesting that was occurring, though, at these shelters. people were maintaining written lists of people who had come to the shelters, and posting them on common walls at these shelters.
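The scrape-and-map step above can be sketched with the other standard steve mentioned, kml. This is an illustrative sketch only: the shelter names and coordinates are made up, and a real pipeline would first parse them out of the government pdfs before emitting placemarks like these.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def shelters_to_kml(shelters):
    """Turn (name, lat, lon) shelter rows into a KML document string."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    for name, lat, lon in shelters:
        pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
        point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
        # KML coordinates are longitude,latitude -- note the order
        ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

# Hypothetical rows, as they might come out of a scraped PDF or CSV
rows = [
    ("Sendai City Gymnasium", 38.2682, 140.8694),
    ("Ishinomaki Junior High", 38.4340, 141.3028),
]
print(shelters_to_kml(rows))
```

A file like this can be loaded directly into google earth or any kml-aware map viewer, which is exactly why the talk keeps pushing open formats.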


now this was essentially their version of person finder, and it was again siloed per shelter. and we wanted to be able to sync this up to the online person finder tool. so a group of engineers and product managers in our office in japan decided to get volunteers to go to the shelters, take pictures of the lists, and upload them to a simple-to-use common picasa album. and then we had volunteers come and transcribe these


specific photographs and the names on them, and post them to the comments, which we then pushed back into person finder. over the course of a few weeks, we ended up processing over 9,000 photographs and updating 137,000 names in person finder. this was possible because we had thousands of volunteers from outside of google who were able to help us out. and we were able to manage that queue by building a


simple tool that handed out tasks to them over app engine. again, i said this was possible because we had this great team of product managers and engineers in tokyo who worked tirelessly for three weeks. this is them taking a pizza break a few days in. and it was really inspirational work. they did over 40 launches. but at the end of the day, any of you as developers could've taken part.
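The task hand-out queue described above can be sketched in a few lines. This is not google's actual tool; it is an illustrative in-memory version of the core idea, leasing each photo to one volunteer at a time so two people don't transcribe the same list, and re-issuing a task if a lease expires without completion.

```python
import time

class TaskQueue:
    """Hand out photo-transcription tasks to volunteers, one at a time.

    A leased task is hidden from other volunteers until its lease
    expires, so two people don't end up transcribing the same photo.
    """

    def __init__(self, photo_ids, lease_seconds=300):
        self.lease_seconds = lease_seconds
        self.tasks = {pid: {"leased_until": 0.0, "done": False}
                      for pid in photo_ids}

    def lease(self, now=None):
        """Return the next available photo id, or None if all are taken."""
        now = time.time() if now is None else now
        for pid, t in self.tasks.items():
            if not t["done"] and t["leased_until"] <= now:
                t["leased_until"] = now + self.lease_seconds
                return pid
        return None

    def complete(self, pid):
        """Mark a transcription as finished so it is never re-issued."""
        self.tasks[pid]["done"] = True

queue = TaskQueue(["photo-001", "photo-002"], lease_seconds=300)
first = queue.lease(now=0)       # volunteer A gets photo-001
second = queue.lease(now=0)      # volunteer B gets photo-002
queue.complete(first)            # A finishes; photo-001 is done
stalled = queue.lease(now=0)     # nothing available while B's lease holds
reissued = queue.lease(now=400)  # B's lease expired, photo-002 goes back out
```

On app engine the same pattern would live behind a web handler with the task state in the datastore, but the lease-and-complete logic is the whole trick.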


because most of the work that they did was made simple by standards and open source tools. but we'll get back to that point in a second. i wanted to point out what it is that google is looking to do next. so this is an image from google maps of me searching for tsunami hawaii in the aftermath of the earthquake. and you can see the first result is the pacific tsunami museum incorporated.


not very useful. so we want to be able to get this alerting information across all the properties on the web that people go to to find information. this is that same reddit post. and this is a comment at the bottom of someone saying that they didn't see the alert on their mobile phone. why didn't their mobile phone wake them up if they were asleep on the beach?


our landing page required contributions from dozens of googlers around the globe to get all this alerting information online and in a common format. how can we get this information in advance of the disaster so that we don't have to do all the work after the disaster to get it out there? and so this is where i wanted to pose a challenge to the audience out there. at the end of the day, you're all developers with a


great understanding of web tools, whether that be mobile, the maps api, fusion tables, or even app engine. and you really do have an ability here to take part and make a difference. and i wanted to give a small story of two web developers in pakistan. so these two guys were a little annoyed that india had better base maps than pakistan did on google maps, and they decided they were going to do something about it.


so they got a bunch of friends together, and they said we're going to start mapping out pakistan, and we'll start with lahore. now the time lapse photography behind me gives you an idea of how this map grows out starting in june 2008. it gets filled in as volunteers add data. this is completely crowd-sourced data collection. now these two developers ended up being the top google map maker mappers in the world.


but why this is important is last august, pakistan was inundated with floods that covered over 20% of the country. and google map maker had better base maps than the pakistani military did. we were able to share this data with [? unicap ?], the un's mapping organization, which was then able to help aid organizations deliver aid from point a to point b. these are two web developers with an idea to get friends


together that ended up saving tons of lives. and it's not just google in this space. you can actually join the crisis mappers network, which is a group of volunteers who get together and map out areas in need of assistance. we talked about how useful open street map was after haiti. there are also specific tech organizations. we talked about ushahidi, instedd, and frontlinesms. all


three of these organizations have open source code bases and very large issue trackers that could use developers like you to help them out. and to get your feet wet, you can always join a local crisis camp that's held by crisis commons in cities all around the world. now these crisis camps bring together aid organizations as well as developers like yourself to discuss problems and possible solutions. google, microsoft, the world bank, yahoo, and nasa all run


the random hacks of kindness together. and so rhok is a 72-hour global hack-a-thon, run in cities across the globe, that looks to bring subject matter experts as well as aid organizations into the same location to pitch problems and have developers rapidly prototype and create solutions. now, don't worry, in those 72 hours there will be a lot of pizza and red bull there, too. and we actually have rhok #3 running on june 4 and 5.


this is rhok #3. it's actually the fourth rhok, but as every good geek knows, you start counting from zero, so that's why it's rhok #3. i wanted to give an example, though, of where a rhok hack was super useful. so there's this hack called chasm that came out of rhok, and it's to identify areas that are prone to landslides. there was this huge equation that a world bank engineer had.


and you either needed to have a phd in math or a phd in geology to actually understand how to use it. now what a bunch of developers did was, they took that equation and they created a simple web application around it. so aid workers could actually put this information in and know where to dig trenches to avoid future landslides. it's a great example of something you can do in 72 hours.


finally, there are a lot of non-tech organizations out there, though, that could use your help. and all crises are local. i would encourage you to go talk to your local red cross, your local fire department, or your local sheriff's department and see where they could utilize your help. understand their processes. help them switch from using pen and pencil to using internet technologies to solve some of their problems.


maybe you could help get local shelter information into fusion tables, or local evacuation route data into kml, before the disaster actually strikes. maybe you can help your local sheriff's department send out alerts using a format like the common alerting protocol. i wanted to conclude by giving you one final example of helping out in a local disaster. first responder is a tool that helps fire chiefs understand the location of their staff before, during, and after a


fire response. now it was built by john riley, who's a volunteer firefighter. and john definitely doesn't consider himself a web developer, but he's dangerous enough to code. so what john did was, he built a tool that uses google voice to accept incoming calls, then uses google latitude to track the location of all the firefighters. it uses the google maps api to give them turn-by-turn


directions. it uses app engine to log when everyone shows up. and it also actually makes sure that everyone is accounted for after the fire response. it's this great example of using simple, standard, and open tools to solve a very important local problem. first responder is actually an open source tool that you can find on code.google.com. so i wanted to conclude by conveying to you my real


excitement in this space, and the possibilities that technology has here. i really believe we are at this turning point where technology-- widely accepted consumer technologies-- can really help alleviate suffering after a natural disaster. i think that all of us, whether we're geeks, or developers, or citizens just interested in doing good,


really have a chance to make a difference here. i wanted to thank you all for your time. steve and i will be here for the next 10 minutes if you have any questions. there are two mics set up on the right and left. we'd be happy to answer them. thanks again. prem ramaswami: no questions? audience: hello.


i was wondering if you could just talk a little bit about how-- you sort of alluded to things like ushahidi and other organizations. talk a little bit about verification. you know, you have this great crowd-sourced data. you mentioned the open street maps in haiti. all of this stuff, you know, i'm from a media organization, and the issue is how do you know how reliable this information is, you know, at different stages.


you have the source and the output, but translating that, just, how do you guys approach that? prem ramaswami: so this is a great question. so the question is basically, how do you make sure that your sources are correct and verifiable? and the fact is, this is an open area right now. so ushahidi, for example, tries to get more than one record so that they can verify that something is happening there. open street map and map maker use a version of moderation so


other people with local knowledge can say yeah, this is probably right. now what's interesting, though, is that the power of the crowd is really able to reduce the amount of spam or false reports that you see in a lot of these cases. another good thing is, humans tend to be good in these situations and you don't see many of these cases. but they do exist. and i think this is also a case where local aid organizations do play an important role of


providing that verification, to make sure what's being provided is truthful. audience: this is a short followup. i'm wondering if you automate the threshold for then publicizing information. in other words, you said ushahidi does multiple things. like at what stage do you say, ok, and is that programmatic? you know, how do you define, sort of in advance, what's your threshold for


saying, we think this is a valid entry, we think this is not a valid entry? prem ramaswami: so you can use a lot of signals from the internet. so there's actually this great story of this group from mit that won the red balloons competition that was posed by darpa. and this was to find red balloons that showed up randomly around the country.


and they used things like the location of where the information was coming in, to see whether, you know, the guy who says he knows where the balloon was in california was actually sending it from oklahoma and was probably lying to them or not. so you can use signals like this, and i think each organization uses different levels of signals and a different threshold on where they consider it ok to publish this.
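The signal-plus-threshold idea described here can be sketched concretely. This is a hypothetical toy policy, not any organization's actual rule: publish a crowd report only if enough distinct reporters are plausibly near the claimed location, which filters out the oklahoma-reporting-on-california case.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_publish(reports, claim_lat, claim_lon,
                   min_reports=2, max_distance_km=50.0):
    """Publish only if enough distinct reporters are plausibly nearby.

    `reports` is a list of (reporter_id, reporter_lat, reporter_lon)
    tuples; the id de-duplicates multiple reports from one person.
    """
    nearby = {
        rid for rid, lat, lon in reports
        if haversine_km(lat, lon, claim_lat, claim_lon) <= max_distance_km
    }
    return len(nearby) >= min_reports

# Two independent reporters near a claimed incident in tokyo: publish.
ok = should_publish([("a", 35.70, 139.70), ("b", 35.60, 139.80)],
                    35.68, 139.69)
# A claim in los angeles corroborated only by someone in oklahoma: hold.
suspect = should_publish([("a", 34.05, -118.24), ("b", 35.47, -97.52)],
                         34.05, -118.24)
```

Real systems weigh many more signals (account age, report history, photo metadata), but the shape is the same: score, compare to a tunable threshold, then publish or queue for human review.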


now i should point out that, in the case of publishing things like alerting data that we're interested in doing, we want to make sure that comes from an authorized source-- a government agency or someone who actually has knowledge of the situation. and that we're just a dissemination channel for that information. thanks. audience: it seems like your tools are after the fact.


what are you doing before the fact, in terms of having people preregister, or a facebook app where you can register yourself with the google database so when a disaster does hit your area, your friends know how to reach you and know that they have the right record? prem ramaswami: so there is a lot of work done by different groups on things like making sure you have an in case of emergency contact in your phones, knowing what to do, or pre-education in the aftermath--


before a crisis actually happens, like what to do when an earthquake happens, things like that. one of the things that we're doing specifically as google is we're trying to get alerting information out there. so a key example of this would be the tornadoes that hit at night in the southern part of the us recently. we want to be able to give alerts to people who are in that region who might be asleep.


maybe we can eventually wake them up to let them know that they should head to their basement. so these are things that we want to try to do: pass this information out in easier formats so people can read it regardless of what they're using. audience: do we know the name of those pakistani developers? prem ramaswami: i do, but i'm forgetting them. we can follow up later though, and i can try to get them to you.


audience: you mostly talked about natural disasters until now. can you relate to humanitarian issues, or things like political crises? the issues we've seen, and what's google's position on what you guys are doing for things like this? prem ramaswami: so that's a great question. the fact is that our team is focusing right now on natural disasters.


there have been a lot of them lately, and it's keeping us pretty busy. the fact is, we have left the door open, though, to work with humanitarian crises also. we have done some work in the past, for example, with map maker, to map out the southern sudanese region. there is a program with george clooney called paparazzi in the sky. the idea was to have a satellite over the area to prevent


future violence. so we're looking at areas like that. we think right now most of our tools, though, do focus on natural disasters. audience: i was wondering if you guys have reached out to the wildland firefighting organizations. in my experience, they constantly deal with these kinds of disasters and have some pretty good tools to support them.


prem ramaswami: so actually this is a very important area, too. and so i assume we're talking about the wild forest fires and the groups that fight those. so we do have some tools that they use, using google earth, for example, to be able to model out wind patterns and how the fires are moving, so that they can use this without needing an internet connection as they're literally jumping out of helicopters and things like


that to fight these fires. and so we have some specialized tools for these different groups. the tools that we're building, though, as a team, tend to focus more on general consumers who are interested in finding information, and making sure that information is available to them. audience: just kind of going back to, you had talked about the responders as a group that didn't have great uptake


during haiti. and i know, as responders, they have a bunch of tools that would probably give really good requirements. so, something to maybe look at. prem ramaswami: definitely. we definitely will. audience: have you been playing with the-- the timescale of the model that you're working on? like in a crisis, everything's up in the air.


and everybody needs to fix it in a hurry. and when the crisis is over, do you get to throw away all the data? or you don't have to worry about curating it quite as much. but if you spread out the timeline, any curation becomes a bit more important. and the needs become a bit more diverse. so like, chaos mapping, instead


of just crisis mapping. and also have you gotten a lot of call to participate in war zones? prem ramaswami: so, going back to the war zones issue. so we have been asked to help out numerous times. right now we're not experts in those areas. and we don't really understand them as well. so as i said, we're focusing on natural disasters currently.


with your previous question about wider mapping, i think a lot of the tools that we are building are applicable in cases that are general crises or in broader contexts. we might be focusing just on this, but we try to keep the vision a little broader, so mapping is mapping at the end of the day. audience: so i just was sitting here, and it was real interesting that there were so many people in this session. and i wondered, and maybe i'm making some assumptions, that


we all kind of care about people and we care about helping people in crisis. is there kind of a general group where-- because i was thinking it would be awesome to be able to exchange business cards with all of you, or make some sort of contact or connection where we'd collaborate together generally. not about a specific thing like the people finder or the alerts, but in general? [inaudible]


prem ramaswami: so let me repeat that on the mic for everyone. crisis commons is the group to join. so, crisis commons at googlegroups.com. it's a great group. i mentioned them earlier. they're the ones who host the crisis camps in cities. but it's a great mailing list to join to see, for example, state organizations might email that group and say


i have all this data on an excel spreadsheet. can you help me put it on a map? steve hakusa: crisis mappers is also another group that we mentioned. if you have specific mapping expertise, or gis expertise, they're also another great group to join. audience: these are some really awesome tools you guys put together. i'm a developer for a news organization.


when an event like an earthquake takes place, is there a particular place we can link to so i can tell the editorial team where to go, just directly to person finder? or do you guys actually have to go create an event to say this earthquake has happened, for example? prem ramaswami: so let me talk about the specific events. so usually, if the event is big enough, if you go to google.com, you should find the information there.


and so you should be able to search for it on our properties, or you'll see a little link on google.com if it's a large event. the other place you can go is google.org/crisisresponse, which is our site, and we have recent responses that are listed there, so you can kind of get an idea of some of the stuff we do. steve hakusa: we'll also be tweeting anytime we do a big response.


we'll definitely get it out there [unintelligible] like what we're doing. prem ramaswami: and this is really interesting, because the last time, when we launched person finder, we tweeted about it. we tweeted about it a few minutes after it launched. and we just saw the qps spike. and i mean, i guess i should know the power of twitter because i work at google.


but i was again shocked to see just how many people picked up on it. before we even had a landing page aggregated and ready to push, people were using the tool. audience: so with maps, there's a lot of ip involved with it. how do you guys deal with that during the crisis? so, for example, the pakistani people have made the great base map that wasn't pushed out into the community


afterwards. it kind of just lives in google now. and for example, probably at osm, you guys didn't get that data, because it was encumbered. so i deal with it every day. how do you guys deal with nonprofit organizations? very carefully is probably the solution. prem ramaswami: yes, very carefully is a good answer. but i want to give you a more honest answer to that, which is


that we're always looking to get better. and so this is definitely a place where i think we can get better. now, in addition to that, i wanted to add, though, that this data is available over the maps api, for example. and that when there is a disaster in a region, we do provide this data to organizations like [? unicap ?] so they can make it available to all aid organizations.


and so we'll continue to work on this, and we'll continue to see if we can help in this area. audience: just to follow up, you guys probably know this as well. but you guys have been pushing the aggregation of data as one of your goals. and when it comes to mapping data, it's kind of frustrating because you can't aggregate that data, because someone might have--


osm might have a road that someone got lazy and didn't put into map maker, and vice versa. there's no answer, correct? prem ramaswami: we're very aware of this. this is somewhere where we want to be able to help out. audience: maybe the engineer can answer this, but you've got something that your table data might be able to use as non-- steve hakusa: table caps?


yeah, that's actually probably a really good example. you can try to get these different organizations to be sharing these updates between these groups. yeah, that's definitely. audience: middle-aged men like myself will have a heart attack and get very interested in their health because the crisis is acute. you go to the hospital. you go on the cholesterol-reducing drugs.


but what about the chronic crises, like ocean acidification, low-oxygen zones, over-fishing, things that have the potential to kill billions of people? there are certain scientific organizations-- i'm with one of them-- we sit on petabytes of data that could be used in front of crises in a more offensive position, rather than defensive, waiting for something to happen. is your team at all interested in those types of problems? prem ramaswami: so there's actually a great google.org


project called earth engine. and what earth engine is trying to do is take these petabytes of data regarding the globe and push them together, and get google's computing power to be able to run algorithms over them. and one of the examples is one of these chronic crises, which is deforestation. so using this globe of satellite imagery from the last 30 or 40 years, you can actually see deforestation


occur in different areas. so there are different google.org teams that focus on these different areas. i wanted to thank everyone for their time today. we're unfortunately out of time. steve and i will be around, kind of off stage, for a bit if you have any followup questions. you can feel free to find us. thanks again for coming out today.


steve hakusa: thanks a lot.






