Wednesday, July 17, 2019

Answer coming next week!


While travel in French Polynesia is great... 

... the wifi is rather slow and unpredictable.  As a side-effect, I haven't been able to get a good connection for a long enough period of time to actually do the online research needed to get answers to last week's Challenge.  

A drone's eye view of the harbor at Huahine. Note the fringing reef and coral sands all around. 
These waters have many parrotfish! 


SO...  I'm going back to diving in these beautiful waters and hiking in the mountains.  I should be back in the land of fast connectivity by Saturday, so I'll post my answers in one week.  

Until then,

Search on! 


Thursday, July 11, 2019

SearchResearch Challenge (7/7/2019): A couple of questions about Polynesia! (Why so long? What are those clear patches?)


I managed to find wifi! 

As I mentioned last week, I'm touring through French Polynesia for the next two weeks.  It's kind of a long way to go, but it's completely worth it.  Lots of long stretches as we sail from one island to the next.  Many of these are coral atolls, and look a bit like this as we sail by.  They're all low-slung, just barely out of the water.  You wonder how they survive when a big storm comes through.  

Rangiroa seen from the sea.


Or like this, from a satellite image:  



As I've said before, traveling is an endless source of SRS questions. Here, in this place, there are SO many things I've had to look up--my SRS skills are getting a great workout!  What kind of tree is that?  Does the nut from that tree really have fish-stupefying properties?  Really? 

Many of the things I've been seeing need a bit of research to help me understand what I'm seeing.  

This week I've got two Challenges, but on the next cycle, I'll add two more.  But for today, let's start with one slightly difficult Challenge, and a simpler one.  

1.  In researching the dates of initial colonization of Polynesian islands, I noticed a VERY strange incongruity.  Look at the map below.  The blue pins are all island nations that were first colonized around 1000AD.  The red pins (to the left of the long green line) were all colonized around 1000BCE or before.  What happened here between 1000BCE and 1000AD?  Why are all of the blue pins MUCH later than the red-pinned locations?  It's not that far from Samoa to Niue, so why didn't anyone colonize that island until 900AD or so?  Generally--why didn't the Polynesians go beyond the green line for a very long time?  





2.  As we're sailing from place to place, it's not uncommon to see large patches of water without any ripples on the surface.  It's something you see nearly everywhere--it's a common effect on lakes, ponds, and oceans.  But what causes these ripple-free regions on the water?   (See below for an image that has a large Y-shaped blank area in the middle. What causes this?)  



As always, be sure to tell us not JUST the answer, but how you figured it out!  What searches worked for you, and if you spend a lot of time on a rathole that doesn't work out, be sure to leave us a comment to that effect.  We can learn a lot from strategies that don't work out.  

Search on!  


Saturday, July 6, 2019

Answer (Part 2): What DO we know about history, math, and geography?

Last time we talked about history...  
Now, let's talk about math and geography and how much people know about each.   


In this excerpt from Raphael's fresco The School of Athens, Pythagoras is shown writing in a book as a young man presents him with a tablet.  It shows a diagrammatic representation of a lyre above a drawing of the sacred tetractys.  Do you know Pythagoras and his contributions to mathematics? Do you know why the lyre is significant here? 

1.  Can you find a high-quality (that is, very credible) study of how well the citizens of the United States (or your country, if you're from somewhere else) understand (A) history, (B) geography, (C) mathematics?  

Let's try repeating what we did last time for history.  

     [ American knowledge of math ] 


And we see a similar result: 


Here you go: Lots of results telling us that Americans are terrible at math.  Once again I'll open up the top 10 results in parallel browsing and take a look. 

Even the New York Times ran a 2014 article with the headline "Why Do Americans Stink at Math?"  It's a compellingly dismal story about Americans' ability to do math and why the education system isn't working, but it refers to the results of studies without giving any citation for the data.  

We must dig deeper, looking at the articles AND who publishes each, AND where they get their data from.  

1. US News and World Report's "Why Students Are Bad at Math" points us to the 2017 National Assessment of Educational Progress.  We've seen this data source before in our previous post.  It's also called NAEP, and its report, "The Nation's Report Card," summarizes the results of testing across a wide spectrum of US schools for grades 4, 8, and 12.  (I'm always encouraged by a data source when you can download the data for yourself.  Open data is a sign of a reputable organization, one that's willing to let you look at the raw data source.)  Here's their 2017 Math data set in PDF form.  Here's the top line of that report.  (If you're interested, it's worth looking through the data for all of the metadata about their testing methods, and all of the data exceptions--which all data sets have, but which give me confidence that they took good care collecting this data.)  

Click to see this figure at full-size.  

Summary of this data?  There's been a huge drop in math test scores between 1991 and 2017 almost across the board for grades 4 and 8.  

2.  The Quartz.com article "Americans are spectacularly bad at answering even the most basic math questions" is another dismal headline.  This article points to the PISA studies done by the OECD (Organisation for Economic Co-operation and Development).  As they say on their website, "PISA is the OECD's Programme for International Student Assessment. Every three years it tests 15-year-old students from all over the world in reading, mathematics and science. The tests are designed to gauge how well the students master key subjects in order to be prepared for real-life situations in the adult world."  

This is an interesting comparison source that I hadn't thought about:  How can we measure one country's math understanding?  By comparing test scores with other countries!  

What does this test show?  

"Shanghai-China has the highest scores in mathematics, with a mean score of 613 points – 119 points, or the equivalent of nearly three years of schooling, above the OECD average. Singapore, Hong Kong-China, Chinese Taipei, Korea, Macao-China, Japan, Liechtenstein, Switzerland and the Netherlands, in descending order of their scores, round out the top ten performers in mathematics..."  

Uh oh, this means the US isn't even in the top 10.  Where are we?  You can look at their test data overview here.  And this is the key chart... 

Click to see full size.  


As the overview reports: 

Among the 34 OECD countries, the United States performed below average in mathematics in 2012 and is ranked 27th (this is the best estimate, although the rank could be between 23 and 29 due to sampling and measurement error). Performance in reading and science are both close to the OECD average. The United States ranks 17 in reading, (range of ranks: 14 to 20) and 20 in science (range of ranks: 17 to 25). There has been no significant change in these performances over time.
Meanwhile, mathematics scores for the top-performer, Shanghai-China, indicate a performance that is the equivalent of over two years of formal schooling ahead of those observed in Massachusetts, itself a strong-performing U.S. state. 
Just over one in four U.S. students do not reach the PISA baseline Level 2 of mathematics proficiency – a higher-than-OECD average proportion and one that hasn’t changed since 2003. At the opposite end of the proficiency scale, the U.S. has a below-average share of top performers.... 


3.  The Pew Research Center's report, "U.S. students' academic achievement still lags that of their peers in many other countries," also points to the OECD / PISA study AND several others, giving a nicely integrated overview of the data.  They put a slightly more optimistic spin on the data.  They tell us that American students' math skills have increased over the past few decades according to NAEP scores from 1990 to 2015, although there seems to be a small tailing off in 2015... 

Chart from the Pew study. Credit: Pew Research Center.  

They also looked at the PISA data (from above) and show the results slightly differently: 

The US position in world math test scores.  Data from PISA, chart by Pew Research.  



I could go on here, but you get the point.  Of the top 10 results on the SERP, all 10 had bad news about the state of math education in the US.  Many of the results are from reputable sources, they expose their testing methods, and they share their data sets.  The evidence is pretty overwhelming--the US is not doing a great job of teaching mathematics to its students, and there's much to do here.  



Our other SearchResearch Challenge was about geographic knowledge.  

How is the US doing there? 

Let's use our same approach as before: 


And we do the same analysis as before (who wrote the article?  What's their bias?  Why did they write this article?).  

The first article is from National Geographic, a well-known (and highly reputable) source of geographic information.  They cite a survey done for them by the Council on Foreign Relations about "What College-Aged Students Know About the World: A Survey on Global Literacy."  The upshot? 

The average score was 55% correct. Just 29% of respondents earned a minimal pass—66% correct or better. And just over 1 percent—17 out of the 1,203 surveyed—earned an A, 91% or higher. 
Respondents exhibited limited knowledge of issues critical to the United States. Only 28 percent of respondents knew that the United States is bound by treaty to protect Japan if it is attacked. 

This doesn't really surprise me.  I live in a United States that is profoundly inward-looking.  Just out of curiosity I asked [ how many US citizens have a passport ] and found that about 37% of the population has one, compared to Canada’s 60% and the United Kingdom’s 75%. This means that nearly 2 out of 3 Americans can’t even fly to Canada, let alone travel to anywhere else in the world (according to a report from the geography department at UC Santa Barbara).  

But it's distressing.  While doing the research for this article I ran across a 2017 New York Times story, If Americans Can Find North Korea on a Map, They’re More Likely to Prefer Diplomacy, which includes this sobering image.  With North Korea in the news on a daily basis, wouldn't you expect a more accurate hit rate?

Data collected by the New York Times. From "If Americans Can Find North Korea on a Map..."

Out of 1,746 US adults who were asked to click on the location of North Korea (on an unlabeled map), only 36% got it right.  The light blue dots are all of the incorrect locations.  This is crazy.  

This has a real-world consequence.  As the author, Kevin Quealy writes: 

"An experiment led by Kyle Dropp of Morning Consult from April 27-29, conducted at the request of The New York Times, shows that respondents who could correctly identify North Korea tended to view diplomatic and nonmilitary strategies more favorably than those who could not.."

The only factor (gender, age, education, etc.) that seemed to make much of a difference in locating North Korea on a map was "Do you know someone of Korean ancestry?"  

Once again, we have much to do to help our students (and ourselves) understand the world at large.  We live in an international web of countries and businesses--it's useful to at least know where they are!  


Search Lessons 


There's an obvious point here about the remarkable lack of knowledge in mathematics and geography, but that's not the goal of SearchResearch (although I personally feel this is a terrible state of affairs).  

The SRS Lessons are: 

1.  To find reliable data, look for the data sets themselves.  Reliable sources tend to link to their open data.  If an author isn't showing you the data, be skeptical.  

2.  Our query pattern [ American knowledge of X ] seems to work pretty well.  I'd be curious to hear from SRS readers if this works well in other countries.  What did YOU find worked? 

3. Parallel browsing (opening tabs from the SERP within the same window), and then going deep on a topic in a new window, is a remarkably efficient way to do quick, broad-brush research.  




Note:  I'm about to set out on two weeks of travel in a place that might (or might not) have an internet connection.  I'll try to post next week, but if I don't post, don't worry--I'm just having too much fun diving in some exotic corner of the world!  



Search on! 

Thursday, July 4, 2019

Answer: How much DO we know about history / math / geography?


HA!  


You thought I'd gone away.  But no, it was just another busy couple of weeks.  Got the chance to give an invited talk at the American Library Association (in DC) all about the book, and then I was a discussant for a wonderful paper about interruptions at the Human Computer Interaction Consortium conference at Pajaro Dunes (near Monterey, CA).  Those were both a lot of work, but also inspiring and extraordinarily interesting.  

But it put me behind schedule.  So here I am, back with you again, to see what SRS we can do on the question of how much people actually DO know about history, math, and geography.  

The key question was this: how would you assess "our" level of knowledge in these three areas?  What does the "public" really know? 


Figure 1.  How many Americans can describe the Declaration of Independence and what role it played in the US Revolutionary war?  Does it matter if you know what year this document was signed?  (Painting by John Trumbull, 1817-1819) 


Our Challenge: What DO we know, and how do we know what we know? 


1.  Can you find a high-quality (that is, very credible) study of how well the citizens of the United States (or your country, if you're from somewhere else) understand (A) history, (B) geography, (C) mathematics?  

As always, I'm looking for new ways to answer questions like this.  (That is, questions that are really difficult to search for.)  It's quick and easy to ask this type of question, but what do you actually DO?  

I realize that this is going to take a bit of explaining--so I'm going to break up my answer into 2 separate posts.  This is Part 1: "How much do we understand about history?"  I'll do part 2 later this week.  

As I started thinking about this, it became obvious that there are a couple of key questions that we need to list out.  In the book I call these "Research Questions," and that's what they are.  I recommend that searchers actually write these down--partly to help organize their notes, but also to make it VERY clear what they're searching for!  In essence, these questions help to frame your research.  

A. "How much do we...?" Who is "we"?  For my purposes, I'm going to limit "we" to currently living people in the US.  We'll touch on global understanding later, but for this round, just US.  (Of course, if you live in another country, you should do your place!)  I'm hoping we can find data to measure this across the range of ages, although it might be simpler to find just student data to begin with.  

B. ".. know about history?"  How are we going to measure this?  Ideally, we'd give some kind of history test to everyone in the US--but that's not going to happen.  An important question for us is what will count as a proxy measurement of historical knowledge?  (That is, what's a good way to measure historical knowledge?  What organization is giving the survey/test/exam?)  

Another underspecified part of this question is "...about history?"  Are we trying to measure World History, or just US History knowledge?  

C.  "How well..."   What does it mean to measure "how well the citizens .. understand..."?  All tests implicitly have a standard, an expectation that they're measuring against.  In this case, how should we measure "how well"?  We'll have to figure this out when we learn how "citizen history understanding" is gauged.  



I started with the obvious query: 

     [ US knowledge of history ] 

I wasn't sure if this would work, but it gave some pretty interesting results, including a big hint that "American" is probably a useful search term: 


Figure 2. 


For this kind of topic (that is, one where I'm not sure where to begin), I opened a bunch of tabs in parallel.  (On a Mac, you CMD+click on the link; on Windows, it's Ctrl+left-click.)

 This is called parallel browsing [1] [2]  and is a great way to look at a topic across its range without getting stuck in one particular interpretation.  When parallel searching, your goal is to see the spectrum of opinions on a topic.  In particular, you'll want to pay attention to how many different sources you're seeing, and what sources you're reading.  

Note how I've opened all of the top 7 search results in parallel:


Figure 3


Now, I can look at a bunch of these results and compare them.  But, as always, you want to scan the result AND check the organization (and author) behind it.  For instance, in the above SERP there are results from TheHill.com, NationalReview.com, NAS.org, Historians.org, TheAtlantic.com, VOANews.com, and SmithsonianMag.com.

Let's do a quick rundown of these sources.  The best way I know to do this is to (1) go to the organization's home page and do a quick overview scan; (2) search for the name of the organization, looking for articles about the org from other sources (and points of view); (3) search for the name of the org along with the keyword "bias."  Here's an example of what my screen looks like when I'm in mid-review; in this case, I'm checking out the American Historical Association (that is, Historians.org)...

Figure 4.  Click on this window to see it full size--that's the only way you can read the text! 

In the bottom window you can see the AHA article about "Chapter 2: Why Should Americans Study History"  (that's link #4 in Figure 3).  In the right window you can see my query:  [ "American Historical Association" bias ] -- this is a quick way to see if anyone else has written about possible biases in that org. In this case, the AHA org seems pretty kosher.  There are articles about AHA that discuss their attempts to fight bias in various forms, but nobody seems to have written about their bias.  (If you try this bias context term trick on the other orgs in this SERP, you'll find very different results.) 
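If you want to run this kind of source check over a whole list of organizations, a few lines of Python can generate the queries for you.  This is just an illustrative sketch of the bias context-term trick, not a tool from this post; the function name and the example org are my own, and step 1 (scanning the org's home page) is still a manual read.

     # A minimal sketch (not part of this post's toolkit): build the search URLs
     # for steps 2 and 3 of the source check. Step 1 is simply visiting the
     # organization's home page, so it isn't generated here.
     from urllib.parse import quote_plus

     def vetting_queries(org_name):
         queries = [
             f'"{org_name}"',        # step 2: what do other sources write about this org?
             f'"{org_name}" bias',   # step 3: the "bias" context-term trick
         ]
         return [f"https://www.google.com/search?q={quote_plus(q)}" for q in queries]

     for url in vetting_queries("American Historical Association"):
         print(url)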

An important SRS point to make:  I open tabs in parallel as I'm exploring the main SERP, but I open a new window when I'm going to go in depth on a topic (and then open parallel tabs in there, rather than in the first window).

In the lower left window you'll see the Wikipedia article about AHA.  You can see that it's been around for quite a while (chartered in 1884) as an association to promote historical studies, teaching, and preservation.  The Wiki version of AHA is that it's a scholarly org with an emphasis on collaboration as a way of doing history.  That's important, as it suggests that it's a reasonably open organization.

Now... back to our task of checking on the stance of each of these sources.

I'll leave it to you to do all of the work, but here's my summary of these sources:


TheHill.com - a political news newspaper/magazine that claims "nonpartisan reporting on the inner workings of Congress and the nexus of politics and business."  AllSides.com (a bias-ranking org) finds it a bit conservative. 
NationalReview.com - shows up consistently as very conservative.  (AllSides.com and Wikipedia agree.) 
NAS.org (National Association of Scholars) - pretty clearly opposes multiculturalism and affirmative action and "seeks to counter what it considers a 'liberal bias' in academia."
Historians.org (American Historical Association) - a multi-voice, collaborative institution of long standing that tries to represent an unbiased view of history.
TheAtlantic.com - news magazine with a slightly left-of-center bias.
VOANews.com (Voice of America) - is part of the U.S. Agency for Global Media (USAGM), the government agency that oversees all non-military, U.S. international broadcasting. Funded by the U.S. Congress.
SmithsonianMag.com - rated by Media Bias Fact Check as a "pro-science" magazine with a good reputation for accuracy.


NOW, with that background, what do we have in that first page of results?

TheHill.com reports that
"...Only one in three Americans is capable of passing the U.S. citizenship exam. That was the finding of a survey recently conducted by the Woodrow Wilson National Fellowship Foundation of a representative sample of 1,000 Americans. Respondents were asked 20 multiple choice questions on American history, all questions that are found on the publicly available practice exam for the U.S. Citizenship Test."
Okay, now we have to go still deeper and do the same background check on the Woodrow Wilson National Fellowship Foundation.  Using the method above, I found that it's a nonprofit founded in 1945 for supporting leadership development in education.  As such, they have a bit of an interest in finding that they're needed--for instance, to help teach history and civics. 

But the survey mentioned above was actually conducted by Lincoln Park Strategies, a well-known political survey company that's fairly Democratic, but also writes extensively on the reliability of surveys. (So while I might tend to be a little skeptical, a survey about historical knowledge is likely to be accurate.) 

The key result from this survey is that only 36% of those 1,000 citizens who were surveyed could pass the citizenship test.  (See a sample US citizenship test and see if you could pass!)  Among their findings, only 24 percent could correctly identify something that Benjamin Franklin was famous for, with 37 percent believing he invented the lightbulb. 

Note that this survey implicitly answers Research Questions B and C (from above):  How do we measure historical knowledge?  Answer: By using the Citizenship Test.  And, How well do people do on the test?  Answer: A "good" grade would be passing, that is, the passing grade for a new citizen. 


What about the other sources? 


The National Review article reports on a 2016 American Council of Trustees and Alumni report that historical knowledge is terrible ("... less than a quarter of twelfth-grade students passed a basic examination [of history] at a 'proficient' level.").  


Now we have to ask again, who/what is the "American Council of Trustees and Alumni"?  The short answer:  part of a group of very conservative "think tanks" and non-profits that are closely linked to far-right groups (e.g., the Koch Brothers).  


So, while that information could well be true, we realize that there's an agenda at work here.  (I did look at their survey method as reported in the report above, and it seems reasonable.) 

Meanwhile, the National Association of Scholars points to the US Education Department’s National Assessment of Educational Progress quadrennial survey, The Nation’s Report Card: U.S. History 2010.  Looking at the original report shows that the NAS article accurately reflects the underlying data.  While average scores on the test have improved over the past several years, the absolute scores are terrible.  As they write: "...20 per cent of fourth grade students, seventeen per cent of eighth graders, and twelve per cent of high school seniors performed well enough to be rated “proficient.”   It looks even worse when you invert those positive figures: eighty per cent of fourth graders, eighty-three per cent of eighth graders and eighty-eight per cent of high school seniors flunked the minimum proficiency rating." 

Wow.  That's pretty astounding. 

Continuing onward:

The Historians.org article ("Chapter 2: Why Should Americans Know Their Own History") is an argument for teaching history, but has no data in it.  However, Chapter 1 of the same text at the same site talks about the data, but the crucial figure is MISSING.  (And I couldn't find it.)  So this doesn't count for much of anything.  

In that same vein, The Atlantic's article "Americans vs. Basic Historical Knowledge" is really a reprint from another (now defunct) journal, "The Wire." This article decries the state of American students with a bunch of terrifying examples, but it points to yet another set of data that's missing-in-action.  

The VOA article, "Poll: Americans’ Knowledge of Government, History in ‘Crisis'" is yet ANOTHER reference to the American Council of Trustees and Alumni survey of 2016 (the same data source as the National Review article).  This article is basically a set of pull quotes from that report.  

What about the pro-science magazine, Smithsonian?  Their article, "How Much U.S. History Do Americans Actually Know? Less Than You Think" says that the 2014 National Assessment of Educational Progress (NAEP) report found that only 18 percent of 8th graders were proficient or above in U.S. History and only 23 percent in Civics.  (See the full report here, or the highlights here.)  

Figure 5. NAEP history test scores for 8th graders, 1994 - 2014.
Figure 5 shows an excerpt from the full report, and when I saw it I thought it looked awfully familiar. 

Remember the National Association of Scholars article from a few paragraphs ago?  Yeah, that one.  It turns out that this article and that article both point to the same underlying data, the National Assessment of Educational Progress (NAEP)!  This article points to the updated 2014 report (while the earlier article's data is from 2010).  That's not a really new data set, just an update of what we saw earlier--and the change over those four years isn't statistically significant.  It doesn't count as a separate reference!  

Sigh.  

So what we have here, in the final analysis of the 7 web pages, is: 

     a. the NAEP data set (from 2010 and 2014)
     b. the American Council of Trustees data set  (2016)
     c. the Woodrow Wilson survey (which has a summary, but not much real data)  

Everything else is either missing or a repeat.  

I went through the next couple of SERP pages, and while I found lots of articles, almost all of them basically repeat the data from this handful of studies.  

As it turns out, these three (and the few other studies I found that were about specific historical questions, rather than history broadly speaking)  all agree:  We're not doing well.  In particular, on normed tests, or the Citizenship test, Americans don't seem to know much about their history.  

Of course, this alarm has been raised every few years since at least 1917, when Carleton Bell and David McCollum tested 668 Texas high school students and found that only one third of these teens knew that 1776 was the date of the Declaration of Independence.  [3]  Like that.  

It's a sobering thought to consider this on the July 4th holiday.  (Which is, coincidentally, our agreed-upon celebration date of the signing--even though it took several days to actually sign the document, as the signatories were scattered across several states!)   Like Bell and McCollum, I worry... but perhaps this is an inevitable worry.  To me, it suggests that teaching and education need to remain permanent features of our intellectual landscape.  

As should search.  


Search on!   






Search Lessons 


There's much to touch on here... 

1.  You have to go deep to look for redundancy.  Just because we found 10 separate articles does NOT mean that there were 10 different studies that all agree.  In this set of articles, there are really 3 common pieces of data.  

2.  Use parallel browsing to open up tabs that branch off the same SERP, and then use different windows to go deep on a particular topic.  That's what I do, and it certainly makes window management simpler!  

3.  Beware of lots of 404 errors (page not found).  If a publication can't keep references to its own pages up to date, you have to be skeptical of its work overall.  It's inevitable to get some link-rot errors, but they shouldn't be common, as they were on some sites I visited here.  (Hint:  If you want to write scholarly text that lasts, make sure you keep a copy of the data your article depends upon.) 
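A related habit: before you publish (and periodically afterwards), spot-check the links in your own article for rot.  Here's a minimal sketch, assuming the third-party requests library is installed; the URLs in the usage example are just placeholders for whatever your article actually cites.

     # Minimal link-rot checker (an illustrative sketch, not a tool mentioned in this post).
     # Requires the third-party "requests" library:  pip install requests
     import requests

     def check_links(urls):
         """Print the HTTP status for each URL so dead links (404s and friends) stand out."""
         for url in urls:
             try:
                 # Some servers reject HEAD requests, so fall back to GET on an error status.
                 resp = requests.head(url, allow_redirects=True, timeout=10)
                 if resp.status_code >= 400:
                     resp = requests.get(url, allow_redirects=True, timeout=10)
                 print(resp.status_code, url)
             except requests.RequestException as err:
                 print("ERROR", url, err)

     # Hypothetical usage with placeholder URLs:
     check_links([
         "https://www.nationsreportcard.gov/",
         "https://www.oecd.org/pisa/",
     ])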





[1] "Parallel Browsing Behavior" Huang and White.  Proceedings of the 21st ACM conference on Hypertext and hypermedia. ACM, 2010.

[2]  "Online multitasking and user engagement." Lehmann, Janette, et al. Proceedings of the 22nd ACM international conference on Information and Knowledge Management. ACM, 2013. 

[3] Bell, J. Carleton, and David F. McCollum. "A study of the attainments of pupils in United States history." Journal of Educational Psychology 8.5 (1917): 257.



Friday, June 28, 2019

A little hiccup before the answer to... How much DO we know about history / math / geography?


It's been a busy week... 

... for me, and I just couldn't quite get the time to write up the answer to this week's Challenge.  

On the other hand, it's been a very fun week.  I was at a small conference (the Human Computer Interaction Consortium) near Watsonville, CA.  (You could look it up!) But I had to suffer with a conference venue that looked largely like this:  



I know, it's a tough life... but someone has to do the research!  And, in truth, these small conferences are the best possible venue for professional communication. 


But between attending the conference (not on the beach, but in ordinary-looking conference rooms), holding down my "regular" job, and even more traveling for family reasons, there just wasn't the time.  

I've been thinking about SRS and exploring how to best answer the Challenge... and early next week I'll do a couple of posts on this large and interesting question of "how do we research the level of knowledge of different people" on a given topic!  This is not one of the easier SRS Challenges we've had.  

Keep searching... and I'll be back early next week with my answers and reflections on what makes this kind of question so tricky.  

Search on! 


Monday, June 24, 2019

A free chapter of my new book, "The Joy of Search"


My new book "The Joy of Search" now has a free chapter available!  

(Thanks, MIT Press.) 

If you'd like to try before you buy, you can read a sample chapter and get a great sense for what the rest of the book is like.  

Click here for the free sample chapter of "The Joy of Search."  

Enjoy!  

(And if you've got comments, please leave them below.)  




Wednesday, June 19, 2019

SearchResearch Challenge (6/19/19): How much DO we know about history / math / geography?


It's common to point out that people don't know much about much... 

And it always makes me wonder:  How much DO people know about history, about math, or about geography?  

More importantly, how would you assess "our" level of knowledge?  What does the "public" really know? 

This came up for me the other day when I was chatting with someone who (we discovered) didn't really know where Syria was. Is it near Iraq?  How close is it to Turkey? 

That struck me as odd because Syria has been in the news for the past several years as it struggles with an ongoing civil war. Surely they must have seen a map of the country and its position in the Middle East!  And yet, the location somehow didn't stick in their brain.  

How many Americans can describe the Declaration of Independence and what role it played in the US Revolutionary war?
Does it matter if you know what year this document was signed?  (Painting by John Trumbull, 1817-1819) 


Last year, in 2018, I heard a brilliant talk by Roddy Roediger about our collective memory for historical events.  Who won World War II?  If you have an interest in education (especially history), it's worth an hour of your time. 

This brings me to this week's SearchResearch Challenge.  What DO we know, and how do we know what we know? 

1.  Can you find a high-quality (that is, very credible) study of how well the citizens of the United States (or your country, if you're from somewhere else) understand (A) history, (B) geography, (C) mathematics?  

In this Challenge I'm hoping to learn some methods for finding reputable resources for assessing broad public knowledge.  Next week we'll discuss some of the SRS methods I use when I try to answer questions like this.   

And more importantly, for our SRS purposes, how does one frame a question like this to a search engine in order to find those resources?  AND... how do you assess the quality of the resources that you find?  

Obviously, asking a few friends a couple of calculus questions isn't a great way to measure the public's knowledge of math.  Doing a man-in-the-street interview with geography questions probably doesn't work well either--so what would work well?  

In other words, what can one do to make a measurement like this?  How can you tell how much the citizens of your country know?  

Obviously, this kind of Challenge can take an arbitrary amount of time.  But if I can motivate you to spend a few minutes searching for this kind of information, I think you'll get a good sense of the issues involved.  

As always, please let us know how you discovered the sources that you find credible.  

Search on! 




Tuesday, June 18, 2019

A talk in DC at the American Library Association conference (Saturday, June 22, 2019)


For folks in the DC area... 


My "Joy of Search" book tour continues with a stop at the American Library Association (ALA) this coming Saturday, June 22, 2019. 

If you happen to be attending the conference in DC, I'm talking at 10:30AM in room 145A at the DC convention center. (You need to register for the conference...)  

Come find me there! 

(The books are still in production, so I won't have any to sign.  That will be coming soon!) 






(Note that if you can't make this event, don't panic, there will be more. I'll post events as they happen, and even try to organize a Meet-Up or two in the process.  I'll even come back to DC in September.)  


Thursday, June 13, 2019

Answer: Unusual sports?


The odd and unusual are fun to search for!  


As we noticed last week, there are lots of fascinating sporting events (if you use that term loosely) in the world, many of which deserve a bit of background research.  

I love these kinds of SearchResearch Challenges because I always learn a few fascinating things along the way.  These slightly whacky events are perfect for a quick SRS lesson.

Let's dive into this Challenge... 



1. As Europe grows increasingly warmer with the passing years, will the long distance ice-skating race that passes through 11 Dutch towns still be able to be held?  What's that event called?  When was it last held (and who won)?  

Like many of you, I did the obvious search: 

     [ ice skating race 11 Dutch towns ] 

and found a number of sources that told me this is the Elfstedentocht, a nearly 200 km ice skating tour of 11 towns in the Netherlands.  The race runs over canals, rivers, and lakes that are frozen over.  The problem, of course, is that the canals don't freeze sufficiently every year.   (I found the official site in Holland by doing [ elfstedentocht site:.nl ], limiting the search to just websites in the Netherlands.)

It was last held on January 4, 1997, [1] and the prospects for future events are fairly grim. An article in the Washington Post comments that "... the Netherlands is no longer a romantic wintry wonderland, and there hasn’t been an Elfstedentocht since 1997, marking the longest drought ever between races. Climate change has endangered the race and is slowly dousing hopes across the province.  ...A lot of people really think that there will never be another one.” [2]    "In the past century, the average annual temperature in the Netherlands has increased by about 3 1/2 degrees, according to Peter Kuipers Munneke, a researcher and polar meteorologist at Utrecht University. He says in recent decades winters have warmed more than the other seasons, thanks in part to westerly winds coming over the North Sea."  [2]  

Yeah.  Here's a chart from that article that makes the point clearly. 



Sigh.  Don't hold your breath waiting for the next Elfstedentocht. 



2.  Several of us in the discussion were former collegiate volleyball players, but since it was a very international evening, other folks started to tell us about different versions of volleyball that are played with feet alone.  Is this for real?  How could you bump/set/spike a ball with just your feet?  If so, what is this sport, and where is it played?  (Participants insist there are at least 2 different versions of this sport.) 

To answer this, I did:

     [ volleyball with feet ] 

And saw this as the results...


Clicking on a few of these results shows me that Sepak takraw is a version of volleyball that's played with the feet and a rattan ball.  It's certainly impressive if you watch a video or two.  (Example video of Sepak takraw.)  Those guys are wild!  They serve, bump, set, and SPIKE the ball with incredibly athletic leaps (and incredibly graceful recoveries).  

I was thinking, though, that I had heard of a South American version of this sport.  Why didn't it show up here?  All I can see are the results about the Malaysian version of the sport.  

So my next query was intended to find results that are NOT about Malaysian sports: 

     [ volleyball with feet -Malaysia ] 

That is, I want to see this query without all of the Malaysian results, so I used the MINUS operator to exclude all results with the term "Malaysia." I wasn't terribly surprised when I found many results from Brazil for their sport of footvolley.  




It is also a beautiful sport, often played on the sand at famous Brazilian beaches (like Ipanema), and it looks to be crazy hard.  Imagine trying to jump high enough to kick the ball over the net... while starting on the sand!  (Another video worth a watch: footvolley played on sand.)

3.  Although the next summer Olympics are still a year away, we started talking about former Olympic events that aren't held any more.  Tug of war at the summer Olympics (1900-1920) is a famous example of a now discontinued sport.  While there seems to be an endless number of swimming events, was there ever a swimming event that was held underwater?  If so, what was it?  When was it last held?  Is there an Olympic champion?  
A query like: 

     [ Olympic underwater swimming ] 

quickly leads you to several sites that tell us that underwater swimming WAS a thing at the 1900 Olympics.  [Olympic official site, SportsReference]  This somewhat odd event appeared in the Olympic Games only in 1900. Two Frenchmen, Charles Devendeville and André Six, won first and second place. However, the French publication Journal des Sports noted that the third-place finisher, Peder Lykkeberg, was the best overall: Lykkeberg swam in a circle, covering much more than 60 meters, but the official distance was measured only in a straight line from the starting point.  The scoring gave two points for each meter swum and one point for each second spent underwater.  (He swam for 90 seconds!  Who knows what happened there?!?)  
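(To make that scoring rule concrete with made-up numbers, not the actual 1900 results: a swimmer credited with 50 meters of straight-line distance who stayed under for 60 seconds would score 2 × 50 + 1 × 60 = 160 points.)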

Oddly, this doesn't sound like much of a spectator event.  All you'd see is a blurry image of someone swimming underwater for 1.5 minutes or less.  I can see why they didn't repeat the event.  (It's about as exciting as the plunge for distance, held in the 1904 summer Olympics, which is much the same event except that you can only glide to the end, not kick or paddle, so there's even less to watch.  Talk about dull!)  

Search Lessons



This week's Challenge wasn't that hard, although as always, there are sometimes nuances that require a bit more search skill than usual.  In this case, just choosing good search terms is (mostly) enough.  But in the case of foot-volleyball... 

1.  Expanding your search results by removing terms that dominate the results can sometimes lead to surprising discoveries.  In this case, the first query [ volleyball with feet ] gave us good results, but because Malaysia was SUCH a big part of those results, I thought about trying the query WITHOUT Malaysia.  That's how I found the Brazilian version of the game.  (And yes, I did another query that was [ volleyball with feet -Malaysia -Brazil ], but it was clear that I'd fallen off the "good results" list at that point.)  



Search on! 



Book update 


Sorry about being a day late with this week's SRS answer, but I have a good excuse... 

I spent yesterday in the lovely town of Victoria, British Columbia, giving my first book talk about The Joy of Search.  I was the keynote speaker at a small conference in the Computer Science department where I was able to hand out some of my postcards with the book information.  It's odd to have a book talk sans book, but the marketing collateral helps!  More talks to come, including one at the American Library Association conference next week.  If you're at the ALA meeting, come by and say hi!  (My talk is at 10:30AM on Saturday, June 22, 2019.)  



