Wednesday, March 21, 2018

Creating Fun & Quick VFX-SFX w/ FFMPEG: Alien Interference Video

Motivation: Video is the "new" paper in our digital world.  Sometime in 2007, more than 50% of the Earth's population owned a cell phone -- and in 2018 some 36% own a "smart camera phone" able to play video -- a share that is still growing.

This author has had many clients & associates who needed or wanted video to help promote their product or service.  And most recently, a need to demonstrate -- "tease" -- the creation & editing of a humorous short video.

First, some nerd & creative details on making a "feasibility" & "tease" video -- and second, some key management lessons learned from making a "simple" video.  ("Hey, what's the big deal -- my 5 year old did that on his iPad?"  Yes, he did.  Being 5 years old is a key factor: at that early age, the mind is free of too many life assumptions.)

Project Challenge: Quick & simple generation of video & audio special effects?  A humorous Alien takeover of a video signal?  And less ominous and spooky than the 1963 Outer Limits intro roll -- "We are controlling transmission." (see the 60 sec intro video)

More specific video creation challenge: Create & overlay TV Snow with "only" a Linux-Win32/64 laptop? How does a video creator-editor add TV Noise (snow) to a short & fun promo video clip -- without resorting to some complex Rube Goldberg mix of an A/D converter & digital recording from some noisy analog source?  In other words, how to avoid cabling together a herd of odd & old video boxes to obtain "junk" video for VFX-SFX (special effects)?

Quick Answer:  Use "simple" command line tools like FFMPEG & FFPLAY.  Reverse 40+ years of digital video evolution -- and synthetically generate old TV snow -- to "fuzz up" a perfectly good video clip -- for fun and profit.

Quick Look Result?  Here is an 8 sec simple test & project budgeting feasibility video clip -- one that proved out the FFMPEG VFX-SFX approach illustrated below.

8 sec Video: Alien noise signal takeover.

Goals & Director Details?  Starting with "the script":
  • Cold open into TV "snow" (video & audio noise), just like the "old days" when the analog TV powered up on a "blank" channel.
  • Cross fade from TV snow to prefix Intro slide (changing the channel or TV station coming "in range")
  • Cross fade via TV snow from intro slide to cartoon Alien (the "episode" segment -- the Alien "take over").
  • Cross fade to trailer Outro slide then fade to black.
Video Editor -- Quick Nerd Details: Using a refurbished "junk" 7+ year old laptop -- running Linux-Knoppix 8.1 -- the following BASH commands made some digital video magic.  Please note that the video resolutions are very small to (a) "fail fast, fail forward" (aid experimentation), (b) keep video processing within "reach" of the old laptop -- and (c) pick up some useful visual effects during upscale to 16:9 aspect for "modern" video. Details and steps:

(1) Seed the TV snow transition/cross fade video clip: Generate a very small video & audio from random numerical sources at QQVGA resolution (160x120 pixels, 4:3 aspect):

# -- Generate 5 secs QQVGA res video. Signal? random number source
# -- Note that filter_complex comes in two parts:
# (a) Video gen: "geq=random(1)*255:128:128"
# (b) Audio gen: "aevalsrc=-1.5+random(0)"

ffmpeg -f lavfi \
       -i nullsrc=s=160x120 \
       -filter_complex "geq=random(1)*255:128:128;aevalsrc=-1.5+random(0)" \
       -t 5 out43.mkv

Playing created video "out43.mkv" -- a screen shot preview:
Synthetic Random TV snow
(video noise & audio)
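A quick sanity check at this point -- a sketch using ffplay (FFMPEG's companion player) to preview the clip; the command is echoed first and only run where ffplay and the clip actually exist:

```shell
# Sketch: preview the synthetic snow clip with ffplay (ships with FFMPEG).
# "-autoexit" closes the window when the 5 sec clip ends.
PREVIEW="ffplay -autoexit out43.mkv"
echo "$PREVIEW"
# Run only where ffplay & the clip are actually present:
command -v ffplay >/dev/null && [ -f out43.mkv ] && $PREVIEW || true
```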
(2) Up-convert & expand the 160x120 video to the "nearest" 16:9 video aspect -- 854x480 pixels (aka FWVGA). This step will expand & "smear out" the small pixels into a more analog-realistic set of "blurry" noise pixels. Audio is just copied across. Command line details:

# --- Now resize to FWVGA (smallest 16:9 aspect "std") ---
# --- Note use of "-aspect" to "force" MPlayer & others to use the
# --- DAR (Display Aspect Ratio) and not rescale to 4:3 or some other aspect

ffmpeg -i out43.mkv -vf scale=854:480 -aspect 854:480 out169.mkv
#
# Note and caution: By default FFMPEG version 3.3.3-4 generates
# pix_fmt "yuv444p" & the output often will NOT play on iPhones,
# some versions of iMovie, QuickTime, or the Kodi media player
# (Kodi just blinks & ignores it). Correction? Two options:
#  (1) add the command line switch "-pix_fmt yuv420p"
#  (2) or make certain the video editing & video rendering
#      engine outputs the yuv420p pixel format (OpenShot!).
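Putting that caution into practice -- a sketch of the same resize with the pixel format forced to yuv420p (file names as in the example above):

```shell
# Sketch: resize + force yuv420p so iPhones/QuickTime/Kodi will play it.
CMD="ffmpeg -i out43.mkv -vf scale=854:480 -aspect 854:480 -pix_fmt yuv420p out169.mkv"
echo "$CMD"
# Run only where ffmpeg & the source clip are actually present:
command -v ffmpeg >/dev/null && [ -f out43.mkv ] && $CMD || true
```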

Output & Screen shot result?
Noise pixels expanded to 854x480 for 
video overlay & fade to "Alien Takeover"
(3) Generate Intro & Outro slides using ImageMagick and a BASH script -- this is VERY FAST and saves the video rendering engine ("melt" in OpenShot) much computational work by stacking many static images into 4-10 secs of video:

#!/bin/bash 
# ==========================================================

#  Revised:  14-Jan-2017 -- Generate Slide from Text String

# See:
#   http://www.imagemagick.org/Usage/text/
#   http://www.tldp.org/LDP/abs/html/loops1.html
#   http://superuser.com/questions/833232/create-video-with-5-images-with-fadein-out-effect-in-ffmpeg
#
#  Hint for embedded newlines (double slash):
#    ./mk_slide_command_arg.sh How Long\\nText Be?
#
# ==========================================================

if [ -n "$1" ]; then
  # echo "First Arg: " $1
  /usr/bin/convert -background black \
    -fill  yellow     \
    -size 854x480     \
    -font /usr/share/fonts/truetype/dejavu/DejaVuSans-BoldOblique.ttf \
    -pointsize 46     \
    -gravity center   \
    label:"$*"        \
    slide_w_text.png
else
  echo "Need at least one predicate object token"
  echo "Try: " $0 " A_Slide_String_w_No_Quotes "
  #    ./mk_slide_command_arg.sh How Long\\nText Be?
fi


Slides generated appear thus:
Static Slide / Pix for Intro

Static Slide / Pix for Outro

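For reference -- a hedged FFMPEG sketch of "stacking" one static slide into a few seconds of video, as an alternative to letting the editor's render engine do it (input/output file names are assumptions carried over from the script above):

```shell
# Sketch: loop a single PNG slide into a 5 sec, 30 fps video clip.
# "-loop 1" repeats the still image; "-t 5" caps the output duration.
CMD="ffmpeg -loop 1 -i slide_w_text.png -t 5 -r 30 -pix_fmt yuv420p slide_intro.mp4"
echo "$CMD"
# Run only where ffmpeg & the slide are actually present:
command -v ffmpeg >/dev/null && [ -f slide_w_text.png ] && $CMD || true
```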
(4) Generate "injected" Alien signal / slide -- the "episode" segment -- using a simple cartoon creation with a digital paint program -- and on 854x480 pixel canvas (16:9 aspect ratio):
"Injected" Aliend signal slide
(cartoon) from paint program
(5) Video-edit & merge intro, episode & outro slides -- as video segments -- using the synthetic TV snow cross fade transitions.  In this example -- OpenShot 1.4.x was used.  A screen shot from the edit session -- note the TV snow video is in the lower track -- and "fills in" when the upper tracks "fade" to black. This yields the viewer impression that some nefarious "Alien" signal is being injected ("Controlling your TV!"):
ScreenShot: OpenShot 1.4.3 Video
Editor w/ time line loaded segments
and cursor at 4.0 secs
(6) Render / trans-encode the video segments into a common video/audio format -- and container.  In this case libx264 for MP4 video and AAC for audio.
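A sketch of what that final render pass might look like from the command line (the input file name here is an assumption -- OpenShot's render engine does the equivalent internally):

```shell
# Sketch: trans-encode to H.264 (libx264) video + AAC audio in an MP4
# container, again forcing yuv420p for picky players.
CMD="ffmpeg -i edited_master.mkv -c:v libx264 -pix_fmt yuv420p -c:a aac final.mp4"
echo "$CMD"
# Run only where ffmpeg & the edited master are actually present:
command -v ffmpeg >/dev/null && [ -f edited_master.mkv ] && $CMD || true
```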

(7) Results?  Scroll back to the 8 sec YouTube clip above.  We started with the end result.  Yes, kinda "simple" and childish -- but please recall the goals:  (a) time & budget estimation for video creation & editing, (b) avoiding wiring together a bunch of old hardware & video signal generators to make a small VFX-SFX effect, and most important (c) a simple "laugh test" to check if the promo-Ad approach was a useful "viewer outreach" path.

Lessons Learned #1:  Surprise!  iPhone, iMovie, QuickTime, the Kodi media player -- and others -- can bomb (fail) on certain color pixel formats. Apparently iPhones are strangely spooky.  Say it ain't so, Steve Jobs. Corporate software policy or institutionalized bug? See iOS and QuickTime details.  Also see the significant answer by DrFrogSplat from Jan 2015.

Intro slide from 1963 TV
show "Outer Limits"
Lesson Learned #2 from the creative side -- Perspectives and Deeper History: Sometimes many years of experience & exposure can confound & confuse the creative process -- and fog "simple" workflows.  Ghosts of a past technical life muddle & block the mind -- and reaching back to a 5 year old's child-like mindset is often required. "How would a 5 year old make this 'tease' video?"
Master Control" TV 
Switcher Room
Back in fall 1980, this author worked as a "weekend program switcher" in "Master Control" for PBS station KNEW TV-3.  Lots of expensive equipment -- $50K and $100K video processing "boxes" everywhere. The job was a tough Saturday & Sunday gig -- 5 AM to 6 PM -- unlock, power up the studio & switching equipment, carefully energize the 100 kilowatt transmitter, set up multiple VTRs for satellite "dubs" of time-shifted TV programs (i.e. video downloads), feed promos (Ads) & TV programs to the transmitter -- all according to a rigid program schedule -- and then rinse, wash & repeat every 30-60 mins.
1980s broadcast quality video tape:
a 1969 RCA 2-inch quadruplex VTR
And much on- and off-air exposure to Sesame Street, The Electric Company, Masterpiece Theatre, etc.  To add turbulence, broadcast students would drop in to edit project videos.

Legacy Thinking? The fall 1980 weekend gig was great experience -- but it filled my head with lots of legacy hardware / software / video production perspectives -- and 36+ years of technology has marched on -- all that old hardware & video production gear is long gone.

Or not?  Color Mathematical Models to save bandwidth?

Lesson Learned #3: Turns out the video signal theory & processing practice of analog TV & video are not 100% "gone" -- but live on in creation & editing of "modern" digital video -- even for "simple" jobs -- or for "eye catcher" web publishing-promotion -- or for distribution via YouTube / Vimeo / etc. 
TV Picture: Numerically "perfect"
colorbars for video signal "color burst"
phasing & "hitting" color calibration
points in the Ostwald "color solid"
(see below)
The spirit of Analog color -- and human perception -- "lives on" in digital video:  While the technical details are gory and deep -- suffice it to say -- playing with FFMPEG was a pleasant surprise -- I had forgotten about details like YUV color space and 444 vs 420 pixel encoding.  And it was a shocking surprise that iPhones, iMovie, Apple QuickTime, the Kodi media player and others can -- and do -- "blow up" with the yuv444p default pixel format of FFMPEG.

From Wikipedia, the HCL color model
(double cone) -- the foundation
for the YUV color model of old
analog video & "buried" in the pixel
formats of digital video.
This "old" but very sophisticated YUV "blending" & "tinting" of color -- over black & white "backbone signal" -- a legacy of the 1950s TV tech transition from Black & White to Color -- is not "dead" -- but alive as an almost corporeal "ghost" in "modern" digital video editing & processing.  Why alive? Human perception of color & movement in cluster clouds of analog or digital pixels has not changed.  The hardware has change much -- but how humans "see" has not.  And exploiting how humans "see" still needs to be exploited to reduce bandwidth and video storage costs -- this demand has not change in 30+ years.

How does YUV magic happen?  Pictures are worth 1000s of words -- some from the 1980s -- and some from 1916 color theory -- that hint at the hardware & software techniques used -- techniques that help video renderings "reach into the mind" of the viewer -- "exploit" color & movement perceptions -- while also "compressing" the signal / data storage costs.

Vectorscope plot of "perfect"
colorbars test signal & scope
beam "hitting" the calibration 
marks for Red, YL, Green, CY, Blue 
and unseen black to white running
up & down the circle center.


From Wikipedia, the HSL and HSV "color solid";
the Vectorscope "lives" in the equatorial belt
of Ostwald's 1916 color concept & the
vectorscope "hits" specific theoretical
colors in this solid to "stabilize" video
colors.

SUMMARY:  When you are in college -- or attending formal technical school -- a lot of theory is emphasized -- and as a student -- you wonder "when are we gonna get to some practical 'hands on' manipulations?"  The value of abstract theory is often not appreciated until many years later -- when the theory allows "seasoned students" to "see the unseen" -- when "modern" software stumbles  ("Huh? iPhone cannot render YUV444P format pixels? What does that mean?") -- even if masked by 35+ years of technological change.  Viva la abstractions! 

Tuesday, March 20, 2018

Video: Crescent Moon Chasing Golden Sunset, Venus Leading the Way

Doing digital time lapse photography-videography is a crap shoot -- and sometimes captures serendipitous & fun surprise results.  A time lapse of the sunset also "caught" the Moon setting -- and a semi-rare conjunction of Venus / Mercury. First the 90 sec time lapse video results -- then some details:

Time lapse: Crescent Moon chasing Golden
Sunset w/ Venus leading the way (90 sec)

Fun & serendipitous "extra" capture? Even w/ this old tablet camera & hardware, the time lapse accidentally "caught" a semi-rare conjunction of the Sun, Moon, Venus & Mercury.  Sky chart detail:

Western Sky Sunset & Schematic
of Moon, Venus, Mercury & Sun

More 2018 sunset details?  See this 3 min animation for 2018 Venus set vs Moon set:

Western Sky Sunset & Animation of 
Venus sky path thru 2018

Why this effort?  Video is the new paper in a digital world.  This author is on a steep "catch up" learning curve -- after 20+ years of not really "doing" photography -- now exploring "dry" digital tools & techniques. When photography was "wet" -- film, chemicals, developers, enlargers, hang dry, etc. -- the workflow friction became too high -- and photography faded from my skills repertoire.  Now even inexpensive "smart phones" have surprisingly "good" cameras -- and beat the workflow friction of 35mm film hands down.

Nerd Details: This was a sunset time lapse test of a refurbished 2011 Samsung 7" Tab2 (GT-P3113) Android tablet using the "Open Camera" app, version 1.42.2 by Mark Harman. Frame rate: one frame every 2.0 secs. Video assembled from 2286 JPEG images into a 30 fps video using an FFMPEG Linux shell command:

 cat *.jpg | ffmpeg -r 30 -f image2pipe -i - out_video.mp4 

How does this command work?  How are single still images "stacked" into a video?  Without wading into the deep details: starting in a folder populated by 2286 JPEG images -- captured & stored in time sequence order -- the "cat *.jpg" feeds one image file after another -- by "globbing" *.jpg files -- into a Unix-Linux "pipe" (the vertical line character "|") -- and images emerging from this pipe are "read" by FFMPEG -- via the UNIX-Linux "stdin" token dash ("-") -- one after the other -- and assembled frame-by-frame into the video encoded file "out_video.mp4".  There are herds of defaults exploited by this "simple" command.  See details at: https://www.ffmpeg.org/faq.html
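A hedged alternative sketch that skips the shell pipe entirely and lets FFMPEG glob the frames itself -- "-framerate 30" sets the input frame rate explicitly rather than relying on defaults:

```shell
# Sketch: same time lapse assembly, but FFMPEG reads the JPEGs directly.
# The quoted '*.jpg' is expanded by FFMPEG (-pattern_type glob), not the shell.
CMD="ffmpeg -framerate 30 -pattern_type glob -i '*.jpg' -pix_fmt yuv420p out_video.mp4"
echo "$CMD"
```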

SUMMARY?  In a digital world, video is the new paper.  Most folks will watch a 30-120 sec video BEFORE they will read an Email. To meet this challenge & need -- learning to "write" short stories in video is a key skill.  And a fun tool, now that the "wet" photography process has been replaced by low cost camera phones.

Friday, October 27, 2017

GPS Accuracy from SmartPhones with SBAS

How XY accurate are C/A (code only) GPS-GNSS receivers on mobile phones?  What if they have SBAS for "regional" differential corrections?

Perhaps a simple test example.  Recently, this author upgraded his mobile phone -- from a beloved circa 2012/13 Android running Gingerbread 2.3.6 -- to a 2016 ZTE Z981 running Android 6.0.x.  Onboard both phones were GPS chipsets -- and the newer ZTE was found to be running a combo GPS-GNSS chipset -- per the semi-raw data manifested by the SatStat sensor app from the open-source F-Droid repository.

In a recent lunch visit with an old client -- the GPS XY accuracy of the ZTE with SBAS arose in tradeoff discussions -- for use in casual field surveys -- versus more expensive L1 and L2 carrier wave and carrier phase tracking systems. Too much to explain here -- perhaps a diagram to tease the curious as we move on to simple results:

GPS Carrier wave & code modulation.
Inphase & Quadrature signal combos
(from wikipedia)

FIELD TEST: Rather than waste time searching for ZTE Z981 GPS-GNSS chipset hardware specs -- and believing the published accuracy claims -- it was just easier to collect a herd of static positions -- and plot on aerial photo.

So one nice afternoon -- while writing a technical proposal -- the ZTE Z981 was set up to log static GPS-GNSS observations from under moderate tree canopy -- and inside a Gazebo with a polystyrene tarp roof (i.e. almost 100% transparent to GPS signals at approx 1500 MHz).  The ZTE phone-GPS was set on a metal "garden table" that acted as an RF ground plane and blocked some ground bounce multipath.

To log observations, this author downloaded a GPS-GNSS app called uLogger from the F-Droid repository.  uLogger has the ability to log GPS-GNSS observations at 10 sec intervals -- and ignore any internal accuracy tests.  And uLogger exports to the open-source, easy to read-convert GPX exchange file format.  (download the GPX file )

RESULT -- Collected 881 GPS-GNSS observations -- "code only" with SBAS corrections.  Plotting on a local aerial photo -- with 10 and 20 meter range rings -- resulted in this plot:

881 GPS-GNSS observations at 10 sec intervals
collected from the author's yard Gazebo and plotted
over recent aerial photos. Scale rings are 10 and
20 meters from the approx "average" center.
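The observation count and that approximate "average" center can be pulled straight from the GPX log with grep & awk -- a minimal sketch, using a tiny two-point stand-in for the real 881-point export (the file name track.gpx and the coordinate values are illustrative assumptions):

```shell
# Build a miniature GPX file standing in for the real uLogger export.
cat > track.gpx <<'EOF'
<gpx><trk><trkseg>
<trkpt lat="29.80" lon="-98.73"></trkpt>
<trkpt lat="29.82" lon="-98.75"></trkpt>
</trkseg></trk></gpx>
EOF

# Count observations, then average lat/lon for a crude plot center.
COUNT=$(grep -c '<trkpt' track.gpx)
CENTER=$(awk -F'"' '/<trkpt/ {lat+=$2; lon+=$4; n++}
                    END {printf "%.2f %.2f", lat/n, lon/n}' track.gpx)
echo "$COUNT points, center: $CENTER"
```

A straight arithmetic mean of lat/lon is crude but adequate at this scale; real survey work would project to a local planar coordinate system first.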

Good?  Bad?  Ugly?  How does this compare to circa 1999-2000 GPS hardware?  Pictures are worth 1000s of words -- and digging thru the archives yields comparisons with a 1996 Garmin 12XL ($600 at the time) -- and a $12,000 L1-L2 tracking Novatel.

Garmin 12XL and approx 24 hours of GPS observations
at 4 sec log interval. Range rings are 100, 200 & 300
meters.  Large elliptical loop patterns are examples of 
pre-May-2000 deliberate GPS accuracy degradation

Circa 1996 Novatel L1 & L2 in "free float" mode
(i.e. no differential corrections). 24 hours of 4 sec
interval observations collected from the same antenna
location as the Garmin 12XL. Note the scatter plot vs 100
-200-300 meter range rings -- and how the Novatel
was able to "computationally resist" the degradations
and deterministic elliptical loops of selective availability.

SUMMARY -- Upshot?  The ZTE Z981 is approx $100 USD smartphone.  The GPS-GNSS chipsets beat -- hands down -- much more expensive GPS hardware from late 1990s.  With care and practical use understanding -- even "simple" smartphone GPS-GNSS hardware can be used to collect XY mapping data for certain types of surveys.

Sunday, September 3, 2017

KML-KMZ Bugs: For Google Every File is a URL

Recently a friend was struggling with KML-KMZ so called "ground overlays" -- draping local maps and imagery over specific map regions in desktop & mobile application Google Earth.

Specifically, aerial data processing via DroneDeploy -- and other photogrammetry apps -- and exported via KML-KMZ data "containers" -- when viewed in Google Earth desktop -- was producing the dreaded BIG RED X -- like this:
Google Earth: Big Red "X" for broken URL links.
Why does this happen? Before we dive into specifics, perhaps it is wise to recognize this key software design premise "baked" into every Google application and product: There are NO "files" on the web -- every file is really a web link.

KML-KMZ RAPID DIAGNOSIS?  Generally there are two possibilities to consider before we can mentally "see" the necessary fix:

(1) Missing local file: The image map overlay is missing from the local computer / mobile device -- or is NOT in the same folder path from which the KML-KMZ was launched (or opened), or,

(2) Busted URL (Uniform Resource Locator): The KML-KMZ file contains a "broken link" -- a link that FAILS to point to an actual internet "cloud" location.  This link can be "busted" by just a single character -- and Google Earth will be clueless -- and paste the Big Red X in the image map overlay space -- signaling that Google Earth "knows" some image map *should* be there -- but the fetch of the URL failed (404 file not found).

NERD DETAILS:  Opening the "broken" KML file in a plain ASCII or UTF-8 text editor -- GVIM (or MS Notepad) -- and examining the guts -- we note the key <href> line:
Screenshot: Plain text editor GVIM internal view of 
"busted" KML file and generating Big Red X
(click for bigger)
The "fix" is simple -- from inside a plain text editor -- force the HREF line to "point" to an appropriate image map overlay -- stored -- somewhere -- in the internet cloud.  

Somewhere? Where to store the Google Earth image map "ground overlay"?  Two places -- one the KML-KMZ user can control -- and another that is more "ownership" problematic over the long term (months to years):

(1) Private Web Server w/ Public Exposure: Store the image map overlay on your personal web server -- a web server that exposes a PUBLIC link to the world wide web.  Upside? For this author, this is the preferred method -- as "ownership" and control are well defined. Downside? Requires building or purchasing a web server (i.e. Blue Host, Hostgator, Linode or Tektonic or related).

(2) "Public" web server: Store the image map overlay "in the cloud" -- and with publicaly exposed "hard" link that "reaches" into Google Photos -- or Imgur -- or Amazon S3 -- or some other "photo sharing" website -- THAT CAN AND WILL EXPOSE a "permanent" public link with *NO* demands for "log in." (The big offender here is Facebook and others attempting to enforce viral marketing of their "walled garden").  

CRAZY and OBSCURE LINKS: "Cloud storage" w/ commercial vendors like Google -- or Imgur -- or Amazon S3 -- is tricky -- because "permanent" hard links are kinda wild and difficult to determine and extract for use in human edited-updated KML-KMZ files.  Example -- for this topo image map, we examine how it is stored in Google Blogger:
1976 USGS Topo Map "image overlay" for 
Cibolo Nature Center near Boerne TX
(click for full resolution)
The public, "permanent" link looks like this wild mess :

https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi0cNC7K7VEpK-PRj8Z_vmULcTGb8X5TNMMt-502WhLDmjvyFP2Ww041Bm0cjnrSIkj2v5VenCSMMZLkxOplckG6nflSX9nmTBsFtJlGyDKBy6xX0HK500K-I03u73DldGL788CsakpE2A/s1600/cibolo_nature_usgs_topo_1976_83dd_v0.png

(makes sense to a web browser or mobile app -- bizarre to any human eye not attached to a nerd programmer)

Why so ugly? Google Blogger is "storing" the image map overlay in some deep database index that only makes "sense" to your browser or web app -- and the "Google System."
  
MIXING THE MAGIC: How to FIX the BIG RED X -- and use Google "cloud storage" inside this KML?  Edit-update the KML-KMZ file -- so that the HREF line utilizes the big ugly Blogger image link from above (or a "simple" URL on your personal web server).  Specifically for this example:
Corrected KML file with wild, long and 
ugly "permanent" URL link into Google
Blogger "cloud" storage (click for bigger)
SUCCESS?  First a Google Earth screen shot:
1976 USGS Topo image map ground overlay
properly draped inside Google Earth and 
over Cibolo Nature Center near Boerne TX
(click for bigger)
And click (or tap) this link to view the "live" KML file inside Google Earth.

SUMMARY:  In order to MAX SHARE DroneDeploy and other "cloud processed" aerial imagery data products -- exported via KML-KMZ "containers" -- the key image map overlays must be stored "in the cloud" -- with "permanent" hard public links.  Google KML-KMZ files inherit the Google design premise that every file is really a web link or URL.

Good luck!  And happy value hunting via data explorations!

Monday, August 21, 2017

Solar Eclipse 21-Aug-2017 & Lizard Brain Surprise

Solar Eclipse 21-Aug-2017?  Only 61% Occultation?

Ho-Humm: I was slow to become excited about today's solar eclipse across the USA. After all -- what was the great mystery in a shadow crossing the earth?

But sometimes it is the little things that wake up child-like wonder and thrills -- and trigger surprise emotions & non-verbal brain signals from my Lizard Brain.

Hours before the predicted solar eclipse max -- I did spend 25 mins building a shoebox "telescope" -- a miniature Camera Obscura for indirect observation of the 61% max solar eclipse shadowing south Texas.  



Camera Obscura Woodcut from Wikipedia

And I was enjoying eclipse 2017 geographical progress via video coverage on NASA TV Live via YouTube.  

One Possible NASA TV Live Logo

SURPRISE and AWE: But my interest and child like wonder -- and my Lizard Brain "triggers" -- these really peaked when I first stepped outside to test my shoe box "telescope" -- and looking down -- realized there was something really different: Countless magical Solar Crescents: Every sunbeam thru the trees was transformed into a little wonder of nature! Perhaps a photo to illustrate.

Stepping out to test the shoebox "telescope" and seeing
every sunbeam transformed into a Solar Crescent

INTELLECT vs LIZARD BRAIN: Yes, a solar eclipse seems trivial from an intellectual viewpoint. But your lizard brain is kinda surprised & startled when you realize something is very different about every sunbeam thru the trees.  This surprise emotion suggests why solar eclipses are so emotional for some -- and were great sources of terror in the past: Every part of our being is "hard-wired" to expect & assume the Sun is round & reliable -- daily.  When the Sun is not round & reliable -- the Lizard Brain starts talking in brain signals -- under the surface of our conscious minds -- where words rarely traverse.

Lizard Brain?  Perhaps a vehicle driving example.  On long road trips with little traffic -- your mind gets bored. Soon you are observing the passing scenery or listening to an interesting radio program. 45 mins later -- you "wake up" and realize you have been driving -- with success -- no wrecks -- yet you cannot recall any specific driving details of the last 40-60 miles.  No specific conscious decisions about curves or speed or lane changes.  In essence -- the Lizard Brain "took over" the driving from your conscious mind -- a biological "autopilot."   Your conscious mind is only "called up" by the Lizard Brain if some driving problem manifests that is "too difficult" for the routine auto-pilot decision loops of the Lizard Brain.

MORE VAGUE LEARNING: Your Lizard Brain can and does learn many things -- often without specific, conscious instruction. Example? Do you recall learning how to crawl or walk or drink?  You did learn -- but now these skills are "just in there" -- "hard wired" -- deep into your Lizard Brain and body.

Solar Eclipse and Your Lizard Brain?  Your Lizard Brain is learning thousands of other daily details that almost never "boil up" to your conscious mind.  Many are "environmental triggers" for what is "normal" and "odd" -- environmental details that often keep us out of trouble & alive.  This includes an unconscious "awareness" of how the Sun & Moon and stars "should be" on a regular basis.  And an unconscious awareness of what are "normal" sun shadows -- and when this normal is "broken" -- by what might be a danger or predator above us in the sky.

UPSHOT:  I was surprised by my internal emotional reaction to the crescent sunbeams thru the trees.  My Lizard Brain communicated -- not in words -- but via unspoken emotional triggers -- that something was "spooky different."  These emotional signals were BEYOND words and independent of my intellectual understanding of the solar eclipse event.

As I reviewed my Solar Eclipse 2017 videos -- I caught myself "giddy" in my voice dialog.  As I thought back to the many total eclipse sites covered on NASA TV and other TV networks -- and the mixed reactions of folks on the ground -- their emotional reactions were surprisingly genuine and unforced.  Some just fell silent in the total eclipse darkness:


CBS News from Jackson Hole Wyoming,
On camera talent's genuine loss for words in
almost 100% total darkness.
(3 min 22 sec clip)

Some became giddy or child-like in verbal expressions.  Based upon my experience -- I would bet that deep-wired Lizard Brain reactions to the Sun going dark affect everyone -- in ways that surprise even the most careful & controlled internal conscious observer.

Below are a few simple videos from today.  Those who know me well can judge for themselves if my Lizard Brain was over stimulated.


Solar Eclipse at 61% & Shoebox 
"Camera Obscura" in Operation

Ad Hoc "Colander Telescope" and a
Solar Crescent from every hole.

Solar Eclipse 2017 at 61% Max Occultation
and countless Solar Crescents in every shadow
Enjoy!

Tuesday, August 15, 2017

Solar Eclipse 1918 Vestigial Memory vs 2017 Internet

Here is an odd three-generation syzygy -- an astronomical "connection" across three generations.  Cross-generational "memory" -- in an odd way: the Solar Eclipse of 08-June-1918 vs 21-Aug-2017.

08-June-1918 Solar Eclipse painting by Howard
Russell Butler (from Wikipedia)

Take notice -- you genealogy searching fiends -- even astronomical events can "index" generational "knowledge" -- of time, place and perspectives -- transcending living memory in odd ways.

MOON ATE THE SUN in 1918?   As an early teen -- back in the 1970s -- on a long road trip -- my maternal grandfather described how scared he was -- as a small child in Arkansas -- when the Sun began to go dark -- and the air grew cold on a full summer June day -- the wind shifted -- a few bright stars came out -- the chickens began to go to roost -- and the cows drifted to the barn for milking.

My grandfather described how he was playing in the fields -- and ran home in a panic -- only to find his mother -- my great grandmother -- calmly standing on the porch of their simple wood cabin -- with two chunks of paper board -- one with a center pin hole -- and another unmolested -- watching a small bright dot shape shift.

My grandfather asked "What's going on Momma?" -- and she replied that the Moon was "eating" the Sun -- words that my grandfather could understand at the tender age of 8 or 9.  The Moon was casting a shadow across rural Arkansas. This was very rare.  My great-grandmother had read about the upcoming 1918 Solar Eclipse event in the local newspapers -- and constructed a crude "telescope" -- as described in one of the newspapers.

My favorite 1918 solar eclipse find, published in
Denver Post (from GreatAmericanEclipse.Com )
Key map detail? In 1918 my grandfather lived just 
south of "A" in ARKANSAS label - near line of totality.

How so?  How could old paper newspapers "know" that a solar eclipse would unfold?  Short answer:  Solar eclipse events are "hand calculable" -- centuries into the future -- and into the past.  Such is the stability of the Moon's orbit about the Earth -- and of when the Moon will block the Sun.  Number crunching nerds have been "predicting" solar & lunar eclipse events since the ancient Sumerians and Aztecs.

REAL or TALL TALE Memory?  For years -- my grandfather's verbal description of his childhood solar eclipse bugged me -- especially since my grandfather was such a big story teller -- and tended to blow things out of proportion -- just for humor & mischief.

THE INTERNET and SEARCH ENGINES -- One day it dawned on me -- "Hey, I can search for any full -- or partial -- solar eclipse across Arkansas in early 20th Century!" -- in a few mins of searching -- the following links -- with glorious "back calculated" pixs and maps -- came up.  How do I know this is "the eclipse"??  My maternal grandfather was born -- as he said "In 1909 -- way out back in the woods!" -- so a 1918 eclipse would make my grandfather a child of 8 or 9 -- mentally old enough to record a solar event.

80 Sec Video, very good Photo-Realistic simulation of
21-Aug-2017 solar eclipse. Note how stars come out.

STRANGE?  There is something odd -- for me -- looking at these 1918 solar eclipse maps:  I can pin-point -- both in time and map space -- where my grandfather was in June 1918 -- almost to the minute.  Kinda like having a Mr Peabody "way-back" machine -- where we can set the dials -- and go see the past.

Fun additional fact?  The 1918 solar eclipse was used to test for the gravitational deflection of star light by the Sun -- as proposed by Albert Einstein in his theory of general relativity.

NOW in 2017: In the lower 48 states -- we have the opportunity to experience a very similar eclipse -- on 21-Aug-2017.  A chance to anchor some memories -- and share an immutable event with those that follow.   A few links follow -- with kewl graphics and charts -- to "tickle" your fancy.


Very kewl predictive map for 21-Aug-2017
Solar Eclipse (click for larger, from website
StaryNightEducation.Com )

UPSHOT?  Even poor white trash from Arkansas can "anchor" history -- with not-so-tall tales -- and a little ex-post-facto astronomical research.

Useful links and videos:

[1] NASA Count Down Timer to Live Eclipse 21-Aug-2017

[2] Wiki maps & pixs for 21-Aug-2017 Solar Eclipse