AstroEdWiki - User contributions [en], MediaWiki 1.38.2, retrieved 2024-03-19T12:41:12Z
Feed source: https://prancer.physics.louisville.edu/astrowiki/api.php?action=feedcontributions&user=WikiSysop&feedformat=atom

Main Page, revision of 2022-08-18T06:59:08Z by WikiSysop
<hr />
<div>== Physics & Astronomy Education at the University of Louisville ==<br />
<br />
<br />
This wiki provides supplemental support for Physics, Astronomy, and Astrophysics courses in the Department of Physics and Astronomy at the University of Louisville. Click one of the links below for more options, and see your Blackboard class site for instructions on its use.<br />
<br />
[https://prancer.physics.louisville.edu/astrowiki/index.php/Elementary_Astronomy_Laboratory_Activities Elementary Astronomy Laboratory Activities]<br />
<br />
<br />
[https://prancer.physics.louisville.edu/astrowiki/index.php/Special:AllPages An index of all pages.]</div>

Main Page, revision of 2020-08-01T19:11:03Z by WikiSysop
<hr />
<div>== Physics & Astronomy Education at the University of Louisville ==<br />
<br />
<br />
This wiki provides supplemental support for Physics, Astronomy, and Astrophysics courses in the Department of Physics and Astronomy at the University of Louisville. Click one of the links below for more options, and see your Blackboard class site for instructions on its use.<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Elementary_Astronomy Elementary Astronomy (107)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Elementary_Astronomy_Laboratory Elementary Astronomy Laboratory (108)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Observational_Astronomy_(308) Observational Astronomy (308)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Research_Methods Research Methods (650)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Special:AllPages An index of all pages.]</div>

Elementary Astronomy Laboratory Activities, revision of 2020-08-01T19:07:07Z by WikiSysop
<hr />
<div>These Elementary Astronomy Lab activities are used in classes on campus at the University of Louisville and are being updated to match the mentored versions offered [http://prancer.physics.louisville.edu/moodle online] as part of our Distance Education program. <br />
<br />
If you are enrolled in Physics and Astronomy 108-50, the Distance Education online class, please see your class website [http://prancer.physics.louisville.edu here]. If you are in a section usually taught on campus, check your Blackboard class page for instructions. <br />
<br />
The lab versions below have not yet been updated for Fall 2020.<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Identify_Constellations Identify Constellations ]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Immersive_Video_Wall About the Video Room]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Watch_the_Sky Watch the Sky (Planetarium session not currently offered)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Under_Namibian_Skies Under Namibian Skies (immersive visualization)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Night_Sky Night Sky Tonight Using Stellarium (immersive visualization)] <br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Remote_Telescope_Requests Use a Remote Telescope: Requests] and<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Remote_Telescope_Results Analyze Request Results]<br />
<br />
Travel to Mars, Jupiter, Saturn, and Uranus (immersive visualization)<br />
<br />
Survey galaxies in the universe (immersive visualization)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Survey_Galaxies_in_Virgo Survey Galaxies in Virgo]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/The_Earth_Rotates The Earth Rotates]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Our_Dynamic_Sun Our Dynamic Sun (may use the roof top solar telescope)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Light_and_Telescopes Light and Telescopes]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Experiment_with_CCD_Camera_Images Experiment with CCD Camera Images]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Use_a_CCD_Camera Use a CCD Camera]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Spectra Spectra]<br />
<br />
Observing planets and the Moon with a telescope (live remote or with the telescope on the roof)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Explore_Mars Explore Mars] (may use immersive visualization)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Observe_Satellites_of_Jupiter_and_Saturn Observe Satellites of Jupiter, Saturn and Uranus]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Follow_Proxima_Centauri Follow Proxima Centauri]<br />
<br />
Brightnesses and colors of stars in Messier 34<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Variable_Stars_in_Messier_3 Variable Stars in Messier 3]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Measure_a_Nearby_Supernova Measure a Nearby Supernova]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Track_Cosmic_Rays_in_a_Cloud_Chamber Track Cosmic Rays in a Cloud Chamber]<br />
<br />
== New Labs ==<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/FLIR_Camera The iPad and infrared camera]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Black_Body_and_Filters Stars as Black Bodies and Astronomical Filters]</div>

Elementary Astronomy Laboratory, revision of 2020-08-01T19:01:51Z by WikiSysop
<hr />
<div>The University of Louisville Department of Physics and Astronomy offers Elementary Astronomy laboratory sections on Belknap Campus in both the Fall and Spring semesters. For the Fall 2020 term, out of COVID-19 concern for student well-being, we are teaching all of the lab classes asynchronously online. Class management may vary by section; please check your Blackboard class for information on how your section will run.<br />
<br />
*108-01 to 108-08, and 108-75 and 108-76, in 2-hour labs that meet once each week at various times from 10 AM to 7 PM, Tuesday through Thursday<br />
<br />
*108-50 online learning section through the University of Louisville and the [http://www.kyvc.org/ Kentucky Virtual Campus]<br />
<br />
Content for the sections of 108 that meet in a classroom is available <br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Elementary_Astronomy_Laboratory_Activities here].<br />
<br />
Content for the distance education section is at our [http://prancer.physics.louisville.edu/moodle Distance Education website]<br />
<br />
All of our hands-on astronomy courses use the resources of [http://sharedskies.org Shared Skies].</div>

Observational Astronomy (308), revision of 2019-11-19T22:51:30Z by WikiSysop
<hr />
<div>These notes are brief summaries of, and links to, the in-class content for the Monday class meetings of Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
** [https://www.calsky.com/ Calsky]<br />
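The flux and magnitude items above can be made concrete with a short Python sketch of the standard Pogson relation, m1 - m2 = -2.5 log10(F1/F2); the flux ratio used below is just an illustrative number.<br />

```python
import math

def delta_magnitude(flux_ratio):
    """Pogson relation: magnitude difference m1 - m2 = -2.5 log10(F1/F2)."""
    return -2.5 * math.log10(flux_ratio)

# A star delivering 100x the flux of another is 5 magnitudes brighter
# (i.e., its magnitude is smaller by 5).
print(delta_magnitude(100.0))   # -5.0
```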
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
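The focal plane scale and angular resolution bullets above reduce to two standard formulas, sketched here in Python; the 0.6 m aperture, f/8 focal ratio, and 550 nm wavelength are illustrative assumptions, not the specifications of any particular instrument.<br />

```python
import math

def plate_scale_arcsec_per_mm(focal_length_mm):
    """Focal plane scale: 206265 arcsec per radian divided by focal length."""
    return 206265.0 / focal_length_mm

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion: theta = 1.22 * lambda / D, converted to arcsec."""
    return 1.22 * wavelength_m / aperture_m * 206265.0

# Illustrative 0.6 m telescope at f/8 (4800 mm focal length), 550 nm light.
print(plate_scale_arcsec_per_mm(4800.0))       # ~43 arcsec/mm
print(diffraction_limit_arcsec(550e-9, 0.6))   # ~0.23 arcsec
```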
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and exoplanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic - TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Andersen<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
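The Poisson-noise point above can be illustrated numerically: a source yielding N detected photons has photon noise of roughly sqrt(N), so the signal-to-noise ratio grows as the square root of the exposure time. A minimal sketch follows; the photon rate and exposure are made-up values, and the Poisson counts are drawn via the normal approximation, which is adequate for large N.<br />

```python
import math
import random

def photon_snr(photon_rate, exposure_s, trials=2000, seed=42):
    """Compare the empirical SNR of photon counts with the sqrt(N) prediction."""
    rng = random.Random(seed)
    mean = photon_rate * exposure_s
    # Normal approximation to Poisson counts (valid for large mean).
    counts = [rng.gauss(mean, math.sqrt(mean)) for _ in range(trials)]
    avg = sum(counts) / trials
    std = math.sqrt(sum((c - avg) ** 2 for c in counts) / (trials - 1))
    return avg / std, math.sqrt(mean)

empirical, predicted = photon_snr(photon_rate=100.0, exposure_s=10.0)
print(round(predicted, 1))  # sqrt(1000) ~ 31.6
```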
<br />
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities<br />
<br />
<br />
== Week 12 (November 4) ==<br />
<br />
* Latest data on GEO satellites and TESS targets<br />
** See [https://www.calsky.com/ Calsky]<br />
* Determining planetary mass from spectra<br />
** See [https://en.wikipedia.org/wiki/Doppler_spectroscopy Doppler Spectroscopy] in Wikipedia<br />
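For reference, the Doppler method gives a minimum planet mass of m_p sin(i) = K * M_star^(2/3) * (P / (2 pi G))^(1/3) in the limit m_p << M_star for a circular orbit, where K is the radial-velocity semi-amplitude and P the orbital period. A hedged Python sketch with illustrative 51 Pegasi b-like numbers (K about 55 m/s, P about 4.23 days, a solar-mass host):<br />

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
M_JUP = 1.898e27       # Jupiter mass, kg

def min_planet_mass(K_ms, period_s, m_star_kg):
    """Minimum mass m_p sin(i) from the RV semi-amplitude, assuming
    m_p << M_star and a circular orbit."""
    return K_ms * m_star_kg ** (2.0 / 3.0) * (period_s / (2.0 * math.pi * G)) ** (1.0 / 3.0)

# Illustrative 51 Peg b-like values: K ~ 55 m/s, P ~ 4.23 d, 1 solar mass.
m = min_planet_mass(55.0, 4.23 * 86400.0, M_SUN)
print(round(m / M_JUP, 2))   # ~0.44 Jupiter masses
```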
<br />
<br />
== Week 13 (November 11) ==<br />
<br />
* Transit of Mercury!!<br />
** Viewing in the planetarium garden area if it is clear<br />
** See [https://www.space.com/mercury-transit-2019-viewing-guide.html Viewing guide]<br />
<br />
<br />
== Week 14 (November 18) ==<br />
<br />
* Use of AstroImageJ to analyze data<br />
* Project data on our server at [https://www.astro.louisville.edu/shared_skies/archive/astrolab/ https://www.astro.louisville.edu/shared_skies/archive/astrolab/]<br />
<br />
If time available --<br />
<br />
* High spatial resolution imaging<br />
** Lucky imaging<br />
** Adaptive optics<br />
** Role in exoplanet discovery<br />
** Interferometry present and future<br />
<br />
<br />
== Week 15 (November 25) ==<br />
<br />
* Presentations of projects</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2599Observational Astronomy (308)2019-11-19T22:48:28Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
** [https://www.calsky.com/ Calsky]<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and expolanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic- TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Andersen<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
<br />
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities<br />
<br />
<br />
== Week 12 (November 4) ==<br />
<br />
* Latest data on GEO satellites and TESS targets<br />
** See [https://www.calsky.com/]<br />
* Determining planetary mass from spectra<br />
** See [https://en.wikipedia.org/wiki/Doppler_spectroscopy Doppler Spectoscopy] in Wikipedia<br />
<br />
<br />
== Week 13 (November 11) ==<br />
<br />
* Transit of Mercury!!<br />
** Viewing in the planetarium garden area if it is clear<br />
** See [https://www.space.com/mercury-transit-2019-viewing-guide.html Viewing guide]<br />
<br />
<br />
== Week 14 (November 18) ==<br />
<br />
* High spatial resolution imaging<br />
** Lucky imaging<br />
** Adaptive optics<br />
** Role in exoplanet discovery<br />
** Interferometry present and future</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2598Observational Astronomy (308)2019-11-04T20:52:31Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
** [https://www.calsky.com/ Calsky]<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and expolanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic- TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Andersen<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
<br />
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities<br />
<br />
<br />
== Week 12 (November 4) ==<br />
<br />
* Latest data on GEO satellites and TESS targets<br />
** See [https://www.calsky.com/]<br />
* Determining planetary mass from spectra<br />
** See [https://en.wikipedia.org/wiki/Doppler_spectroscopy Doppler Spectoscopy] in Wikipedia<br />
<br />
<br />
== Week 13 (November 11) ==<br />
<br />
* Transit of Mercury!!<br />
** Viewing in the planetarium garden area if it is clear<br />
** See [https://www.space.com/mercury-transit-2019-viewing-guide.html Viewing guide]<br />
* High spatial resolution imaging<br />
** Lucky imaging<br />
** Adaptive optics<br />
** Role in exoplanet discovery<br />
** Interferometry present and future</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2597Observational Astronomy (308)2019-11-04T20:50:57Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
** [https://www.calsky.com/ Calsky]<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night as a backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and exoplanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic - TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Anderson<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
<br />
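A minimal sketch of the Poisson-noise point above, using a simplified CCD signal-to-noise model (dark current and other systematic terms are deliberately omitted):<br />

```python
import math

def photometric_snr(source_counts, sky_counts, read_noise):
    """Simplified CCD SNR: Poisson noise from source and sky photons,
    plus detector read noise, added in quadrature."""
    noise = math.sqrt(source_counts + sky_counts + read_noise**2)
    return source_counts / noise

# Photon-limited case: with no sky or read noise, SNR = sqrt(N),
# so 10,000 detected photons give SNR = 100.
print(photometric_snr(10000, 0, 0))  # 100.0
```

The sqrt(N) behavior is why faint-target photometry improves only slowly with exposure time, and why sky and read noise matter most for the faintest sources.<br />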
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities<br />
<br />
<br />
== Week 12 (November 4) ==<br />
<br />
* Latest data on GEO satellites and TESS targets<br />
** See [https://www.calsky.com/ Calsky]<br />
* Determining planetary mass from spectra<br />
** See [https://en.wikipedia.org/wiki/Doppler_spectroscopy Doppler Spectroscopy] in Wikipedia<br />
<br />
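For a planet much less massive than its star, the mass determination behind the Doppler method reduces to a one-line formula relating the radial-velocity semi-amplitude K, the orbital period, and the stellar mass. The 51 Peg b-like numbers below are illustrative, not a measurement:<br />

```python
import math

G = 6.674e-11      # gravitational constant, SI units
M_SUN = 1.989e30   # solar mass, kg
M_JUP = 1.898e27   # Jupiter mass, kg

def planet_min_mass(K, period_s, m_star, ecc=0.0):
    """Minimum planet mass m_p * sin(i), in kg, from the radial-velocity
    semi-amplitude K (m/s), assuming m_p << M_star (the standard
    Doppler-spectroscopy approximation)."""
    return (K * (period_s * m_star**2 / (2.0 * math.pi * G))**(1.0 / 3.0)
              * math.sqrt(1.0 - ecc**2))

# Roughly 51 Peg b-like inputs: K ~ 56 m/s, P ~ 4.23 d, ~1.06 solar-mass star
m = planet_min_mass(56.0, 4.23 * 86400.0, 1.06 * M_SUN)
print(m / M_JUP)  # roughly 0.46 Jupiter masses
```

Because only the line-of-sight velocity is measured, the result is a lower limit on the true mass unless the inclination i is known, for example from a transit.<br />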
<br />
== Week 13 (November 11) ==<br />
<br />
* High spatial resolution imaging<br />
** Lucky imaging<br />
** Adaptive optics<br />
** Role in exoplanet discovery<br />
** Interferometry present and future</div>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
** [https://www.calsky.com/ Calsky]<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and expolanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic- TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Andersen<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
<br />
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities<br />
<br />
<br />
== Week 12 (November 4) ==<br />
<br />
* Latest data on GEO satellites and TESS targets<br />
* Determining planetary mass from spectra<br />
** See [https://en.wikipedia.org/wiki/Doppler_spectroscopy Doppler Spectoscopy] in Wikipedia<br />
<br />
<br />
== Week 13 (November 11) ==<br />
<br />
* High spatial resolution imaging<br />
** Lucky imaging<br />
** Adaptive optics<br />
** Role in exoplanet discovery<br />
** Interferometry present and future</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2595Observational Astronomy (308)2019-11-04T20:48:21Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
** [https://www.calsky.com/ Calsky]<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and expolanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic- TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Andersen<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
<br />
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities<br />
<br />
<br />
== Week 12 (November 4) ==<br />
<br />
* Latest data on GEO satellites and TESS targets<br />
* Determining planetary mass from spectra<br />
<br />
<br />
== Week 13 (November 11) ==<br />
<br />
* High spatial resolution imaging<br />
** Lucky imaging<br />
** Adaptive optics<br />
** Role in exoplanet discovery<br />
** Interferometry present and future</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2594Observational Astronomy (308)2019-11-04T20:46:25Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and expolanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic- TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Andersen<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
<br />
<br />
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities<br />
<br />
<br />
== Week 12 (November 4) ==<br />
<br />
* Latest data on GEO satellites and TESS targets<br />
* Determining planetary mass from spectra<br />
<br />
== Week 13 (November 11) ==<br />
<br />
* High spatial resolution imaging<br />
** Lucky imaging<br />
** Adaptive optics<br />
** Role in exoplanet discovery<br />
** Interferometry present and future</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2593Observational Astronomy (308)2019-10-21T17:40:46Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics (revised 2019-10-21)<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - TESS and expolanets<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic- TESS and exoplanets<br />
** Dylan Scharff - Supernovae<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Research by topic<br />
** TESS and exoplanets<br />
*** Travis Waters, Shawn Knabel, Dominic Smith, Michael Turner, Ben Kantardzic<br />
** Supernovae<br />
*** Dylan Scharff<br />
** Orion star formation<br />
*** Chris Andersen<br />
** Geosynchronous satellites<br />
*** Natalie Warning<br />
<br />
* This week at Moore Observatory<br />
** Tuesday night likely clear<br />
** Wednesday night long range partly cloudy<br />
** Sunday night long range favorable<br />
<br />
* Noise in signals<br />
** Gaussian random processes, photons, and Poisson noise<br />
** Assessing sources of systematic error and noise in photometry<br />
<br />
<br />
<br />
== Week 11 (October 28) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2592Observational Astronomy (308)2019-10-14T14:54:17Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics<br />
** Natalie Warning - Geosynchronous satellites<br />
** Shawn Knabel, Dominic Smith, Travis Waters - TESS exoplanets<br />
** Michael Turner - Supernovae<br />
** Christopher Anderson - Orion star formation<br />
** Benjamin Kantardzic<br />
** Dylan Scharff<br />
<br />
<br />
* Exoplanet transit photometry of a TESS candidate<br />
** Full frame images<br />
** Candidates<br />
** Validation - night of October 13, 2019<br />
<br />
<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2591Observational Astronomy (308)2019-10-14T14:46:26Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
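The flux-magnitude relationship above is logarithmic: a difference of 5 magnitudes corresponds to a factor of exactly 100 in flux. A short sketch of the Pogson relation (function names are our own, for illustration):

```python
import math

def mag_diff_from_flux_ratio(f1, f2):
    """Pogson relation: m1 - m2 = -2.5 * log10(f1 / f2)."""
    return -2.5 * math.log10(f1 / f2)

def flux_ratio_from_mag_diff(dm):
    """Inverse relation: f1 / f2 = 10**(-0.4 * (m1 - m2))."""
    return 10 ** (-0.4 * dm)

# A star 5 magnitudes fainter delivers 1/100 of the flux
print(flux_ratio_from_mag_diff(5.0))
# A source 100x fainter is 5 magnitudes larger in m
print(mag_diff_from_flux_ratio(1.0, 100.0))
```

Note the sign convention: larger magnitude means fainter, so a positive m1 - m2 means source 1 is the fainter of the two.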
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
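The focal plane scale and diffraction limit in the list above come from two simple formulas: scale = 206265 / f (arcsec per mm, with f in mm) and the Rayleigh criterion theta = 1.22 * lambda / D. A sketch with assumed example numbers (a 0.6 m f/8 telescope, chosen for illustration, not a specific campus instrument):

```python
import math

def plate_scale_arcsec_per_mm(focal_length_mm):
    """Focal plane scale: 206265 arcsec per radian divided by focal length."""
    return 206265.0 / focal_length_mm

def rayleigh_limit_arcsec(wavelength_m, aperture_m):
    """Diffraction limit theta = 1.22 * lambda / D, converted to arcsec."""
    return 1.22 * wavelength_m / aperture_m * 206265.0

D = 0.6                   # aperture in meters (assumed example)
f_mm = 8 * D * 1000.0     # f/8 focal ratio -> 4800 mm focal length

print(f"scale: {plate_scale_arcsec_per_mm(f_mm):.1f} arcsec/mm")
print(f"diffraction limit at 550 nm: {rayleigh_limit_arcsec(550e-9, D):.2f} arcsec")
```

In practice atmospheric seeing of 1-3 arcsec, not diffraction, limits resolution for ground-based telescopes of this size.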
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations<br />
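The photometry steps above reduce to converting aperture counts into magnitudes, and comparing a target against a comparison star so that exposure time, airmass, and zero point largely cancel. A minimal sketch (the count values are hypothetical):

```python
import math

def instrumental_mag(counts, exposure_s):
    """Instrumental magnitude from background-subtracted counts (arbitrary zero point)."""
    return -2.5 * math.log10(counts / exposure_s)

def differential_mag(target_counts, comp_counts):
    """Target minus comparison magnitude: shared constant offsets cancel."""
    return -2.5 * math.log10(target_counts / comp_counts)

# Hypothetical aperture sums from one frame
print(instrumental_mag(48000.0, 30.0))
print(differential_mag(48000.0, 52000.0))  # positive means the target is fainter
```

Plotting the differential magnitude frame by frame against time is exactly the light curve AstroImageJ produces in its multi-aperture photometry mode.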
<br />
<br />
== Week 7 (September 30) ==<br />
<br />
* Continued discussion of research and optical astronomy data<br />
<br />
<br />
== Week 8 (October 7) ==<br />
<br />
* Fall break<br />
<br />
<br />
<br />
== Week 9 (October 14) ==<br />
<br />
* Final organization of research topics<br />
* Exoplanet transit photometry of a TESS candidate<br />
<br />
<br />
== Week 10 (October 21) ==<br />
<br />
* Stellar spectra<br />
* Radial velocities</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2590Observational Astronomy (308)2019-09-23T17:29:51Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Near-Earth objects <br />
** Mercury transit on November 11 (with an exoplanet transit at night backup)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2589Observational Astronomy (308)2019-09-23T00:42:49Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 16) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
<br />
== Week 6 (September 23) ==<br />
<br />
* Clear weather tonight possible (Monday) and the coming weekend<br />
<br />
* Roundtable discussion about project ideas (bring your own) such as<br />
** Jupiter imaging (must be done soon)<br />
** Bright star photometry (examples are some TESS candidates and zeta Andromedae now)<br />
** Use of latest CMOS color sensors for photometry <br />
** Other TESS candidates (TESS is currently observing the northern sky)<br />
** Variable stars in the TESS public data<br />
** Comets [http://www.aerith.net/comet/weekly/current.html http://www.aerith.net/comet/weekly/current.html]<br />
** Supernovae [http://www.rochesterastronomy.org/supernova.html http://www.rochesterastronomy.org/supernova.html]<br />
** Anything in Orion (visible after midnight now) <br />
** Geosynchronous (GEO) satellites (where, when, optical variability)<br />
** Low Earth orbit (LEO) satellites (wide field camera, untracked)<br />
** Other selected targets in either hemisphere, your choice<br />
<br />
To continue this week and next as time allows<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts<br />
** Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
** Examples: TESS data and followup with ground-based observations</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2588Observational Astronomy (308)2019-09-16T17:40:19Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 15) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Epsilon Lyrae (Double Double) see [https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/ https://www.astro.louisville.edu/shared_skies/archive/select/stars/lyra/epsilon_lyrae/20060909/]<br />
<br />
* Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2587Observational Astronomy (308)2019-09-16T17:36:58Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 15) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* Examples: Jupiter see [https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/ https://www.astro.louisville.edu/shared_skies/archive/select/planets/jupiter/20140313/]<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images <br />
** Photometry<br />
** Advanced concepts</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2583Observational Astronomy (308)2019-09-09T17:34:15Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 15) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* [https://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ]<br />
** Installation<br />
** Use with simple images<br />
** Photometry<br />
** Advanced concepts</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2582Observational Astronomy (308)2019-09-09T17:31:46Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]<br />
<br />
<br />
<br />
== Week 3 (September 2) ==<br />
<br />
* Labor Day holiday<br />
<br />
<br />
== Week 4 (September 9) ==<br />
<br />
* Telescopes<br />
** Basic concepts of optical telescope design<br />
** Light gathering function<br />
** Focal plane scale<br />
** Angular resolution and point spread function<br />
** Detectors and filters<br />
<br />
* Visit to the Planetarium<br />
** Solar projection telescope<br />
** Solar imaging in hydrogen alpha light<br />
<br />
<br />
== Week 5 (September 15) ==<br />
<br />
* Telescopes continued from last week as needed<br />
<br />
* AstroImageJ<br />
** Installation<br />
** Use with simple images<br />
** Photometry<br />
** Advanced concepts</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2572Observational Astronomy (308)2019-08-26T16:07:01Z<p>WikiSysop: </p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Main_Page&diff=2571Main Page2019-08-26T16:00:46Z<p>WikiSysop: /* Astronomy Education at the University of Louisville */</p>
<hr />
<div>== Astronomy Education at the University of Louisville ==<br />
<br />
<br />
This site provides information for courses in Astronomy and Astrophysics in the Department of Physics and Astronomy at the University of Louisville. Click one of the links for more options.<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Elementary_Astronomy Elementary Astronomy (107)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Elementary_Astronomy_Laboratory Elementary Astronomy Laboratory (108)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Observational_Astronomy_(308) Observational Astronomy (308)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Research_Methods Research Methods (650)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Special:AllPages An index of all pages.]</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Observational_Astronomy_(308)&diff=2570Observational Astronomy (308)2019-08-26T15:59:46Z<p>WikiSysop: Notes on content to the Physics and Astronomy 308 class Observational Astronomy</p>
<hr />
<div>These notes are brief summaries and links for the in-class content for the Monday class meetings of the Fall Semester 2019.<br />
<br />
<br />
== Week 1 (August 19) ==<br />
<br />
* [http://prancer.physics.louisville.edu/classes/syllabus/p308_fa19.pdf Orientation to the class]<br />
* Purpose, activities, outcomes<br />
* [https://www.astro.louisville.edu/moore/directions/ Visiting the observatory]<br />
* Remote observing<br />
* Semester projects<br />
* [https://www.astro.louisville.edu/presentations/ul_20190221.pdf Facilities and research] <br />
* On-line with Mt. Kent live session <br />
<br />
<br />
== Week 2 (August 26) ==<br />
<br />
* What we observe<br />
** Position and celestial coordinates<br />
** Flux and magnitude<br />
* What we infer<br />
** Distance<br />
** Size<br />
** Luminosity<br />
** Composition<br />
** Evolution<br />
* Useful tools for access to data<br />
** [https://stellarium.org/ Stellarium]<br />
** [http://simbad.u-strasbg.fr/simbad/ Simbad]<br />
** [http://aladin.u-strasbg.fr/aladin.gml Aladin]</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Remote_Telescope_Results&diff=2561Remote Telescope Results2019-01-01T21:19:57Z<p>WikiSysop: </p>
<hr />
<div><br />
<center><br />
'''!! Notice !!'''<br />
<br><br />
This is part of an activity that is no longer available to classes on campus.<br />
</center><br />
<br />
<br />
== Retrieving Your Data ==<br />
<br />
<br />
The data we have available in response to your requests are on the server website at<br />
<br />
<center>[http://www.astro.louisville.edu/shared_skies/archive/ http://www.astro.louisville.edu/shared_skies/archive/]</center><br />
<br />
You may be asked for a username and password to use the astrolab data, and if so enter<br />
<br />
*User: xxx<br />
*Password: xxx<br />
<br />
all in lower case.<br />
<br />
<br />
However, we encourage you to use the open archive of public data and select from it the best material to support your interests.<br />
<br />
<br />
<br />
<center>[http://www.astro.louisville.edu/shared_skies/archive/select/ Open access selected data]</center><br />
<br />
<br />
<br />
Image files ending in "fits" must be downloaded to view in AstroImageJ, ds9, or Aladin. Other files may be viewed on the web.<br />
<br />
Find an image file and ''right click'' on the name. Save the data to your own computer for later use. The "fits" files are astronomical image data and usually very large, so downloading will be slow. ImageJ is generally useful for all types of images, but you may find that ds9, which shows only astronomical FITS images, is a tool you like as well. Use what works best for you.<br />
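A FITS file is not mysterious: its header is plain ASCII, stored as 80-character "keyword = value" cards in 2880-byte blocks, which is why any of these tools can read it. A stdlib-only sketch of card parsing (for real work use AstroImageJ, ds9, Aladin, or a library; the sample header here is hypothetical):

```python
def parse_fits_header(raw: bytes) -> dict:
    """Parse 80-character FITS header cards from raw bytes into a dict."""
    cards = [raw[i:i + 80].decode("ascii") for i in range(0, len(raw), 80)]
    header = {}
    for card in cards:
        if card.startswith("END"):
            break                       # END marks the end of the header
        if "=" not in card[:10]:
            continue                    # COMMENT/HISTORY cards have no '=' indicator
        key = card[:8].strip()          # keyword occupies columns 1-8
        value = card[10:].split("/")[0].strip()  # drop any inline comment
        header[key] = value
    return header

# Hypothetical minimal header: two cards plus END, each padded to 80 characters
raw = (b"SIMPLE  =                    T".ljust(80)
       + b"NAXIS   =                    2".ljust(80)
       + b"END".ljust(80))
print(parse_fits_header(raw))
```

Viewing a header this way (AstroImageJ and ds9 both have a "show header" option) tells you the exposure time, filter, and coordinates of every image you download.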
<br />
<br />
== Using AstroImageJ ==<br />
<br />
<br />
AstroImageJ allows you to view files of all types. It is particularly good for working with all astronomical image data. You can experiment with it -- the difficulties of software are part of the lab experience!<br />
<br />
AstroImageJ is installed on the computers in the astronomy lab. If you prefer to run a version on your own notebook, click below to go to the link<br />
<br />
<center>[http://www.astro.louisville.edu/software/astroimagej/ http://www.astro.louisville.edu/software/astroimagej]</center><br />
<br />
for a version you can download to run on Windows, Apple, or Linux computers.<br />
<br />
Once AstroImageJ has started, select "File" from its menu, then "Open", and find the images you have downloaded. You might review your work in other lab activities to see what the different controls do.<br />
<br />
AstroImageJ offers many image processing options, and allows you to build color images from individual images in each color. You could also use Aladin or SAOImage ds9 for viewing images, but they are less versatile for processing the images and making measurements.<br />
<br />
<br />
== Using SAOImage ds9 ==<br />
<br />
SAOImage DS9 is an astronomical imaging and data visualization program that is widely used for research. It is installed on the lab computers and you may find it the best way to view and measure astronomical FITS files. It is free software, and can be installed on Mac, Linux and Windows computers if you prefer to run it on your own. It is not a web application, and the files you use it with must be downloaded to your computer first. For more information if you are working outside the lab, go to this link<br />
<br />
<center>[http://hea-www.harvard.edu/RD/ds9/ http://hea-www.harvard.edu/RD/ds9/]</center><br />
<br />
<br />
== Using Aladin ==<br />
<br />
<br />
Aladin is ideal for viewing most FITS files because it handles astronomical coordinates, and it also allows you to overlay images from different sources. However, it does not do image processing particularly well, and if you want to modify an image extensively you may need ImageJ. The link to Aladin is<br />
<br />
<center>[http://www.astro.louisville.edu/software/aladin/ http://www.astro.louisville.edu/software/aladin/]</center><br />
<br />
Use "File" and "Open local file" in the Aladin menu to view an image you have already downloaded. You may install Aladin on your computer. It is safe, free, and reliable.<br />
<br />
<br />
== In the Lab ==<br />
<br />
Although the software will run on your notebook or home computer, and the data are available over the network, we ask that you do the work in the lab so that the assistant can help, and you can discuss your ideas with other students. You must submit your results in the lab that day. <br />
<br />
<br />
== What to Do ==<br />
<br />
If you submitted a request for data, we have tried to get it for you this semester. In some cases the requests could not be met because the objects were too bright (a very bright star, for example) or too close to the Sun to view at this time. Also, requests for objects in the solar system would usually duplicate our scheduled recording of the bright planets and asteroids. For these lab activities we have combined our most recent data in the archive, indexed by object name and date. You can access any data in the archive by following the link given above.<br />
<br />
Look at what is available, think about what you asked for, and decide what question you want to explore. If you did not submit a request, or if the data you hoped for are not available, think over the possibilities, pose a new question, and use what you have.<br />
<br />
You might begin by comparing the data with what you can find on the Internet, perhaps in Wikipedia or an image search, but remember that the focus should be on the data from our telescopes. They will be quite different from the pretty pictures you may get from the Hubble Telescope or press releases from ESO. <br />
<br />
To give you some ideas, here are questions you might seek answers to:<br />
<br />
In Messier 1, there are two central stars. Which one is the pulsar? Does it have a different color from the other one? Why are the filaments red? Why is the fuzzy nebula "gray"? If this is the remnant of a supernova that occurred in 1054 AD, what is its 3-dimensional shape (you are only seeing it projected onto the sky)?<br />
<br />
In NGC 7662, what 3-dimensional shape could make the object look like this? There are images taken in filters isolating light from hydrogen, sulfur, and oxygen. Is there a difference in where these features appear in the nebula? Measure how large NGC 7662 is in diameter on the sky (an angle), and look up its distance with help from Google. See if you can figure out how large in diameter it must be compared to our solar system. (Your assistant may help if you get stuck.)<br />
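Converting the angular diameter you measure into a physical size is a one-line use of the small-angle formula, D = theta (in radians) times distance. A sketch with assumed round numbers for NGC 7662 (its published distance estimates vary widely, so treat these inputs as placeholders for the values you look up):

```python
def physical_diameter_ly(angular_size_arcsec, distance_ly):
    """Small-angle formula: physical size = theta(radians) * distance."""
    theta_rad = angular_size_arcsec / 206265.0   # arcsec per radian
    return theta_rad * distance_ly

# Assumed inputs: roughly 30 arcsec across, at roughly 5600 light-years
d_ly = physical_diameter_ly(30.0, 5600.0)
print(f"~{d_ly:.2f} light-years across")
# For comparison, Neptune's orbit is only a few light-hours in diameter,
# so the nebula dwarfs the planetary part of our solar system.
```

The same function answers the Moon questions below if you swap in a crater's angular size and the Earth-Moon distance in kilometers.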
<br />
For the Pleiades, Messier 45, how would you decide the distance to the stars, and how will that distance affect their appearance in the images?<br />
<br />
For the Moon, look for famous named craters. Find Copernicus, Tycho, Plato, Mare Imbrium, the lunar Apennines, and Sinus Iridum. How big are they? That is, how many kilometers across are they? Why are shadows longer for craters and mountains close to the "terminator", the line that divides the light and dark sides? Are the craters that are near the top or bottom (north or south) really oval, and if not, why do they appear to be oval? How did the floor of Mare Imbrium or of Plato become so free of craters? Find other images of the Moon on the web and compare them to this one. Can you see more or less of the Moon toward the edges of the disk? Why is that?<br />
<br />
This unit is an open ended inquiry. Start with the data we have provided and see where it takes you. Describe what you did and your conclusions in your response. Remember that typically discovery-based science generates new questions, and you may suggest other inquiries as part of your conclusion. Even if you work in small groups in the lab, each student must submit their own work at the end of the lab period.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Remote_Telescope_Results&diff=2560Remote Telescope Results2019-01-01T21:19:28Z<p>WikiSysop: </p>
<hr />
<div><br />
<center><br />
'''!! Notice !!'''<br />
<br><br />
This is part of an activity that is no longer available to classes on campus.<br />
</center><br />
<br />
<br />
== Retrieving Your Data ==<br />
<br />
<br />
The data we have available in response to your requests are on the server website at<br />
<br />
<center>[http://www.astro.louisville.edu/shared_skies/archive/ http://www.astro.louisville.edu/shared_skies/archive/]</center><br />
<br />
You may be asked for a username and password to use the astrolab data, and if so enter<br />
<br />
*User: xxx<br />
*Password: xxx<br />
<br />
all in lower case.<br />
<br />
<br />
However, we encourage you to use the open archive of public data and select from it the best material to support your interests.<br />
<br />
<br />
<br />
<center>[http://www.astro.louisville.edu/shared_skies/archive/select/ Open access selected data]</center><br />
<br />
<br />
<br />
Image files ending in "fits" must be downloaded to view in AstroImageJ, ds9, or Aladin. Other files may be viewed on the web.<br />
<br />
Find an image file and ''right click'' on the name. Save the data to your own computer for use later. The "fits" files are astronomical image data files and are usually very large, so downloads will be slow. ImageJ is generally useful for all types of images, but you may find that ds9, which shows only astronomical FITS images, is a tool you like as well. Use what works best for you.<br />
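If you are curious about what is inside a "fits" file, the format itself is simple: the header is plain ASCII, stored in 2880-byte blocks of 80-character "cards", followed by the binary image data. As a sketch (ours, not part of the lab software), a few lines of Python can list the header cards without any astronomy package installed:<br />

```python
def read_fits_header(path):
    """Read the primary header of a FITS file into a dict of card values.

    A FITS header is stored in 2880-byte blocks of 80-character ASCII
    "cards" shaped like 'KEYWORD = value / comment', ending at 'END'.
    """
    cards = {}
    with open(path, "rb") as f:
        while True:
            block = f.read(2880)
            if len(block) < 2880:
                break  # end of file (or truncated header)
            end_seen = False
            for i in range(0, 2880, 80):
                card = block[i:i + 80].decode("ascii", errors="replace")
                keyword = card[:8].strip()
                if keyword == "END":
                    end_seen = True
                    break
                if card[8:10] == "= ":  # only cards that carry a value
                    cards[keyword] = card[10:].split("/")[0].strip()
            if end_seen:
                break
    return cards
```

Real tools such as AstroImageJ, ds9, and Aladin read the image data as well, so use them for actual measurements; this only shows that FITS files are not mysterious.<br />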
<br />
<br />
== Using AstroImageJ ==<br />
<br />
<br />
AstroImageJ allows you to view files of all types. It is particularly good for working with all astronomical image data. You can experiment with it -- the difficulties of software are part of the lab experience!<br />
<br />
AstroImageJ is installed on the computers in the astronomy lab. If you prefer to run a version on your own notebook, click the link below<br />
<br />
<center>[http://www.astro.louisville.edu/software/astroimagej/ http://www.astro.louisville.edu/software/astroimagej]</center><br />
<br />
for a version you can download to run on Windows, Apple, or Linux computers.<br />
<br />
Once AstroImageJ has started, select "File" from its menu, then "Open", and find the images you have downloaded. You might review your work in other lab activities to see what the different controls will do.<br />
<br />
AstroImageJ offers many image processing options, and allows you to build color images from individual images in each color. You could also use Aladin or SAOImage ds9 for viewing images, but they are less versatile for processing the images and making measurements.<br />
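Building a color image from individual filter images, as described above, amounts to stacking three grayscale images pixel-by-pixel into red, green, and blue channels. A toy Python sketch of the idea (our illustration, not AstroImageJ's code):<br />

```python
def combine_rgb(red, green, blue):
    """Conceptual sketch: stack three grayscale images (lists of rows)
    pixel-by-pixel into one RGB image of (r, g, b) triples."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red, green, blue)
    ]

# A tiny 1x2 "image" in each band:
rgb = combine_rgb([[10, 20]], [[30, 40]], [[50, 60]])
```

In practice each channel is an image taken through a different filter, and AstroImageJ also lets you scale and stretch each channel before combining.<br />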
<br />
<br />
== Using SAOImage ds9 ==<br />
<br />
SAOImage DS9 is an astronomical imaging and data visualization program that is widely used for research. It is installed on the lab computers and you may find it the best way to view and measure astronomical FITS files. It is free software, and can be installed on Mac, Linux, and Windows computers if you prefer to run it on your own. It is not a web application, and the files you use it with must be downloaded to your computer first. For more information if you are working outside the lab, go to this link<br />
<br />
<center>[http://hea-www.harvard.edu/RD/ds9/ http://hea-www.harvard.edu/RD/ds9/]</center><br />
<br />
<br />
== Using Aladin ==<br />
<br />
<br />
Aladin is ideal for viewing most FITS files because it handles astronomical coordinates, and also allows you to overlay images from different sources. However, it does not do image processing particularly well, and if you want to modify an image a lot you may need ImageJ. The link to Aladin is<br />
<br />
<center>[http://www.astro.louisville.edu/software/aladin/ http://www.astro.louisville.edu/software/aladin/]</center><br />
<br />
Use "File" and "Open local file" in the Aladin menu to view an image you have already downloaded. You may install Aladin on your computer. It is safe, free, and reliable.<br />
<br />
<br />
== In the Lab ==<br />
<br />
Although the software will run on your notebook or home computer, and the data are available over the network, we ask that you do the work in the lab so that the assistant can help, and you can discuss your ideas with other students. You must submit your results in the lab that day. <br />
<br />
<br />
== What to Do ==<br />
<br />
If you submitted a request for data, we have tried to get it for you this semester. In some cases the requests could not be met because the objects were too bright (a very bright star for example), or too close to the Sun to view at this time. Also, requests for objects in the solar system would usually duplicate our scheduled recording of the bright planets and asteroids. For these lab activities we have combined our most recent data in the archive, indexed by object name and date. You can access any data in the archive by following the link given above.<br />
<br />
Look at what is available, think about what you asked for, and decide what question you want to explore. If you did not submit a request, or if the data you hoped for are not available, think over the possibilities, pose a new question, and use what you have.<br />
<br />
You might begin by comparing the data with what you can find on the Internet too, perhaps in Wikipedia or an image search, but remember that the focus should be on the data from our telescopes. It will be quite different from the pretty pictures you may get from the Hubble Telescope or press releases from ESO. <br />
<br />
To give you some ideas, here are questions you might seek answers to:<br />
<br />
In Messier 1, there are two central stars. Which one is a pulsar? Does it have a different color from the other one? Why are the filaments red? Why is the fuzzy nebula "gray"? If this is the remnant of a supernova that occurred in 1054 AD, what is its 3-dimensional shape (you are only seeing it projected onto the sky)?<br />
<br />
In NGC 7662, what 3-dimensional shape could make the object look like this? There are images taken in filters isolating light from hydrogen, sulfur, and oxygen. Is there a difference in where these features appear in the nebula? Measure how large NGC 7662 is in diameter on the sky (an angle), and look up its distance with help from Google. See if you can figure out how large in diameter it must be compared to our solar system. (Your assistant may help if you get stuck.)<br />
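The size estimate asked for above uses the small-angle relation: physical diameter equals distance times the angular size in radians. A Python sketch with made-up numbers (measure the real angular size and look up the real distance of NGC 7662 yourself):<br />

```python
import math

def physical_diameter(angular_size_arcsec, distance):
    """Small-angle estimate: physical size = distance * angle (in radians).
    The answer comes out in whatever length unit `distance` uses."""
    return distance * math.radians(angular_size_arcsec / 3600.0)

# Hypothetical values for illustration only: a nebula 30 arcseconds
# across seen from 5000 light years away.
size_ly = physical_diameter(30.0, 5000.0)
```

Dividing the result by the diameter of the solar system (also in light years) answers the comparison question in the text.<br />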
<br />
For the Pleiades, Messier 45, how would you decide the distance to the stars, and how will that distance affect their appearance in the images?<br />
<br />
For the Moon, look for famous named craters. Find Copernicus, Tycho, Plato, Mare Imbrium, the lunar Apennines, and Sinus Iridum. How big are they? That is, how many kilometers across are they? Why are shadows longer for craters and mountains close to the "terminator", the line that divides the light and dark sides? Are the craters that are near the top or bottom (north or south) really oval, and if not, why do they appear to be oval? How did the floor of Mare Imbrium or of Plato become so free of craters? Find other images of the Moon on the web and compare them to this one. Can you see more or less of the Moon toward the edges of the disk? Why is that?<br />
<br />
This unit is an open-ended inquiry. Start with the data we have provided and see where it takes you. Describe what you did and your conclusions in your response. Remember that discovery-based science typically generates new questions, and you may suggest other inquiries as part of your conclusion. Even if you work in small groups in the lab, each student must submit their own work at the end of the lab period.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Remote_Telescope_Results&diff=2559Remote Telescope Results2019-01-01T21:18:46Z<p>WikiSysop: </p>
WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Remote_Telescope_Requests&diff=2558Remote Telescope Requests2019-01-01T21:16:45Z<p>WikiSysop: </p>
<hr />
<div><center><br />
'''!! Notice !!'''<br />
<br><br />
This option is no longer available to classes on campus. <br />
</center><br />
<br />
<br />
<br />
== Remote Telescope Data ==<br />
<br />
''Please allow time for us to respond to your request.''<br />
''This activity should be done before midterm; the second part will be at the end of the semester.''<br />
<br />
<br />
Today’s computer technology, modern optics, and electronics allow astronomers and students to "see" the sky without enduring long freezing nights looking through an eyepiece. The data we acquire are quantitative measurements of how much light arrives from fields of view that can be tiny and cover only a few stars, or vast and cover constellations. They can tell you the positions of moving objects, the variations in their brightness, their colors, details of their structure, and how they interact or relate with nearby companions. Modern astronomy is evolving with the development of a new class of very large telescopes that by the mid-2020s will provide public data over the entire sky, updated almost nightly. Smaller telescopes and instruments in space follow up on the discoveries offered by these big telescopes with closer inspection and careful analysis.<br />
<br />
If you are interested in this future world, you can read more at websites for Gaia, a new satellite beginning to return amazing measurements of distances throughout our own galaxy, and of LSST, a "synoptic survey" telescope under construction in Chile.<br />
<br />
*[https://www.lsst.org/ Link to LSST at https://www.lsst.org/ ]<br />
*[http://sci.esa.int/gaia/ Link to Gaia at http://sci.esa.int/gaia/]<br />
<br />
<br />
<br />
The University of Louisville and the University of Southern Queensland operate remotely controlled telescopes at Moore Observatory near Brownsboro, Kentucky, at Mount Kent Observatory near Toowoomba, Australia, and on Mt. Lemmon in Arizona. <br />
For more than 10 years we have been acquiring data for research and student use, and we have an archive of many of the bright objects that may interest you. <br />
This semester our telescopes in Kentucky, Arizona, and Australia are operating robotically under the supervision of professional astronomers and can also acquire new images to meet your requests as well. <br />
<br />
This week your lab activity is to thoughtfully identify something in the sky that really grabs your interest. It can be anything, but because there are limitations on what we can observe we will make a few suggestions to guide your thoughts. With that, ask some questions yourself -- we are reversing roles here! Why would you like to see this object, and what would you like to learn about it that a telescope image or measurement could reveal?<br />
<br />
<br />
== Available Resources ==<br />
<br />
<br />
The observatories are described on our website at<br />
<br />
[http://sharedskies.org http://sharedskies.org]<br />
<br />
<br />
and you are encouraged to explore it to learn about the observatories and their telescopes. The content is currently under development, and may change during the semester.<br />
<br />
The facilities we have access to include:<br />
<br />
* Moore Observatory in the northern hemisphere near Brownsboro, Kentucky<br />
* Mt. Kent Observatory in the southern hemisphere near Toowoomba, Australia<br />
* Mt. Lemmon Observatory in Arizona, near Tucson<br />
<br />
<br />
At the Moore and Mt. Kent observatories we have identical 0.5 meter (20 inch) diameter "CDK20" research telescopes operating with a CCD camera and a selection of filters. These are the primary telescopes for this program. We offer other instruments too, and we are developing the capability for you to visit us through the web when the telescopes are in operation and to press the "shutter button" yourself. For example, there are<br />
<br />
* Fast wide field Shared Skies Live telescopes at Moore and Mt. Kent Observatories to take images of stars and nebulae<br />
* A special long-focus telescope at Moore Observatory to take images of planets.<br />
* Color cameras to quickly image constellations, or comets that may span many degrees.<br />
* Two 0.6 meter (24 inch) "RC24" research telescopes at Moore Observatory and Mt. Lemmon used primarily to study planets around other stars.<br />
* A 0.7 meter (27 inch) "CDK700" telescope at Mt. Kent observatory that is currently observing extrasolar planets discovered with the new NASA TESS satellite.<br />
<br />
<br />
We also have a substantial and growing archive of the best images, which we add to often. We will draw from it when we can to satisfy your requests.<br />
<br />
<br />
== What the Telescopes Can Show ==<br />
<br />
<br />
Except for the color cameras used to record wide field images of constellations and the occasional comet, the CCD cameras on the telescopes return scientific images as digital files in “FITS” format. These images may be viewed in your computer's browser with some tools we will provide, or with other software you load on your own computer. Those images require some effort to analyze, but they offer quantitative data on positions and brightnesses. Over the next few weeks you will use some of the software that is needed and gain experience with what it can do.<br />
<br />
In many cases we will produce color images from these data that may be useful if you are interested in seeing form or structure, watching craters or shadows on the Moon, following the rotation of Saturn, Jupiter or Mars, or looking for the colors of stars. The color images often are not "true" color but a combination of different filters to highlight some aspects of the objects of interest. You can make them too by working with the original data.<br />
<br />
Except for the special wide field cameras, these telescopes cover a field less than about half a degree across, the apparent size of the Moon. Each pixel resolves about 0.5 arcsecond (there are 3600 arcseconds in a degree), and measures how much light arrived at that spot in the image for the duration of the exposure. By comparing one pixel with another, you can tell how much brighter or fainter one feature of the image is compared to another. The image that you see, while visually exciting, is also a tool to measure how much light there is and where it is.<br />
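The numbers in this paragraph are enough to turn pixel measurements into angles. For example, at 0.5 arcsecond per pixel, a feature 80 pixels across spans 40 arcseconds, and a field 3600 pixels wide covers half a degree:<br />

```python
ARCSEC_PER_PIXEL = 0.5    # pixel scale quoted in the text
ARCSEC_PER_DEGREE = 3600

def angular_size_arcsec(n_pixels, scale=ARCSEC_PER_PIXEL):
    """Angle on the sky spanned by a feature n_pixels across."""
    return n_pixels * scale

# A 3600-pixel-wide field covers 0.5 degree, about the Moon's apparent diameter:
field_deg = angular_size_arcsec(3600) / ARCSEC_PER_DEGREE
```

Count the pixels a crater or nebula spans in your image, and this conversion gives its angular size on the sky.<br />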
<br />
Some images may be returned to you with a calibration for the position in the sky, and as you explore them you can measure the celestial coordinates of any point in the image. In this way, you can identify individual stars, clusters, nebulae, and galaxies in the images. You can follow the changing positions of asteroids and satellites of planets. You can spot a new supernova, watch variable stars, measure the separation of double stars, measure the diameter of a distant galaxy, or follow a new comet. <br />
<br />
<br />
<br />
== Filters, Sensitivity, and Spatial Resolution ==<br />
<br />
<br />
With the scientific cameras, each image is taken through a filter that isolates a band of the spectrum, removing all but the part of the light we want to record. If you looked through one of the filters, you would see only part of the light that is collected by the telescope. A "blue" filter would pass light our eyes sense as blue, while an "infrared" filter would pass light our eyes cannot see at all. Here’s how we designate the filters that are usually available, by the wavelengths they transmit:<br />
<br />
* Blue-Green: g' (400 to 530 nm)<br />
* Yellow-Red: r' (530 to 700 nm)<br />
* Hydrogen: H-alpha (656 nm)<br />
* Red-Near Infrared: i' (700 to 825 nm)<br />
* Infrared: z' (825 to 1100 nm)<br />
<br />
<br />
The numbers are the wavelength of light in the spectrum, from blue light at 450 nanometers (nm) to red light at 650 nm. Infrared has a longer wavelength than red, and ultraviolet has a shorter wavelength than blue. Since our telescopes respond well to "near" infrared light, out to 1100 nm, but not well to ultraviolet light, we observe in the bands that are most efficiently detected by the optics and electronics. <br />
<br />
A measurement with different filters allows us to determine the “color” of a star, or we can put images from blue, green, and red together to make a color image that will resemble what you would see if your eyes could detect this faint light. Images through a filter that isolates light emitted by hydrogen gas would show few stars, and if you could "see" through this filter the scene would look dark red and show only the gas. You may read more about all the filters that are available here.<br />
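The "color" determined from two filters is usually expressed as a color index, the difference of a star's magnitudes in the bluer and redder bands. A minimal sketch, with hypothetical g' and r' magnitudes for one star:<br />

```python
def color_index(mag_blue, mag_red):
    """Color index: magnitude in the bluer filter minus the redder one.
    More positive values mean a redder star, because magnitudes
    increase as brightness decreases."""
    return mag_blue - mag_red

# Hypothetical g' and r' measurements of a single star:
g_minus_r = color_index(13.2, 12.5)
```

Comparing this index across many stars in a cluster is the basis of the color-magnitude diagrams used throughout astronomy.<br />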
<br />
The faintest stars you will find in most images are about 18th magnitude. These stars are more than 10,000,000 times fainter than the brightest stars in the night sky. Typically the telescopes cover a field of about 1/2 degree, the diameter of the full Moon, but in some cases it can be smaller to see very fine detail, or larger, to get an entire constellation or a big comet. If the air is steady, the smallest detail you will see is about 1 arcsecond across. For comparison, Jupiter appears about 40 arcseconds across in our sky, and the Andromeda galaxy extends thousands of arcseconds. Some of our planetary and lunar images show detail as small as 0.3 arcseconds, the resolution limit of our telescopes in perfectly stable air.<br />
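The magnitude scale behind the "10,000,000 times fainter" figure is logarithmic: every 5 magnitudes is a factor of exactly 100 in brightness (Pogson's relation). A quick check in Python:<br />

```python
def times_fainter(m_faint, m_bright):
    """Brightness ratio from the magnitude scale: each 5 magnitudes
    is a factor of exactly 100 in flux (Pogson's relation)."""
    return 10.0 ** (0.4 * (m_faint - m_bright))

# An 18th-magnitude star compared with a 0th-magnitude bright star:
ratio = times_fainter(18.0, 0.0)
```

For 18 magnitudes the ratio is about 1.6 × 10<sup>7</sup>, which is indeed "more than 10,000,000".<br />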
<br />
<br />
== Objects == <br />
<br />
Within our solar system you could expect to see<br />
<br />
* Solar System<br />
** the occasional bright comet and many very faint ones<br />
** changing phases of the Moon, and its libration<br />
** craters on the Moon and shadows that move during the night<br />
** "earthshine", the light on the dark side of the Moon reflected from Earth<br />
** Venus, Mars, Jupiter, Saturn, Uranus and Neptune moving nightly across the sky with respect to stars<br />
** polar caps and large features on Mars<br />
** satellites of Jupiter, Saturn, Uranus and Neptune moving nightly<br />
** changing atmospheric features on Jupiter and Saturn<br />
** rings of Saturn<br />
** asteroids<br />
** the brighter dwarf planets like nearby Ceres and distant Pluto<br />
<br />
Within our Milky Way galaxy you could see<br />
<br />
* Galactic <br />
** star birth nebulae<br />
** open clusters of young stars<br />
** active stars that erupt and change brightness<br />
** double stars in orbit around one another (but not moving so fast that you would see the motion, unless you come back several years later)<br />
** variable stars that pulsate, and pairs of eclipsing binary stars that change their brightness periodically<br />
** stars moving slowly across the sky by comparing old and recent images<br />
** planetary nebulae surrounding dying stars<br />
** globular clusters of very old stars<br />
<br />
Beyond our galaxy there are<br />
<br />
* Extragalactic<br />
** nearby companion galaxies like the Large and Small Magellanic Clouds<br />
** clusters of galaxies like those in Virgo and Coma<br />
** active galaxies with black holes in their nuclei<br />
** other fainter galaxies out to distances of over 100 million light years<br />
** quasars out to distances of billions of light years<br />
** the occasional new supernova in a distant galaxy<br />
<br />
However, what is available to see depends on the time of year (where Earth is in its orbit), where the planets are in their orbits, and whether you are using a telescope in the northern or southern hemisphere.<br />
<br />
<br />
<br />
== How to Proceed ==<br />
<br />
<br />
If you are in the astronomy lab on campus and working in a small group (usually 3 students), you may make a decision as a group about what to request. This is easiest for us too, since we have fewer requests to handle that way. However, if you prefer, you may work on your own.<br />
<br />
This project has two parts that are graded separately. The first part you need to do this week is to define a problem and propose an observation. After that, we will find data for you in our archive, or even acquire data with our telescopes over the following weeks. Once we have something useful, we will provide it to you and, where you need it, assist you with understanding what it has to offer. Our expectation is to have data back to you in early April, and the last activity of this semester will be to tell us what you can find in it. When the data are available there will be an announcement, probably a comment on the discussion forum, and the last activity page will open up on the class website too.<br />
<br />
* '''Step 1: Decide What to Observe'''<br />
<br />
<br />
The first big step is to decide what you would like to do. Almost always, this is a very challenging exercise because there is so much to choose from. There is no right answer. Here is a guide to help you think about it. Also, before submitting your request, we welcome a discussion online or by email if you need advice.<br />
<br />
'''What would I like to know more about that I could expect to “see” with one of these telescopes?'''<br />
<br />
Our telescopes can provide data allowing you to measure how stars vary in brightness ("variable stars"), to follow the motions of satellites of Jupiter, Saturn, Uranus, and Neptune, track asteroids, and capture the latest new supernova or recently discovered comet. When the skies are dark and the Moon is not out, they can record faint nebulae and distant galaxies too. We offer views of the clouds of Jupiter and the rings and atmosphere of Saturn in better detail by selecting the very best images that are not blurred by Earth's atmosphere. Our wide field telescopes show faint nebulae and star clusters that span several degrees on the sky.<br />
<br />
Use your imagination!! Satisfy your curiosity by selecting an object or objects, and the sort of data you would like to have on it. Once you have made a decision, you have to submit your request so that we know what you want, and you have to answer questions about your request.<br />
<br />
<br />
First, check that the object of interest is visible to us now, unless you want data we already have. Indeed, our best planetary images are ones we have selected from among thousands taken, so even if a planet is not favorably placed now we can probably offer something. However, currently the choices in our own solar system are<br />
<br />
* Observable Solar System Objects<br />
** Venus, still visible in the evening sky and rapidly moving closer and in line with the Sun. <br />
** Jupiter, visible in the morning sky, and increasingly well placed to observe as we go into the spring season, has beautiful bands and changing detail, as well as bright satellites.<br />
** Moon, as always, spectacular in detail and visible to us almost any night with various craters, mountains and maria<br />
** An occasional comet. Ones close to the sun and bright are difficult to work with, but fainter ones with obscure names that do not make the news are almost always around<br />
** Bright asteroids. Since there are thousands we have orbits for, there are always many we can follow as they move. Of course no detail is observable with a telescope.<br />
<br />
You might use Stellarium, for example, to see what is in the sky now and later this spring, or the on-line tool [http://n-the-sky.org n-the-sky.org]. [http://www.sky-map.org/ Sky-Map] and [http://aladin.u-strasbg.fr/AladinLite/ Aladin-Lite] will let you explore images and data from professional observatories on the web. You could also use Google simply to search for more information about your proposed target. If your request is inappropriate for what we can do, we will work with you to help refine your selection.<br />
<br />
We will make the selection of the telescope and other resources for you based on what you tell us about your request. For example, if it is well below the celestial equator and seen only in the southern hemisphere we will use a telescope at Mt. Kent, and if it is better seen from the northern hemisphere, we will use one of the telescopes at Moore Observatory in Kentucky. The Moon, Jupiter and Saturn are best recorded with the special purpose planetary telescope at Moore Observatory. Objects that cover a wide area of sky may be better seen in the wide field telescope or even the color cameras, but most requests will be directed to the 20-inch "CDK" telescopes. You are welcome to request a specific telescope and if it is available and appropriate we will try to use it for your data.<br />
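The telescope selection described here can be caricatured as a one-line rule based on declination. The cutoff exactly at the celestial equator is our simplification: objects near the equator are visible from both sites, and the special-purpose planetary and wide field instruments override this rule entirely.<br />

```python
def suggested_site(declination_deg):
    """Rough rule of thumb from the text: targets well below the
    celestial equator go to Mt. Kent (Australia); northern targets
    go to Moore Observatory (Kentucky). Treating 0 degrees as
    northern is our arbitrary choice."""
    return "Mt. Kent" if declination_deg < 0 else "Moore Observatory"
```

For example, the Large Magellanic Cloud (declination about -69 degrees) would go to Mt. Kent.<br />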
<br />
<br />
* '''Step 2: Register Your Request''' <br />
<br />
Complete the simple form at the link below.<br />
<br />
<br />
[https://docs.google.com/forms/d/e/1FAIpQLSftQ4tmet7KEqVbofOas7DxGRclERBHfPPZCMkXYZEs__JYKw/viewform?usp=sf_link Telescope Data Request]<br />
<br />
<br />
We use your responses to schedule our telescopes, so please complete it while you are in the lab. Make a note to yourself of your request, and keep some notes about what you responded since they will help you complete the second part later. If you are working with others we only need one on-line request. In the comments section, if you note who is working together it will simplify sorting out the requests and responses later.<br />
<br />
<br />
<br />
<br />
For this lab we need the following from each student.<br />
<br />
<br />
* '''Step 3: Respond on the Answer Sheet for This Lab''' <br />
** The object you have selected, identified by a name or designation we can use with the telescopes.<br />
** A brief statement of what that object is.<br />
** Where is it in the sky? Correct celestial coordinates are an ideal response.<br />
** Is it observable now during the night from either the southern or northern hemisphere?<br />
** Which telescope do you think would be useful for this? Explain your answer, or if you do not know which telescope, tell us what would affect the choice of telescope based on your choice of object.<br />
** A concise statement of what you expect to learn about your selected object from whatever data we can offer.<br />
** Your preferred email address so that we can respond to you if needed.<br />
** The names of the other students working with you on the same object.<br />
<br />
For this lab it will help us a lot if you put your '''preferred email address''' on the form you submit to the assistant, and list the names of those who are working with you on this. Should groups re-organize later in the semester or you find yourself working with someone else, this will ensure we can get data to you.<br />
<br />
After you have submitted the request, each student should answer the questions asked here on the usual lab response form and give them to the lab assistant. The assistant may have an immediate suggestion about your choice, and you should discuss it with them before you get to this last step. Remember that we need both the answers from you on the usual lab sheet and the completed web submission from your group while you are in class today.<br />
<br />
<br />
== What Happens Later ==<br />
<br />
<br />
Once your data are available we will respond by email, or your class teaching assistant will provide material for you. There will be another activity in class for you to use in completing the second part of the work. Expect it a few weeks before the end of the semester, or earlier if the weather is good to us. This second part is due by the end of the semester and is regarded as a separate lab activity. You should make notes on what you have asked for here today to use at that time.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Elementary_Astronomy_Laboratory_Activities&diff=2557Elementary Astronomy Laboratory Activities2019-01-01T21:14:04Z<p>WikiSysop: </p>
<hr />
<div>These activities for an Elementary Astronomy Lab were used in classes on campus at the University of Louisville in evolving forms from 1972 to 2017. This page is not currently being updated.<br />
<br />
Many of these now have mentored versions for labs that are offered [http://prancer.physics.louisville.edu/moodle on-line] as part of our new Distance Education program. More information is available on request by sending an email to ''kielkopf at louisville dot edu''.<br />
<br />
<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Identify_Constellations Identify Constellations ]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Immersive_Video_Wall About the Video Room]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Watch_the_Sky Watch the Sky (Planetarium session not currently offered)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Under_Namibian_Skies Under Namibian Skies (immersive visualization)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Night_Sky Night Sky Tonight Using Stellarium (immersive visualization)] <br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Remote_Telescope_Requests Use a Remote Telescope: Requests] and<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Remote_Telescope_Results Analyze Request Results]<br />
<br />
Travel to Mars, Jupiter, Saturn, and Uranus (immersive visualization)<br />
<br />
Survey galaxies in the universe (immersive visualization)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Survey_Galaxies_in_Virgo Survey Galaxies in Virgo]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/The_Earth_Rotates The Earth Rotates]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Our_Dynamic_Sun Our Dynamic Sun (may use the roof top solar telescope)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Light_and_Telescopes Light and Telescopes]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Experiment_with_CCD_Camera_Images Experiment with CCD Camera Images]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Use_a_CCD_Camera Use a CCD Camera]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Spectra Spectra]<br />
<br />
Observing planets and the Moon with a telescope (live remote or with the telescope on the roof)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Explore_Mars Explore Mars] (may use immersive visualization)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Observe_Satellites_of_Jupiter_and_Saturn Observe Satellites of Jupiter, Saturn and Uranus]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Follow_Proxima_Centauri Follow Proxima Centauri]<br />
<br />
Brightnesses and colors of stars in Messier 34<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Variable_Stars_in_Messier_3 Variable Stars in Messier 3]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Measure_a_Nearby_Supernova Measure a Nearby Supernova]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Track_Cosmic_Rays_in_a_Cloud_Chamber Track Cosmic Rays in a Cloud Chamber]</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Elementary_Astronomy_Laboratory_Activities&diff=2556Elementary Astronomy Laboratory Activities2019-01-01T21:13:25Z<p>WikiSysop: </p>
<hr />
<div>These activities for an Elementary Astronomy Lab were used in classes on campus at the University of Louisville in evolving forms from 1972 to 2017. This page is not currently being updated.<br />
<br />
Many of these now have mentored versions for labs that are offered [http://prancer.physics.louisville.edu/moodle on-line] as part of our Distance Education program. More information is available on request by sending an email to ''kielkopf at louisville dot edu''.<br />
<br />
<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Identify_Constellations Identify Constellations ]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Immersive_Video_Wall About the Video Room]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Watch_the_Sky Watch the Sky (Planetarium session not currently offered)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Under_Namibian_Skies Under Namibian Skies (immersive visualization)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Night_Sky Night Sky Tonight Using Stellarium (immersive visualization)] <br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Remote_Telescope_Requests Use a Remote Telescope: Requests] and<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Remote_Telescope_Results Analyze Request Results]<br />
<br />
Travel to Mars, Jupiter, Saturn, and Uranus (immersive visualization)<br />
<br />
Survey galaxies in the universe (immersive visualization)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Survey_Galaxies_in_Virgo Survey Galaxies in Virgo]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/The_Earth_Rotates The Earth Rotates]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Our_Dynamic_Sun Our Dynamic Sun (may use the roof top solar telescope)]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Light_and_Telescopes Light and Telescopes]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Experiment_with_CCD_Camera_Images Experiment with CCD Camera Images]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Use_a_CCD_Camera Use a CCD Camera]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Spectra Spectra]<br />
<br />
Observing planets and the Moon with a telescope (live remote or with the telescope on the roof)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Explore_Mars Explore Mars] (may use immersive visualization)<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Observe_Satellites_of_Jupiter_and_Saturn Observe Satellites of Jupiter, Saturn and Uranus]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Follow_Proxima_Centauri Follow Proxima Centauri]<br />
<br />
Brightnesses and colors of stars in Messier 34<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Variable_Stars_in_Messier_3 Variable Stars in Messier 3]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Measure_a_Nearby_Supernova Measure a Nearby Supernova]<br />
<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Track_Cosmic_Rays_in_a_Cloud_Chamber Track Cosmic Rays in a Cloud Chamber]</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Remote_Telescope_Requests&diff=2555Remote Telescope Requests2018-09-25T05:42:08Z<p>WikiSysop: </p>
<hr />
<div>== Remote Telescope Data ==<br />
<br />
''Please allow time for us to respond to your request.''<br />
''This activity should be done before midterm and the second part will be at the end of the semester.''<br />
<br />
<br />
Today’s computer technology, modern optics, and electronics allow astronomers and students to "see" the sky without enduring long freezing nights looking through an eyepiece. The data we acquire are quantitative measurements of how much light arrives from fields of view that can be tiny, covering only a few stars, or vast, covering entire constellations. They can tell you the positions of moving objects, the variations in their brightness, their colors, details of their structure, and how they interact with nearby companions. Modern astronomy is evolving with the development of a new class of very large telescopes that by the mid-2020s will provide public data over the entire sky, updated almost nightly. Smaller telescopes and instruments in space follow up on the discoveries offered by these big telescopes with closer inspection and careful analysis.<br />
<br />
If you are interested in this future world, you can read more at websites for Gaia, a new satellite beginning to return amazing measurements of distances throughout our own galaxy, and of LSST, a "synoptic survey" telescope under construction in Chile.<br />
<br />
*[https://www.lsst.org/ Link to LSST at https://www.lsst.org/ ]<br />
*[http://sci.esa.int/gaia/ Link to Gaia at http://sci.esa.int/gaia/]<br />
<br />
<br />
<br />
The University of Louisville and the University of Southern Queensland operate remotely controlled telescopes at Moore Observatory near Brownsboro, Kentucky, at Mount Kent Observatory near Toowoomba, Australia, and on Mt. Lemmon in Arizona. <br />
For more than 10 years we have been acquiring data for research and student use, and we have an archive of many of the bright objects that may interest you. <br />
This semester our telescopes in Kentucky, Arizona, and Australia are operating robotically under the supervision of professional astronomers and can acquire new images to meet your requests as well. <br />
<br />
This week your lab activity is to thoughtfully identify something in the sky that really grabs your interest. It can be anything, but because there are limitations on what we can observe we will make a few suggestions to guide your thoughts. With that, ask some questions yourself -- we are reversing roles here! Why would you like to see this object, and what would you like to learn about it that a telescope image or measurement could reveal?<br />
<br />
<br />
== Available Resources ==<br />
<br />
<br />
The observatories are described on our website at<br />
<br />
[http://sharedskies.org http://sharedskies.org]<br />
<br />
<br />
and you are encouraged to explore it to learn about the observatories and their telescopes. The content is currently under development, and may change during the semester.<br />
<br />
The facilities we have access to include:<br />
<br />
* Moore Observatory in the northern hemisphere near Brownsboro, Kentucky<br />
* Mt. Kent Observatory in the southern hemisphere near Toowoomba, Australia<br />
* Mt. Lemmon Observatory in Arizona, near Tucson<br />
<br />
<br />
At the Moore and Mt. Kent observatories we have identical 0.5 meter (20 inch) diameter "CDK20" research telescopes operating with a CCD camera and a selection of filters. These are the primary telescopes for this program. We offer other instruments too, and we are developing the capability to have you visit us through the web when the telescopes are in operation and to press the "shutter button" yourself. For example there are <br />
<br />
* Fast wide field Shared Skies Live telescopes at Moore and Mt. Kent Observatories to take images of stars and nebulae.<br />
* A special long-focus telescope at Moore Observatory to take images of planets.<br />
* Color cameras to quickly image constellations, or comets that may span many degrees.<br />
* Two 0.6 meter (24 inch) "RC24" research telescopes at Moore Observatory and Mt. Lemmon used primarily to study planets around other stars.<br />
* A 0.7 meter (27 inch) "CDK700" telescope at Mt. Kent Observatory that is currently observing extrasolar planets discovered with the new NASA TESS satellite.<br />
<br />
<br />
We also have a substantial and growing archive of our best images, which we add to often. We will draw from it when we can to satisfy your requests.<br />
<br />
<br />
== What the Telescopes Can Show ==<br />
<br />
<br />
Except for the color cameras used to record wide field images of constellations and the occasional comet, the CCD cameras on the telescopes return scientific images as digital files in “FITS” format. These images may be viewed in your computer's browser with some tools we will provide, or with other software you load on your own computer. Those images require some effort to analyze, but they offer quantitative data on positions and brightnesses. Over the next few weeks you will use some of the software that is needed and gain experience with what it can do.<br />
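One reason FITS is so widely used is that a FITS file begins with a human-readable header of fixed-width text "cards" describing the image. The sketch below parses a few hypothetical header cards in plain Python; the keywords follow the FITS convention, but the values shown are made up for illustration, not taken from our telescopes.<br />
<br />
```python
# A FITS header is a sequence of 80-character ASCII "cards", each holding
# a keyword, a value, and an optional comment after a "/".
# These example cards are hypothetical, not from a real telescope file.
cards = [
    "SIMPLE  =                    T / conforms to the FITS standard",
    "BITPIX  =                   16 / bits per pixel value",
    "NAXIS1  =                 4096 / image width in pixels",
    "EXPTIME =                 60.0 / exposure time in seconds",
]

header = {}
for card in cards:
    keyword, _, rest = card.partition("=")
    value = rest.split("/")[0].strip()   # drop the comment after "/"
    header[keyword.strip()] = value

print(header["NAXIS1"], header["EXPTIME"])
```

In practice you would let dedicated software read the whole file, data included, but the point is that nothing in the header is hidden: you can inspect it with any text viewer.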
<br />
In many cases we will produce color images from these data that may be useful if you are interested in seeing form or structure, watching craters or shadows on the Moon, following the rotation of Saturn, Jupiter or Mars, or looking for the colors of stars. The color images often are not "true" color but a combination of different filters to highlight some aspects of the objects of interest. You can make them too by working with the original data.<br />
<br />
Except for the special wide field cameras, these telescopes cover a field less than about half a degree across, the apparent size of the Moon. Each pixel resolves about 0.5 arcsecond (there are 3600 arcseconds in a degree), and measures how much light arrived at that spot in the image for the duration of the exposure. By comparing one pixel with another, you can tell how much brighter or fainter one feature of the image is compared to another. The image that you see, while visually exciting, is also a tool to measure how much light there is and where it is.<br />
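The numbers above fix the size of the image. As a back-of-the-envelope check, using the approximate values quoted (a half-degree field sampled at 0.5 arcsecond per pixel):<br />
<br />
```python
# Approximate values from the text above: a field about half a degree
# across, sampled at about 0.5 arcsecond per pixel.
ARCSEC_PER_DEGREE = 3600

field_deg = 0.5                # field of view, degrees
pixel_scale_arcsec = 0.5       # arcseconds per pixel

field_arcsec = field_deg * ARCSEC_PER_DEGREE        # 1800 arcseconds
pixels_across = field_arcsec / pixel_scale_arcsec   # 3600 pixels

print(f"{field_arcsec:.0f} arcsec across = {pixels_across:.0f} pixels")
```

So a single image spans several thousand pixels on a side, each one an independent measurement of the light arriving from that direction.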
<br />
Some images may be returned to you with a calibration for the position in the sky, and as you explore them you can measure the celestial coordinates of any point in the image. In this way, you can identify individual stars, clusters, nebulae, and galaxies in the images. You can follow the changing positions of asteroids and satellites of planets. You can spot a new supernova, watch variable stars, measure the separation of double stars, measure the diameter of a distant galaxy, or follow a new comet. <br />
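The position calibration mentioned above amounts to a mapping from pixel coordinates to celestial coordinates. A much-simplified sketch of the idea is below; the reference values are toy numbers chosen for illustration, and a real calibration also includes rotation and a map projection:<br />
<br />
```python
# Toy calibration values (assumed for illustration only).
ref_ra, ref_dec = 83.82, -5.39   # sky position of the reference pixel, degrees
ref_x, ref_y = 1800.0, 1800.0    # reference pixel location in the image
scale = 0.5 / 3600.0             # 0.5 arcsec per pixel, in degrees

def pixel_to_sky(x, y):
    """Linear pixel-to-sky mapping. Real images add rotation and a map
    projection, recorded in standard FITS "WCS" header keywords."""
    ra = ref_ra - (x - ref_x) * scale    # RA increases eastward on the sky
    dec = ref_dec + (y - ref_y) * scale
    return ra, dec

ra, dec = pixel_to_sky(1800.0, 1800.0)   # the reference pixel itself
```

With a calibration like this, clicking on any star in the image gives its right ascension and declination, which is how you identify it in a catalog.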
<br />
<br />
<br />
== Filters, Sensitivity, and Spatial Resolution ==<br />
<br />
<br />
With the scientific cameras, each image is taken through a filter that isolates a narrow band of the spectrum, removing all of the light but the part we want to record. If you looked through one of the filters, you would see only part of the light that is collected by the telescope. A "blue" filter would pass light our eyes sense as blue, while an "infrared" filter would pass light our eyes cannot see at all. Here’s how we designate the filters that are usually available by the wavelengths they transmit:<br />
<br />
* Blue-Green: g' (400 to 530 nm)<br />
* Yellow-Red: r' (530 to 700 nm)<br />
* Hydrogen: H-alpha (656 nm)<br />
* Red-Near Infrared: i' (700 to 825 nm)<br />
* Infrared: z' (825 to 1100 nm)<br />
<br />
<br />
The numbers are the wavelengths of light in the spectrum, from blue light at 450 nanometers (nm) to red light at 650 nm. Infrared has a longer wavelength than red, and ultraviolet has a shorter wavelength than blue. Since our telescopes respond well to "near" infrared light, out to 1100 nm, but not well to ultraviolet light, we observe in the bands that are most efficiently detected by the optics and electronics. <br />
<br />
A measurement with different filters allows us to determine the “color” of a star, or we can put images from blue, green, and red together to make a color image that will resemble what you would see if your eyes could detect this faint light. Images through a filter that isolates light emitted by hydrogen gas would show few stars, and if you could "see" through this filter the scene would look dark red and show only the gas. You may read more about all the filters that are available here.<br />
<br />
The faintest stars you will find in most images are about 18th magnitude. These stars are more than 10,000,000 times fainter than the brightest stars in the night sky. Typically the telescopes cover a field of about 1/2 degree, the diameter of the full Moon, but in some cases it can be smaller to see very fine detail, or larger, to get an entire constellation or a big comet. If the air is steady, the smallest detail you will see is about 1 arcsecond across. For comparison, Jupiter appears about 40 arcseconds across in our sky, and the Andromeda galaxy extends thousands of arcseconds. Some of our planetary and lunar images show detail as small as 0.3 arcseconds, the resolution limit of our telescopes in perfectly stable air.<br />
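The magnitude scale behind that "10,000,000 times fainter" figure is logarithmic: every 5 magnitudes corresponds to a factor of 100 in brightness. A quick check of the claim, assuming the brightest night-sky stars are near magnitude 0:<br />
<br />
```python
def brightness_ratio(m_faint, m_bright):
    """How many times fainter a star of magnitude m_faint is than a star
    of magnitude m_bright: a factor of 100 for every 5 magnitudes."""
    return 100 ** ((m_faint - m_bright) / 5)

# An 18th magnitude star compared with a bright ~0 magnitude star:
ratio = brightness_ratio(18.0, 0.0)
print(f"{ratio:.2e} times fainter")   # well over ten million
```

Larger magnitudes mean fainter stars, which is why an 18th magnitude limit reaches so far beyond naked-eye visibility (about 6th magnitude under dark skies).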
<br />
<br />
== Objects == <br />
<br />
Within our solar system you could expect to see<br />
<br />
* Solar System<br />
** the occasional bright comet and many very faint ones<br />
** changing phases of the Moon, and its libration<br />
** craters on the Moon and shadows that move during the night<br />
** "earthshine", the light on the dark side of the Moon reflected from Earth<br />
** Venus, Mars, Jupiter, Saturn, Uranus and Neptune moving nightly across the sky with respect to stars<br />
** polar caps and large features on Mars<br />
** satellites of Jupiter, Saturn, Uranus and Neptune moving nightly<br />
** changing atmospheric features on Jupiter and Saturn<br />
** rings of Saturn<br />
** asteroids<br />
** the brighter dwarf planets like nearby Ceres and distant Pluto<br />
<br />
Within our Milky Way galaxy you could see<br />
<br />
* Galactic <br />
** star birth nebulae<br />
** open clusters of young stars<br />
** active stars that erupt and change brightness<br />
** double stars in orbit around one another (though the motion is too slow to see unless you come back several years later)<br />
** variable stars that pulsate, and pairs of eclipsing binary stars that change their brightness periodically<br />
** stars moving slowly across the sky by comparing old and recent images<br />
** planetary nebulae surrounding dying stars<br />
** globular clusters of very old stars<br />
<br />
Beyond our galaxy there are<br />
<br />
* Extragalactic<br />
** nearby companion galaxies like the Large and Small Magellanic Clouds<br />
** clusters of galaxies like those in Virgo and Coma<br />
** active galaxies with black holes in their nuclei<br />
** other fainter galaxies out to distances of over 100 million light years<br />
** quasars out to distances of billions of light years<br />
** the occasional new supernova in a distant galaxy<br />
<br />
However, what is available to see depends on the time of year (where Earth is in its orbit), where the planets are in their orbits, and whether you are using a telescope in the northern or southern hemisphere.<br />
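A rough rule of thumb connects the time of year to what is up at night: the Sun's right ascension advances about 2 hours per month from roughly 0h at the March equinox, and an object is best placed at local midnight when its right ascension is about 12 hours from the Sun's (its declination then decides which hemisphere sees it well). This is only an approximation, sketched below with assumed round numbers:<br />
<br />
```python
# Rule-of-thumb seasonal visibility (approximate: the Sun's RA actually
# advances unevenly, but ~2 hours/month is close enough for planning).
def sun_ra_hours(months_after_march_equinox):
    """Approximate right ascension of the Sun, in hours (0-24)."""
    return (2.0 * months_after_march_equinox) % 24

def best_midnight_ra(months_after_march_equinox):
    """RA (hours) crossing the meridian at local midnight: opposite the Sun."""
    return (sun_ra_hours(months_after_march_equinox) + 12.0) % 24

# About 6 months after the March equinox (mid-September), the Sun is near
# RA 12h, so objects near RA 0h are highest at midnight.
print(best_midnight_ra(6))
```

A planetarium program like Stellarium does this properly, but the rule explains why each constellation has its "season."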
<br />
<br />
<br />
== How to Proceed ==<br />
<br />
<br />
If you are in the astronomy lab on campus and working in a small group (usually 3 students), you may make a decision as a group about what to request. This is easiest for us too, since we have fewer requests to handle that way. However, if you prefer you may work on your own too.<br />
<br />
This project has two parts that are graded separately. The first part you need to do this week is to define a problem and propose an observation. After that, we will find data for you in our archive, or even acquire data with our telescopes over the following weeks. Once we have something useful, we will provide it to you and, where you need it, assist you with understanding what it has to offer. Our expectation is to have data back to you in early April, and the last activity of this semester will be to tell us what you can find in it. When the data are available there will be an announcement, probably a comment on the discussion forum, and the last activity page will open up on the class website too.<br />
<br />
* '''Step 1: Decide What to Observe''' <br />
<br />
<br />
The first big step is to decide what you would like to do. Almost always, this is a very challenging exercise because there is so much to choose from. There is no right answer. Here is a guide to help you think about it. Also, before submitting your request, we welcome a discussion online or by email if you need advice.<br />
<br />
'''What would I like to know more about that I could expect to “see” with one of these telescopes?'''<br />
<br />
Our telescopes can provide data allowing you to measure how stars vary in brightness ("variable stars"), to follow the motions of satellites of Jupiter, Saturn, Uranus, and Neptune, track asteroids, and capture the latest new supernova or recently discovered comet. When the skies are dark and the Moon is not out, they can record faint nebulae and distant galaxies too. We offer views of the clouds of Jupiter and the rings and atmosphere of Saturn in better detail by selecting the very best images that are not blurred by Earth's atmosphere. Our wide field telescopes show faint nebulae and star clusters that span several degrees on the sky.<br />
<br />
Use your imagination!! Satisfy your curiosity by selecting an object or objects, and the sort of data you would like to have on it. Once you have made a decision, you have to submit your request so that we know what you want, and you have to answer questions about your request.<br />
<br />
<br />
First, check that the object of interest is visible to us now, unless you want data we already have. Indeed, our best planetary images are ones we have selected from among thousands taken, so even if a planet is not favorably placed now we can probably offer something. However, currently the choices in our own solar system are<br />
<br />
* Observable Solar System Objects<br />
** Venus, still visible in the evening sky and rapidly moving closer into line with the Sun. <br />
** Jupiter, visible in the morning sky, and increasingly well placed to observe as we go into the spring season, has beautiful bands and changing detail, as well as bright satellites.<br />
** Moon, as always, spectacular in detail and visible to us almost any night with various craters, mountains and maria<br />
** An occasional comet. Ones close to the Sun and bright are difficult to work with, but fainter ones with obscure names that do not make the news are almost always around<br />
** Bright asteroids. Since there are thousands we have orbits for, there are always many we can follow as they move. Of course no detail is observable with a telescope.<br />
<br />
You might use Stellarium, for example, to see what is in the sky now and later this spring, or the on-line tool [http://in-the-sky.org in-the-sky.org]. [http://www.sky-map.org/ Sky-Map] and [http://aladin.u-strasbg.fr/AladinLite/ Aladin-Lite] will let you explore images and data from professional observatories on the web. You could also use Google simply to search for more information about your proposed target. If your request is inappropriate for what we can do, we will work with you to help refine your selection.<br />
<br />
We will make the selection of the telescope and other resources for you based on what you tell us about your request. For example, if it is well below the celestial equator and seen only in the southern hemisphere we will use a telescope at Mt. Kent, and if it is better seen from the northern hemisphere, we will use one of the telescopes at Moore Observatory in Kentucky. The Moon, Jupiter and Saturn are best recorded with the special purpose planetary telescope at Moore Observatory. Objects that cover a wide area of sky may be better seen in the wide field telescope or even the color cameras, but most requests will be directed to the 20-inch "CDK" telescopes. You are welcome to request a specific telescope and if it is available and appropriate we will try to use it for your data.<br />
<br />
<br />
* '''Step 2: Register Your Request''' <br />
<br />
Complete the simple form at the link below.<br />
<br />
<br />
[https://docs.google.com/forms/d/e/1FAIpQLSftQ4tmet7KEqVbofOas7DxGRclERBHfPPZCMkXYZEs__JYKw/viewform?usp=sf_link Telescope Data Request]<br />
<br />
<br />
We use your responses to schedule our telescopes, so please complete the form while you are in the lab. Keep some notes about your request and your responses, since they will help you complete the second part later. If you are working with others we only need one on-line request. In the comments section, if you note who is working together it will simplify sorting out the requests and responses later.<br />
<br />
<br />
<br />
<br />
For this lab we need the following from each student.<br />
<br />
<br />
* '''Step 3: Respond on the Answer Sheet for This Lab''' <br />
** The object you have selected, identified by a name or designation we can use with the telescopes.<br />
** A brief statement of what that object is.<br />
** Where is it in the sky? Correct celestial coordinates are an ideal response.<br />
** Is it observable now during the night from either the southern or northern hemisphere?<br />
** Which telescope do you think would be useful for this? Explain your answer, or if you do not know which telescope, tell us what would affect the choice of telescope based on your choice of object.<br />
** A concise statement of what you expect to learn about your selected object from whatever data we can offer.<br />
** Your preferred email address so that we can respond to you if needed.<br />
** The names of the other students working with you on the same object.<br />
<br />
For this lab it will help us a lot if you put your '''preferred email address''' on the form you submit to the assistant, and list the names of those who are working with you on this. Should groups re-organize later in the semester or you find yourself working with someone else, this will ensure we can get data to you.<br />
<br />
After you have submitted the request, each student should answer the questions asked here on the usual lab response form and give them to the lab assistant. The assistant may have an immediate suggestion about your choice, and you should discuss it with them before you get to this last step. Remember that we need both the answers from you on the usual lab sheet and the completed web submission from your group while you are in class today.<br />
<br />
<br />
== What Happens Later ==<br />
<br />
<br />
Once your data are available we will respond by email, or your class teaching assistant will provide material for you. There will be another activity in class for you to use in completing the second part of the work. Expect it a few weeks before the end of the semester, or earlier if the weather is good to us. This second part is due by the end of the semester and is regarded as a separate lab activity. You should make notes on what you have asked for here today to use at that time.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Remote_Telescope_Requests&diff=2554Remote Telescope Requests2018-09-25T05:23:26Z<p>WikiSysop: </p>
<hr />
<div>== Remote Telescope Data ==<br />
<br />
''Please allow time for us to respond to your request.''<br />
''This activity should be done before midterm and the second part will be at the end of the semester.''<br />
<br />
<br />
Today’s computer technology, modern optics, and electronics allow astronomers and students to "see" the sky without enduring long freezing nights looking through an eyepiece. The data we acquire are quantitative measurements of how much light arrives from fields of view that can be tiny, covering only a few stars, or vast, covering entire constellations. They can tell you the positions of moving objects, the variations in their brightness, their colors, details of their structure, and how they interact with nearby companions. Modern astronomy is evolving with the development of a new class of very large telescopes that by the mid-2020s will provide public data over the entire sky, updated almost nightly. Smaller telescopes and instruments in space follow up on the discoveries offered by these big telescopes with closer inspection and careful analysis.<br />
<br />
If you are interested in this future world, you can read more at websites for Gaia, a new satellite beginning to return amazing measurements of distances throughout our own galaxy, and of LSST, a "synoptic survey" telescope under construction in Chile.<br />
<br />
*[https://www.lsst.org/ Link to LSST at https://www.lsst.org/ ]<br />
*[http://sci.esa.int/gaia/ Link to Gaia at http://sci.esa.int/gaia/]<br />
<br />
<br />
<br />
The University of Louisville and the University of Southern Queensland operate remotely controlled telescopes at Moore Observatory near Brownsboro, Kentucky, at Mount Kent Observatory near Toowoomba, Australia, and on Mt. Lemmon in Arizona. <br />
For more than 10 years we have been acquiring data for research and student use, and we have an archive of many of the bright objects that may interest you. <br />
This semester our telescopes in Kentucky, Arizona, and Australia are operating robotically under the supervision of professional astronomers and can acquire new images to meet your requests as well. <br />
<br />
This week your lab activity is to thoughtfully identify something in the sky that really grabs your interest. It can be anything, but because there are limitations on what we can observe we will make a few suggestions to guide your thoughts. With that, ask some questions yourself -- we are reversing roles here! Why would you like to see this object, and what would you like to learn about it that a telescope image or measurement could reveal?<br />
<br />
<br />
== Available Resources ==<br />
<br />
<br />
The observatories are described on our website at<br />
<br />
[http://sharedskies.org http://sharedskies.org]<br />
<br />
<br />
and you are encouraged to explore it to learn about the observatories and their telescopes. The content is currently under development and may change during the semester.<br />
<br />
The facilities we have access to include:<br />
<br />
* Moore Observatory in the northern hemisphere near Brownsboro, Kentucky<br />
* Mt. Kent Observatory in the southern hemisphere near Toowoomba, Australia<br />
* Mt. Lemmon Observatory in Arizona, near Tucson<br />
<br />
<br />
At the Moore and Mt. Kent observatories we have identical 0.5 meter (20 inch) diameter "CDK20" research telescopes, each operating with a CCD camera and a selection of filters. These are the primary telescopes for this program. We offer other instruments too, and we are developing the capability to have you visit us through the web while the telescopes are in operation and press the "shutter button" yourself. For example, there are: <br />
<br />
* A fast wide field Shared Skies Live telescope at Moore Observatory to take images of stars and nebulae through special filters.<br />
* A special long-focus telescope at Moore Observatory to take images of planets.<br />
* Color cameras to quickly image star patterns, constellations, or comets that may span many degrees.<br />
* Two 0.6 meter (24 inch) "RC24" research telescopes at Moore Observatory and Mt. Lemmon, used primarily to study planets around other stars.<br />
* A 0.7 meter (27 inch) "CDK700" telescope at Mt. Kent Observatory used for spectroscopy, the analysis of starlight, in support of the NASA TESS satellite.<br />
<br />
<br />
We also have a substantial and growing archive of our best images, which we add to often. We will draw from it when we can to satisfy your requests.<br />
<br />
<br />
== What the Telescopes Can Show ==<br />
<br />
<br />
Except for the color cameras used to record wide field images of constellations and the occasional comet, the CCD cameras on the telescopes return scientific images as digital files in “FITS” format. These images may be viewed in your computer's browser with some tools we will provide, or with other software you load on your own computer. Those images require some effort to analyze, but they offer quantitative data on positions and brightnesses. Over the next few weeks you will use some of this software and gain experience with what it can do.<br />
<br />
In many cases we will produce color images from these data that may be useful if you are interested in seeing form or structure, watching craters or shadows on the Moon, following the rotation of Saturn, Jupiter or Mars, or looking for the colors of stars. The color images often are not "true" color but a combination of different filters to highlight some aspects of the objects of interest. You can make them too by working with the original data.<br />
<br />
Except for the special wide field cameras, these telescopes cover a field less than about half a degree across, the apparent size of the Moon. Each pixel resolves about 0.5 arcsecond (there are 3600 arcseconds in a degree), and measures how much light arrived at that spot in the image for the duration of the exposure. By comparing one pixel with another, you can tell how much brighter or fainter one feature of the image is compared to another. The image that you see, while visually exciting, is also a tool to measure how much light there is and where it is.<br />
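To see how these numbers fit together, here is a small arithmetic sketch in Python; the detector width is an assumed value for illustration, not a real camera specification:<br />

```python
# Rough field-of-view arithmetic from the numbers quoted above.
PIXEL_SCALE = 0.5        # arcseconds per pixel, as stated in the text
DETECTOR_WIDTH = 3056    # hypothetical detector width in pixels

fov_arcsec = PIXEL_SCALE * DETECTOR_WIDTH
fov_deg = fov_arcsec / 3600.0   # 3600 arcseconds in a degree
print(f"Field of view: {fov_deg:.2f} degrees")  # about 0.42 degrees, under half a degree
```

The same conversion works in reverse: a feature spanning 2 arcseconds covers about 4 pixels at this scale.<br />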
<br />
Some images may be returned to you with a calibration for the position in the sky, and as you explore them you can measure the celestial coordinates of any point in the image. In this way, you can identify individual stars, clusters, nebulae, and galaxies in the images. You can follow the changing positions of asteroids and satellites of planets. You can spot a new supernova, watch variable stars, measure the separation of double stars, measure the diameter of a distant galaxy, or follow a new comet. <br />
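As a rough illustration of how a position calibration works, the sketch below converts a pixel offset into sky coordinates with a simple linear plate scale. Real FITS images carry a full WCS solution that is far more accurate than this, and the reference pixel and coordinates here are made-up values:<br />

```python
import math

# Linear approximation to a sky calibration: scale in arcseconds per pixel,
# RA and Dec in degrees. All reference values are hypothetical.
def pixel_to_sky(x, y, ref_x=512, ref_y=512,
                 ref_ra=83.82, ref_dec=-5.39, scale=0.5):
    ddec = (y - ref_y) * scale / 3600.0
    # RA increases to the east (usually left in an image), and pixel offsets
    # in RA must be scaled by cos(Dec) to become true angles on the sky.
    dra = -(x - ref_x) * scale / 3600.0 / math.cos(math.radians(ref_dec))
    return ref_ra + dra, ref_dec + ddec

ra, dec = pixel_to_sky(612, 512)
print(f"RA {ra:.4f} deg, Dec {dec:.4f} deg")
```

With a calibration like this, the pixel position of any star in the image can be turned into coordinates you can look up in a catalog.<br />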
<br />
<br />
<br />
== Filters, Sensitivity, and Spatial Resolution ==<br />
<br />
<br />
With the scientific cameras, each image is taken through a filter that isolates a band of the spectrum, removing all but the part of the light we want to record. If you looked through one of these filters, you would see only part of the light collected by the telescope. A "blue" filter would show light our eyes sense as blue, while an "infrared" filter would pass light our eyes cannot see at all. Here’s how we designate the filters that are usually available, by the wavelengths they transmit:<br />
<br />
* Blue-Green: g' (400 to 530 nm)<br />
* Yellow-Red: r' (530 to 700 nm)<br />
* Hydrogen: H-alpha (656 nm)<br />
* Red-Near Infrared: i' (700 to 825 nm)<br />
* Infrared: z' (825 to 1100 nm)<br />
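The band edges in the list above can be captured in a small, purely illustrative helper that reports which filter passes a given wavelength:<br />

```python
# Filter passbands from the list above (nanometers). The helper function
# itself is only an illustration, not part of any observatory software.
FILTER_BANDS = {
    "g'": (400, 530),
    "r'": (530, 700),
    "i'": (700, 825),
    "z'": (825, 1100),
}

def band_for(wavelength_nm):
    """Return the filter whose passband contains this wavelength, or None."""
    for name, (lo, hi) in FILTER_BANDS.items():
        if lo <= wavelength_nm < hi:
            return name
    return None

print(band_for(656))  # the H-alpha line at 656 nm falls inside the r' band
```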
<br />
<br />
The numbers are the wavelengths of light in the spectrum, from blue light at 450 nanometers (nm) to red light at 650 nm. Infrared has a longer wavelength than red, and ultraviolet has a shorter wavelength than blue. Since our telescopes respond well to "near" infrared light, out to 1100 nm, but not well to ultraviolet light, we observe in the bands that are most efficiently detected by the optics and electronics. <br />
<br />
A measurement with different filters allows us to determine the “color” of a star, or we can put images from blue, green, and red together to make a color image that will resemble what you would see if your eyes could detect this faint light. Images through a filter that isolates light emitted by hydrogen gas would show few stars, and if you could "see" through this filter the scene would look dark red and show only the gas. You may read more about all the filters that are available here.<br />
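A color image is assembled by stacking the three filter images as red, green, and blue channels and normalizing for display. The sketch below uses synthetic arrays in place of real filter data:<br />

```python
import numpy as np

# Synthetic stand-ins for images taken through blue, green, and red filters;
# real data would come from the FITS files described above.
blue, green, red = (np.random.rand(64, 64) for _ in range(3))

def to_rgb(r, g, b):
    """Stack three single-filter images into one RGB image scaled to 0..1."""
    rgb = np.stack([r, g, b], axis=-1)
    rgb -= rgb.min()
    rgb /= rgb.max()   # normalize so display software can render it
    return rgb

image = to_rgb(red, green, blue)
print(image.shape)  # (64, 64, 3): one red, green, and blue value per pixel
```

Swapping which filters feed which channels is how "false color" images are made, for example mapping an H-alpha image into the red channel to emphasize glowing hydrogen gas.<br />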
<br />
The faintest stars you will find in most images are about 18th magnitude. These stars are more than 10,000,000 times fainter than the brightest stars in the night sky. Typically the telescopes cover a field of about 1/2 degree, the diameter of the full Moon, but in some cases it can be smaller, to see very fine detail, or larger, to get an entire constellation or a big comet. If the air is steady, the smallest detail you will see is about 1 arcsecond across. For comparison, Jupiter appears about 40 arcseconds across in our sky, and the Andromeda galaxy extends thousands of arcseconds. Some of our planetary and lunar images show detail as small as 0.3 arcseconds, the resolution limit of our telescopes in perfectly stable air.<br />
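The brightness ratio behind that "10,000,000 times fainter" figure follows from the definition of the magnitude scale, in which each magnitude step is a factor of about 2.512 in flux:<br />

```python
# Flux ratio for a magnitude difference: ratio = 10 ** (0.4 * (m_faint - m_bright))
def flux_ratio(m_faint, m_bright):
    return 10 ** (0.4 * (m_faint - m_bright))

# Sirius, the brightest nighttime star, is about magnitude -1.5, so an
# 18th-magnitude star is fainter by a factor of roughly 6e7:
print(f"{flux_ratio(18, -1.5):.2e}")
```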
<br />
<br />
== Objects == <br />
<br />
Within our solar system you could expect to see<br />
<br />
* Solar System<br />
** the occasional bright comet and many very faint ones<br />
** changing phases of the Moon, and its libration<br />
** craters on the Moon and shadows that move during the night<br />
** "earthshine", the light on the dark side of the Moon reflected from Earth<br />
** Venus, Mars, Jupiter, Saturn, Uranus and Neptune moving nightly across the sky with respect to stars<br />
** polar caps and large features on Mars<br />
** satellites of Jupiter, Saturn, Uranus and Neptune moving nightly<br />
** changing atmospheric features on Jupiter and Saturn<br />
** rings of Saturn<br />
** asteroids<br />
** the brighter dwarf planets like nearby Ceres and distant Pluto<br />
<br />
Within our Milky Way galaxy you could see<br />
<br />
* Galactic <br />
** star birth nebulae<br />
** open clusters of young stars<br />
** active stars that erupt and change brightness<br />
** double stars in orbit around one another (though the motion is too slow to see unless you come back several years later)<br />
** variable stars that pulsate, and pairs of eclipsing binary stars that change their brightness periodically<br />
** stars moving slowly across the sky by comparing old and recent images<br />
** planetary nebulae surrounding dying stars<br />
** globular clusters of very old stars<br />
<br />
Beyond our galaxy there are<br />
<br />
* Extragalactic<br />
** nearby companion galaxies like the Large and Small Magellanic Clouds<br />
** clusters of galaxies like those in Virgo and Coma<br />
** active galaxies with black holes in their nuclei<br />
** other fainter galaxies out to distances of over 100 million light years<br />
** quasars out to distances of billions of light years<br />
** the occasional new supernova in a distant galaxy<br />
<br />
However, what is available to see depends on the time of year (where Earth is in its orbit), where the planets are in their orbits, and whether you are using a telescope in the northern or southern hemisphere.<br />
<br />
<br />
<br />
== How to Proceed ==<br />
<br />
<br />
If you are in the astronomy lab on campus and working in a small group (usually 3 students), you may decide as a group what to request. This is easiest for us too, since we have fewer requests to handle that way. However, if you prefer, you may work on your own.<br />
<br />
This project has two parts that are graded separately. The first part, which you need to do this week, is to define a problem and propose an observation. After that, we will find data for you in our archive, or even acquire new data with our telescopes over the following weeks. Once we have something useful, we will provide it to you and, where you need it, assist you with understanding what it has to offer. We expect to have data back to you in early April, and the last activity of this semester will be to tell us what you can find in it. When the data are available there will be an announcement, probably a comment on the discussion forum, and the last activity page will open up on the class website too.<br />
<br />
* '''Step 1: Decide What to Observe'''<br />
<br />
<br />
The first big step is to decide what you would like to do. Almost always, this is a very challenging exercise because there is so much to choose from. There is no right answer. Here is a guide to help you think about it. Also, before submitting your request, we welcome a discussion online or by email if you need advice.<br />
<br />
'''What would I like to know more about that I could expect to “see” with one of these telescopes?'''<br />
<br />
Our telescopes can provide data allowing you to measure how stars vary in brightness ("variable stars"), to follow the motions of the satellites of Jupiter, Saturn, Uranus, and Neptune, to track asteroids, and to capture the latest new supernova or recently discovered comet. When the skies are dark and the Moon is not out, they can record faint nebulae and distant galaxies too. We offer views of the clouds of Jupiter and the rings and atmosphere of Saturn in better detail by selecting the very best images that are not blurred by Earth's atmosphere. Our wide field telescopes show faint nebulae and star clusters that span several degrees on the sky.<br />
<br />
Use your imagination!! Satisfy your curiosity by selecting an object or objects, and the sort of data you would like to have on it. Once you have made a decision, you have to submit your request so that we know what you want, and you have to answer questions about your request.<br />
<br />
<br />
First, check that the object of interest is visible to us now, unless you want data we already have. Indeed, our best planetary images are ones we have selected from among thousands taken, so even if a planet is not favorably placed now we can probably offer something. However, the current choices in our own solar system are:<br />
<br />
* Observable Solar System Objects<br />
** Venus, still visible in the evening sky and rapidly moving into line with the Sun. <br />
** Jupiter, visible in the morning sky, and increasingly well placed to observe as we go into the spring season, has beautiful bands and changing detail, as well as bright satellites.<br />
** Moon, as always, spectacular in detail and visible to us almost any night, with various craters, mountains, and maria<br />
** An occasional comet. Bright ones close to the Sun are difficult to work with, but fainter ones with obscure names that do not make the news are almost always around<br />
** Bright asteroids. Since there are thousands with known orbits, there are always many we can follow as they move. Of course, no surface detail is observable with our telescopes.<br />
<br />
You might use Stellarium, for example, to see what is in the sky now and later this spring, or the on-line tool [http://in-the-sky.org in-the-sky.org]. [http://www.sky-map.org/ Sky-Map] and [http://aladin.u-strasbg.fr/AladinLite/ Aladin-Lite] will let you explore images and data from professional observatories on the web. You could also use Google simply to search for more information about your proposed target. If your request is inappropriate for what we can do, we will work with you to help refine your selection.<br />
<br />
We will make the selection of the telescope and other resources for you based on what you tell us about your request. For example, if it is well below the celestial equator and seen only in the southern hemisphere we will use a telescope at Mt. Kent, and if it is better seen from the northern hemisphere, we will use one of the telescopes at Moore Observatory in Kentucky. The Moon, Jupiter and Saturn are best recorded with the special purpose planetary telescope at Moore Observatory. Objects that cover a wide area of sky may be better seen in the wide field telescope or even the color cameras, but most requests will be directed to the 20-inch "CDK" telescopes. You are welcome to request a specific telescope and if it is available and appropriate we will try to use it for your data.<br />
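The site-selection rule described above can be sketched as a simple function of an object's declination; the thresholds below are illustrative assumptions, not our actual scheduling logic:<br />

```python
# Hypothetical sketch of choosing an observing site by declination (degrees).
# The +/-30 degree cutoffs are illustrative, not real scheduling rules.
def choose_site(declination_deg):
    if declination_deg < -30:    # well below the celestial equator
        return "Mt. Kent (southern hemisphere)"
    if declination_deg > 30:     # well above the celestial equator
        return "Moore Observatory (northern hemisphere)"
    return "either site"         # near the equator, both sites can observe it

print(choose_site(-70))  # a deep southern object goes to Mt. Kent
```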
<br />
<br />
* '''Step 2: Register Your Request''' <br />
<br />
Complete the simple form at the link below.<br />
<br />
<br />
[https://docs.google.com/forms/d/e/1FAIpQLSftQ4tmet7KEqVbofOas7DxGRclERBHfPPZCMkXYZEs__JYKw/viewform?usp=sf_link Telescope Data Request]<br />
<br />
<br />
We use your responses to schedule our telescopes, so please complete it while you are in the lab. Make a note of your request and of your responses, since they will help you complete the second part later. If you are working with others we only need one on-line request. In the comments section, if you note who is working together it will simplify sorting out the requests and responses later.<br />
<br />
<br />
<br />
<br />
For this lab we need the following from each student.<br />
<br />
<br />
* '''Step 3: Respond on the Answer Sheet for This Lab'''<br />
** The object you have selected, with an identification we can use with the telescopes.<br />
** A brief statement of what that object is.<br />
** Where is it in the sky? Correct celestial coordinates are an ideal response.<br />
** Is it observable now during the night from either the southern or northern hemisphere?<br />
** Which telescope do you think would be useful for this? Explain your answer, or if you do not know which telescope, tell us what would affect the choice based on your object.<br />
** A concise statement of what you expect to learn about your selected object from whatever data we can offer.<br />
** Your preferred email address so that we can respond to you if needed.<br />
** The names of the other students working with you on the same object.<br />
<br />
For this lab it will help us a lot if you put your '''preferred email address''' on the form you submit to the assistant, and list the names of those who are working with you. Should groups re-organize later in the semester or you find yourself working with someone else, this will ensure we can get data to you.<br />
<br />
After you have submitted the request, each student should answer the questions asked here on the usual lab response form and give it to the lab assistant. The assistant may have an immediate suggestion about your choice, and you should discuss it before you get to this last step. Remember that we need both the answers from you on the usual lab sheet and the completed web submission from your group while you are in class today.<br />
<br />
<br />
== What Happens Later ==<br />
<br />
<br />
Once your data are available we will provide a link from which you can download images, and there will be another activity in class for you to use in completing the work. Expect it a few weeks before the end of the semester, or earlier if the weather is good to us.</div>
<hr />
<div>== Remote Telescope Data ==<br />
<br />
''Please allow time for us to respond to your request.''<br />
''This activity should be done before midterm and the second part will be at the end of the semester.''<br />
<br />
<br />
Today’s computer technology, modern optics, and electronics allow astronomers and students to "see" the sky without enduring long freezing nights looking through an eyepiece. The data we acquire are quantitative measurements of how much light arrives from fields of view that can be tiny and cover only a few stars, or vast, and cover constellations. They can tell you the positions of moving objects, the variations in their brightness, their colors, details of the structure, and how they interact or relate with nearby companions. Modern astronomy is evolving in the development of a new class of very large telescopes that by mid-2020's will have public data over the entire sky updated almost nightly. Smaller telescopes and instruments in space follow up on the discoveries offered by these big telescopes with closer inspection and careful analysis.<br />
<br />
If you are interested in this future world, you can read more at websites for Gaia, a new satellite beginning to return amazing measurements of distances throughout our own galaxy, and of LSST, a "synoptic survey" telescope under construction in Chile.<br />
<br />
*[https://www.lsst.org/ Link to LSST at https://www.lsst.org/ ]<br />
*[http://sci.esa.int/gaia/ Link to Gaia at http://sci.esa.int/gaia/]<br />
<br />
<br />
<br />
The University of Louisville and the University of Southern Queensland operate remotely controlled telescopes at Moore Observatory near Brownsboro, Kentucky, at Mount Kent Observatory near Toowoomba, Australia, and on Mt. Lemmon in Arizona. <br />
For more than 10 years we have been acquiring data for research and student use, and we have an archive of many of the bright objects that may interest you. <br />
This semester our telescopes in Kentucky, Arizona, and Australia are operating robotically under the supervision of professional astronomers and can also acquire new images to meet your requests as well. <br />
<br />
This week your lab activity is to thoughtfully identify something in the sky that really grabs your interest. It can be anything, but because there are limitations on what we can observe we will make a few suggestions to guide your thoughts. With that, ask some questions yourself -- we are reversing roles here! Why would you like to see this object, and what would you like to learn about it that a telescope image or measurement could reveal?<br />
<br />
<br />
== Available Resources ==<br />
<br />
<br />
The observatories are described on our website at<br />
<br />
[http://sharedskies.org http://sharedskies.org]<br />
<br />
<br />
and you are encouraged to explore it to see about the observatories and their telescopes. The content is currently under development, and may change during the semester.<br />
<br />
The facilities we have access to include:<br />
<br />
* Moore Observatory in the northern hemisphere near Brownsboro, Kentucky<br />
* Mt. Kent Observatory in the southern hemisphere near Toowoomba, Australia<br />
* Mt. Lemmon Observatory in Arizona, near Tucson<br />
<br />
<br />
At both observatories we have identical 0.5 meter (20 inch) diameter "CDK20" research telescopes operating with a CCD camera and a selection of filters. These are the primary telescopes for this program. We offer other instruments too, and we are developing the capability to have you visit us through the web when the telescopes are in operation and to press the "shutter button" yourself. For example there are <br />
<br />
* A fast wide field Shared Skies Live telescope at Moore Observatory to take images of stars and nebulae through special filters.<br />
* A special long-focus telescope at Moore Observatory to take images of planets<br />
* Color cameras to quickly image star patterns, constellations, or comets that may span many degrees.<br />
* Two 0.6 meter (24 inch) "RC24" research telescope at Moore Observatory and Mt. Lemmon are used primarily to study planets around other stars<br />
* A 0.7 meter (27 inch) "CDK700" telescope at Mt. Kent observatory that will be used for spectroscopy, the analysis of starlight, for assisting the new NASA TESS satellite that is to launch soon.<br />
<br />
<br />
Also we have a substantial improving archive of the best images that we add to often. We will draw from it when we can to satisfy your requests.<br />
<br />
<br />
== What the Telescopes Can Show ==<br />
<br />
<br />
Except the color cameras used to record wide field images of constellations and the occasional comet, the CCD cameras on the telescopes return scientific images as digital files in “FITS” format. These images may be viewed in your computer's browser with some tools we will provide, or with other software you load on your own computer. those images require some effort to analyze, but they offer quantitative data on positions and brightnesses. Over the next few weeks you will be using some of the software that is needed and gain experience with what it can do.<br />
<br />
In many cases we will produce color images from these data that may be useful if you are interested in seeing form or structure, watching craters or shadows on the Moon, following the rotation of Saturn, Jupiter or Mars, or looking for the colors of stars. The color images often are not "true" color but a combination of different filters to highlight some aspects of the objects of interest. You can make them too by working with the original data.<br />
<br />
Except for the special wide field cameras, these telescopes cover a field less than about half a degree across, the apparent size of the Moon. Each pixel resolves about 0.5 arcsecond (there are 3600 arcseconds in a degree), and measures how much light arrived at that spot in the image for the duration of the exposure. By comparing one pixel with another, you can tell how much brighter or fainter one feature of the image is compared to another. The image that you see, while visually exciting, is also a tool to measure how much light there is and where it is.<br />
<br />
Some images may be returned to you with a calibration for the position in the sky, and as you explore them you can measure the celestial coordinates of any point in the image. In this way, you can identify individual stars, clusters, nebulae, and galaxies in the images. You can follow the changing positions of asteroids and satellites of planets. You can spot a new supernova, watch variable stars, measure the separation of double stars, measure the diameter of a distant galaxy, or follow a new comet. <br />
<br />
<br />
<br />
== Filters, Sensitivity, and Spatial Resolution ==<br />
<br />
<br />
With the scientific cameras each image is taken through a filter that isolates a narrow band of the spectrum by passing the light through a filter that removes all but the part we want to record. If you looked through one of the filters, you would see only part of the light that is collected by the telescope. A "blue" filter would should light our eyes sense as blue, while an "infrared" filter would should light our eyes cannot see at all. Here’s how we designate the filters that are usually available by the wavelengths they transmit:<br />
<br />
* Blue-Green: g' (400 to 530 nm)<br />
* Yellow-Red: r'(530 to 700 nm)<br />
* Hydrogen: H-alpha (656 nm)<br />
* Red-Near Infrared: i' (700 to 825 nm)<br />
* Infrared: z' (825 1100 nm)<br />
<br />
<br />
The numbers are the wavelength of light in the spectrum, from blue light at 450 nanometers (nm) to red light at 650 nm (nm). Infrared has a longer wavelength than red, and ultraviolet has a shorter wavelength than blue. Since our telescopes respond well to "near" infrared light, out to 1100 nm, but not well to ultraviolet light, we observe in the bands that are most efficiently detected by the optics and electronics. <br />
<br />
A measurement with different filters allows us to determine the “color” of a star, or we can put images from blue, green, and red together to make a color image that will resemble what you would see if your eyes could detect this faint light. Images through a filter that isolates light emitted by hydrogen gas would show few stars, and if you could "see" through this filter the scene would look dark red and show only the gas. You may read more about all the filters that are available here.<br />
<br />
The faintest stars you will find in most images are about 18th magnitude. These stars are more than 10,000,000 times fainter that the brightest stars in the night sky. Typically the telescoped cover a field of about 1/2 degree, the diameter of the full Moon, but in some cases it can be smaller to see very fine detail,or larger, to get an entire constellation or a big comet. If the air is steady, the smallest detail you will see is about 1 arcsecond across. For comparison, Jupiter appears about 40 arcseconds across in our sky, and the Andromeda galaxy extends thousands of arcseconds. Some of our planetary and lunar images show detail as small as 0.3 arcseconds, the resolution limit of our telescopes in perfectly stable air.<br />
<br />
<br />
== Objects == <br />
<br />
Within our solar system you could expect to see<br />
<br />
* Solar System<br />
** the occasional bright comet and many very faint ones<br />
** changing phases of the Moon, and its libration<br />
** craters on the Moon and shadows that move during the night<br />
** "earthshine", the light on the dark side of the Moon reflected from Earth<br />
** Venus, Mars, Jupiter, Saturn, Uranus and Neptune moving nightly across the sky with respect to stars<br />
** polar caps and large features on Mars<br />
** satellites of Jupiter, Saturn, Uranus and Neptune moving nightly<br />
** changing atmospheric features on Jupiter and Saturn<br />
** rings of Saturn<br />
** asteroids<br />
** the brighter dwarf planets like nearby Ceres and distant Pluto<br />
<br />
Within our Milky Way galaxy you could see<br />
<br />
* Galactic <br />
** star birth nebulae<br />
** open clusters of young stars<br />
** active stars that erupt and change brightness<br />
** double stars in orbit around one another (but not so fast that you would see the motion) unless you come back several years later<br />
** variable stars that pulsate, and pairs of eclipsing binary stars that change their brightness periodically<br />
** stars moving slowly across the sky by comparing old and recent images<br />
** planetary nebulae surrounding dying stars<br />
** globular clusters of very old stars<br />
<br />
Beyond our galaxy there are<br />
<br />
* Extragalactic<br />
** nearby companion galaxies like the Large and Small Magellanic Clouds<br />
** clusters of galaxies like those in Virgo and Coma<br />
** active galaxies with blackholes in their nuclei<br />
** other fainter galaxies out to distances of over 100 million light years<br />
** quasars out to distances of billions of light years<br />
** the occasional new supernova in a distant galaxy<br />
<br />
However, what is available to see depends on the time of year (where Earth is in its orbit), where the planets are in their orbits, and whether you are using a telescope in the northern or southern hemisphere.<br />
<br />
<br />
<br />
== How to Proceed ==<br />
<br />
<br />
If you are in the astronomy lab on campus and working in a small group (usually 3 students), you may make a decision as a group about what to request. This is easiest for us too, since we have fewer requests to handle that way. However, if you prefer you may work on your own too.<br />
<br />
This project has two parts that are graded separately. The first part you need to do this week is to define a problem and propose an observation. After that, we will find data for you in our archive, or even acquire data with our telescopes over the following weeks. Once we have something useful, we will provide it to you and, where you need it, assist you with understanding what it has to offer. Our expectation is to have data back to you in early April, and the last activity of this semester will be to tell us what you can find in it. When the data are available there will be an announcement, probably a comment on the discussion forum, and the last activity page will open up on the class website too.<br />
<br />
* Step one: Decide What to Observe<br />
<br />
<br />
The first big step is to decide what you would like to do. Almost always, this is a very challenging exercise because there is so much to chose from. There is no right answer. Here is a guide to help you think about it. Also, before submitting your request, we welcome a discussion on line or email if you need advice.<br />
<br />
'''What would I like to know more about that I could expect to “see” with one of these telescopes?'''<br />
<br />
Our telescopes can provide data allowing you to measure how stars vary in brightness ("variable stars"), to follow the motions of satellites of Jupiter, Saturn, Uranus, and Neptune, track asteroids, and capture the latest new supernova or recently discovered comet. When the skies are dark and the Moon is not out, they can record faint nebulae and distant galaxies too. We offer views of the clouds of Jupiter and the rings and atmosphere of Saturn in better detail by selecting the very best images that are not blurred by Earth's atmosphere. Our wide field telescopes show faint nebulae and star clusters that span several degrees on the sky.<br />
<br />
Use your imagination!! Satisfy your curiosity by selecting an object or objects, and the sort of data you would like to have on it. Once you have made a decision, you have to submit your request so that we know what you want, and you have to answer questions about your request.<br />
<br />
<br />
First, check that the object of interest is visible to us now unless you want what we already have. Indeed, our best planetary images are ones we have selected from among thousands taken, so even if a planet is not favorably placed now we can probably offer something. However currently the choices in our own solar system are<br />
<br />
* Observable Solar System Objects<br />
** Venus, still visible in the evening sky and rapidly moving closer and in line with the Sun. <br />
** Jupiter, visible in the morning sky, and increasingly well placed to observe as we go into the spring season, has beautiful bands and changing detail, as well as bright satellites.<br />
** Moon, as always, spectacular in detail and visible to us almost any night with various craters, mountains and mare<br />
** An occasional comet. Ones close to the sun and bright are difficult to work with, but fainter ones with obscure names that do not make the news are almost always around<br />
** Bright asteroids. Since there are thousands we have orbits for, there are always many we can follow as they move. Of course no detail is observable with a telescope.<br />
<br />
You might use Stellarium, for example, to see what is in the sky now and later this spring, or the on-line tool [http://n-the-sky.org n-the-sky.org]. [http://www.sky-map.org/ Sky-Map] and [http://aladin.u-strasbg.fr/AladinLite/ Aladin-Lite] will let you explore images and data from professional observatories on the web. You could also use Google simply to search for more information about your proposed target. If your request is inappropriate for what we can do, we will work with you to help refine your selection.<br />
<br />
We will make the selection of the telescope and other resources for you based on what you tell us about your request. For example, if it is well below the celestial equator and seen only in the southern hemisphere we will use a telescope at Mt. Kent, and if it is better seen from the northern hemisphere, we will use one of the telescopes at Moore Observatory in Kentucky. The Moon, Jupiter and Saturn are best recorded with the special purpose planetary telescope at Moore Observatory. Objects that cover a wide area of sky may be better seen in the wide field telescope or even the color cameras, but most requests will be directed to the 20-inch "CDK" telescopes. You are welcome to request a specific telescope and if it is available and appropriate we will try to use it for your data.<br />
<br />
<br />
* '''Step 2: Register Your Request''' <br />
<br />
Complete the simple form at the link below.<br />
<br />
[https://docs.google.com/forms/d/e/1FAIpQLSftQ4tmet7KEqVbofOas7DxGRclERBHfPPZCMkXYZEs__JYKw/viewform?usp=sf_link Telescope Data Request Form] on Google Forms<br />
<br />
We use your responses to schedule our telescopes, so please complete it while you are in the lab. Make a note to yourself of your request, and keep some notes about what you responded since they will help you complete the second part later. If you are working with others we only need one on-line request. In the comments section, if you note who is working together it will simplify sorting out the requests and responses later.<br />
<br />
<br />
<br />
<br />
For this lab we need the following from each student.<br />
<br />
<br />
* '''Step 3: Respond on the Answer Sheet for This Lab''' <br />
** Object you have selected by an identification we can use with the telescopes.<br />
** A brief statement of what that object is.<br />
** Where is it in the sky? Correct celestial coordinates are an ideal response.<br />
** Is it observable now during the night from either the southern or northern hemisphere?<br />
** Which telescope do you think would be useful for this? Explain your answer, or if you do not know which telescope, tell us what would affect the choice of telescope based on your choice of object.<br />
** Provide a concise statement of what you expect to learn about your selected object from whatever data we can offer.<br />
** Your preferred email address so that we can respond to you if needed<br />
** The names of the other students working with you on the same object<br />
<br />
For this lab it will help us a lot if you put your '''preferred email address''' on the form you submit to the assistant, and list the names of those who are working with you on this. Should groups re-organize later in the semester or you find yourself working with someone else, this will ensure we can get data to you.<br />
<br />
After you have submitted the request, each student should answer the questions asked here on the usual lab response form and give them to the lab assistant. The assistant may have an immediate suggestion about your choice, and you should discuss it before you get to this last step. Remember that we need both the answers from you on the usual lab sheet and the completed web submission from your group while you are in class today.<br />
<br />
<br />
== What Happens Later ==<br />
<br />
<br />
Once your data are available we will provide a link from which you can download images and there will be another activity in class for you to use in completing the work. Expect it a few weeks before the end of the semester, or before if the weather is good to us.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Remote_Telescope_Requests&diff=2552Remote Telescope Requests2018-08-27T17:43:57Z<p>WikiSysop: </p>
<hr />
<div>== Remote Telescope Data ==<br />
<br />
''Please allow time for us to respond to your request.''<br />
''This activity should be done before midterm, and the second part will be at the end of the semester.''<br />
<br />
<br />
Today’s computer technology, modern optics, and electronics allow astronomers and students to "see" the sky without enduring long freezing nights looking through an eyepiece. The data we acquire are quantitative measurements of how much light arrives from fields of view that can be tiny and cover only a few stars, or vast and cover constellations. They can tell you the positions of moving objects, the variations in their brightness, their colors, details of their structure, and how they interact with nearby companions. Modern astronomy is evolving with the development of a new class of very large telescopes that by the mid-2020s will provide public data over the entire sky, updated almost nightly. Smaller telescopes and instruments in space follow up on the discoveries offered by these big telescopes with closer inspection and careful analysis.<br />
<br />
If you are interested in this future world, you can read more at websites for Gaia, a new satellite beginning to return amazing measurements of distances throughout our own galaxy, and of LSST, a "synoptic survey" telescope under construction in Chile.<br />
<br />
*[https://www.lsst.org/ Link to LSST at https://www.lsst.org/ ]<br />
*[http://sci.esa.int/gaia/ Link to Gaia at http://sci.esa.int/gaia/]<br />
<br />
<br />
<br />
The University of Louisville and the University of Southern Queensland operate remotely controlled telescopes at Moore Observatory near Brownsboro, Kentucky, at Mount Kent Observatory near Toowoomba, Australia, and on Mt. Lemmon in Arizona. <br />
For more than 10 years we have been acquiring data for research and student use, and we have an archive of many of the bright objects that may interest you. <br />
This semester our telescopes in Kentucky, Arizona, and Australia are operating robotically under the supervision of professional astronomers, and they can also acquire new images to meet your requests. <br />
<br />
This week your lab activity is to thoughtfully identify something in the sky that really grabs your interest. It can be anything, but because there are limitations on what we can observe we will make a few suggestions to guide your thoughts. With that, ask some questions yourself -- we are reversing roles here! Why would you like to see this object, and what would you like to learn about it that a telescope image or measurement could reveal?<br />
<br />
<br />
== Available Resources ==<br />
<br />
<br />
The observatories are described on our website at<br />
<br />
[http://sharedskies.org http://sharedskies.org]<br />
<br />
<br />
and you are encouraged to explore it to learn about the observatories and their telescopes. The content is currently under development, and may change during the semester.<br />
<br />
The facilities we have access to include:<br />
<br />
* Moore Observatory in the northern hemisphere near Brownsboro, Kentucky<br />
* Mt. Kent Observatory in the southern hemisphere near Toowoomba, Australia<br />
* Mt. Lemmon Observatory in Arizona, near Tucson<br />
<br />
<br />
At the Moore and Mt. Kent observatories we have identical 0.5 meter (20 inch) diameter "CDK20" research telescopes operating with a CCD camera and a selection of filters. These are the primary telescopes for this program. We offer other instruments too, and we are developing the capability to have you visit us through the web when the telescopes are in operation and to press the "shutter button" yourself. For example, there are:<br />
<br />
* A fast wide field Shared Skies Live telescope at Moore Observatory to take images of stars and nebulae through special filters.<br />
* A special long-focus telescope at Moore Observatory to take images of planets<br />
* Color cameras to quickly image star patterns, constellations, or comets that may span many degrees.<br />
* Two 0.6 meter (24 inch) "RC24" research telescopes, at Moore Observatory and Mt. Lemmon, used primarily to study planets around other stars.<br />
* A 0.7 meter (27 inch) "CDK700" telescope at Mt. Kent observatory that will be used for spectroscopy, the analysis of starlight, to assist the new NASA TESS satellite that is to launch soon.<br />
<br />
<br />
We also have a substantial and growing archive of the best images, which we add to often. We will draw from it when we can to satisfy your requests.<br />
<br />
<br />
== What the Telescopes Can Show ==<br />
<br />
<br />
Except for the color cameras used to record wide field images of constellations and the occasional comet, the CCD cameras on the telescopes return scientific images as digital files in “FITS” format. These images may be viewed in your computer's browser with some tools we will provide, or with other software you load on your own computer. Those images require some effort to analyze, but they offer quantitative data on positions and brightnesses. Over the next few weeks you will be using some of the software that is needed and gaining experience with what it can do.<br />
<br />
In many cases we will produce color images from these data that may be useful if you are interested in seeing form or structure, watching craters or shadows on the Moon, following the rotation of Saturn, Jupiter or Mars, or looking for the colors of stars. The color images often are not "true" color but a combination of different filters to highlight some aspects of the objects of interest. You can make them too by working with the original data.<br />
<br />
Except for the special wide field cameras, these telescopes cover a field less than about half a degree across, the apparent size of the Moon. Each pixel resolves about 0.5 arcsecond (there are 3600 arcseconds in a degree), and measures how much light arrived at that spot in the image for the duration of the exposure. By comparing one pixel with another, you can tell how much brighter or fainter one feature of the image is compared to another. The image that you see, while visually exciting, is also a tool to measure how much light there is and where it is.<br />
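As a sketch of the arithmetic in this paragraph (the half degree field and 0.5 arcsecond pixel scale are the approximate values quoted above; the two pixel values are invented purely for illustration):<br />

```python
# Field of view and pixel scale quoted in the text (approximate).
ARCSEC_PER_DEG = 3600

field_deg = 0.5        # field of view, degrees
pixel_scale = 0.5      # arcseconds per pixel

pixels_across = field_deg * ARCSEC_PER_DEG / pixel_scale
print(pixels_across)   # about 3600 pixels span the image

# Each pixel records how much light arrived there, so comparing two
# pixels gives a direct brightness ratio between features.
pixel_a, pixel_b = 12000.0, 3000.0
print(pixel_a / pixel_b)   # feature A is 4 times brighter than feature B
```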
<br />
Some images may be returned to you with a calibration for the position in the sky, and as you explore them you can measure the celestial coordinates of any point in the image. In this way, you can identify individual stars, clusters, nebulae, and galaxies in the images. You can follow the changing positions of asteroids and satellites of planets. You can spot a new supernova, watch variable stars, measure the separation of double stars, measure the diameter of a distant galaxy, or follow a new comet. <br />
<br />
<br />
<br />
== Filters, Sensitivity, and Spatial Resolution ==<br />
<br />
<br />
With the scientific cameras, each image is taken through a filter that isolates a band of the spectrum, removing all but the part we want to record. If you looked through one of these filters, you would see only part of the light that is collected by the telescope. A "blue" filter would show light our eyes sense as blue, while an "infrared" filter would show light our eyes cannot see at all. Here’s how we designate the filters that are usually available by the wavelengths they transmit:<br />
<br />
* Blue-Green: g' (400 to 530 nm)<br />
* Yellow-Red: r'(530 to 700 nm)<br />
* Hydrogen: H-alpha (656 nm)<br />
* Red-Near Infrared: i' (700 to 825 nm)<br />
* Infrared: z' (825 to 1100 nm)<br />
<br />
<br />
The numbers are the wavelengths of light in the spectrum, from blue light at 450 nanometers (nm) to red light at 650 nm. Infrared has a longer wavelength than red, and ultraviolet has a shorter wavelength than blue. Since our telescopes respond well to "near" infrared light, out to 1100 nm, but not well to ultraviolet light, we observe in the bands that are most efficiently detected by the optics and electronics. <br />
<br />
A measurement with different filters allows us to determine the “color” of a star, or we can put images from blue, green, and red together to make a color image that will resemble what you would see if your eyes could detect this faint light. Images through a filter that isolates light emitted by hydrogen gas would show few stars, and if you could "see" through this filter the scene would look dark red and show only the gas. You may read more about all the filters that are available here.<br />
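The filter passbands listed above can be written as a small lookup table; the ''filters_passing'' helper below is hypothetical, just to show how a wavelength maps onto the broad-band filters (H-alpha is a narrow line filter at 656 nm and is omitted from the table):<br />

```python
# Broad-band passbands from the list above, in nanometers.
FILTERS = {
    "g'": (400, 530),   # blue-green
    "r'": (530, 700),   # yellow-red
    "i'": (700, 825),   # red-near infrared
    "z'": (825, 1100),  # infrared
}

def filters_passing(wavelength_nm):
    """Return the broad-band filters whose passband contains the wavelength."""
    return [name for name, (lo, hi) in FILTERS.items()
            if lo <= wavelength_nm <= hi]

print(filters_passing(656))   # the H-alpha line falls inside the r' band
```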
<br />
The faintest stars you will find in most images are about 18th magnitude. These stars are more than 10,000,000 times fainter than the brightest stars in the night sky. Typically the telescopes cover a field of about 1/2 degree, the diameter of the full Moon, but in some cases it can be smaller, to see very fine detail, or larger, to get an entire constellation or a big comet. If the air is steady, the smallest detail you will see is about 1 arcsecond across. For comparison, Jupiter appears about 40 arcseconds across in our sky, and the Andromeda galaxy extends thousands of arcseconds. Some of our planetary and lunar images show detail as small as 0.3 arcseconds, the resolution limit of our telescopes in perfectly stable air.<br />
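The "more than 10,000,000 times fainter" figure follows from the magnitude scale, on which a difference of dm magnitudes corresponds to a brightness ratio of 10^(0.4 dm); a quick check:<br />

```python
# On the magnitude scale, 5 magnitudes is exactly a factor of 100 in
# brightness, so a difference of dm magnitudes is a ratio of 10**(0.4 * dm).
def brightness_ratio(dm):
    return 10 ** (0.4 * dm)

# An 18th magnitude star compared with a 0th magnitude bright star:
ratio = brightness_ratio(18)
print(ratio > 10_000_000)   # True: about 16 million times fainter
```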
<br />
<br />
== Objects == <br />
<br />
Within our solar system you could expect to see<br />
<br />
* Solar System<br />
** the occasional bright comet and many very faint ones<br />
** changing phases of the Moon, and its libration<br />
** craters on the Moon and shadows that move during the night<br />
** "earthshine", the light on the dark side of the Moon reflected from Earth<br />
** Venus, Mars, Jupiter, Saturn, Uranus and Neptune moving nightly across the sky with respect to stars<br />
** polar caps and large features on Mars<br />
** satellites of Jupiter, Saturn, Uranus and Neptune moving nightly<br />
** changing atmospheric features on Jupiter and Saturn<br />
** rings of Saturn<br />
** asteroids<br />
** the brighter dwarf planets like nearby Ceres and distant Pluto<br />
<br />
Within our Milky Way galaxy you could see<br />
<br />
* Galactic <br />
** star birth nebulae<br />
** open clusters of young stars<br />
** active stars that erupt and change brightness<br />
** double stars in orbit around one another (though not so fast that you would see the motion unless you come back several years later)<br />
** variable stars that pulsate, and pairs of eclipsing binary stars that change their brightness periodically<br />
** stars moving slowly across the sky by comparing old and recent images<br />
** planetary nebulae surrounding dying stars<br />
** globular clusters of very old stars<br />
<br />
Beyond our galaxy there are<br />
<br />
* Extragalactic<br />
** nearby companion galaxies like the Large and Small Magellanic Clouds<br />
** clusters of galaxies like those in Virgo and Coma<br />
** active galaxies with black holes in their nuclei<br />
** other fainter galaxies out to distances of over 100 million light years<br />
** quasars out to distances of billions of light years<br />
** the occasional new supernova in a distant galaxy<br />
<br />
However, what is available to see depends on the time of year (where Earth is in its orbit), where the planets are in their orbits, and whether you are using a telescope in the northern or southern hemisphere.<br />
<br />
<br />
<br />
== How to Proceed ==<br />
<br />
<br />
If you are in the astronomy lab on campus and working in a small group (usually 3 students), you may make a decision as a group about what to request. This is easiest for us too, since we have fewer requests to handle that way. However, if you prefer you may work on your own too.<br />
<br />
This project has two parts that are graded separately. The first part you need to do this week is to define a problem and propose an observation. After that, we will find data for you in our archive, or even acquire data with our telescopes over the following weeks. Once we have something useful, we will provide it to you and, where you need it, assist you with understanding what it has to offer. Our expectation is to have data back to you in early April, and the last activity of this semester will be to tell us what you can find in it. When the data are available there will be an announcement, probably a comment on the discussion forum, and the last activity page will open up on the class website too.<br />
<br />
* '''Step 1: Decide What to Observe'''<br />
<br />
<br />
The first big step is to decide what you would like to do. Almost always, this is a very challenging exercise because there is so much to choose from. There is no right answer. Here is a guide to help you think about it. Also, before submitting your request, we welcome a discussion online or by email if you need advice.<br />
<br />
'''What would I like to know more about that I could expect to “see” with one of these telescopes?'''<br />
<br />
Our telescopes can provide data allowing you to measure how stars vary in brightness ("variable stars"), to follow the motions of satellites of Jupiter, Saturn, Uranus, and Neptune, track asteroids, and capture the latest new supernova or recently discovered comet. When the skies are dark and the Moon is not out, they can record faint nebulae and distant galaxies too. We offer views of the clouds of Jupiter and the rings and atmosphere of Saturn in better detail by selecting the very best images that are not blurred by Earth's atmosphere. Our wide field telescopes show faint nebulae and star clusters that span several degrees on the sky.<br />
<br />
Use your imagination! Satisfy your curiosity by selecting an object or objects and the sort of data you would like to have on it. Once you have made a decision, submit your request so that we know what you want, and answer the questions about your request.<br />
<br />
<br />
First, check that the object of interest is visible to us now, unless you want data we already have. Indeed, our best planetary images are ones we have selected from among thousands taken, so even if a planet is not favorably placed now we can probably offer something. However, currently the choices in our own solar system are:<br />
<br />
* Observable Solar System Objects<br />
** Venus, still visible in the evening sky and rapidly moving closer to and in line with the Sun. <br />
** Jupiter, visible in the morning sky, and increasingly well placed to observe as we go into the spring season, has beautiful bands and changing detail, as well as bright satellites.<br />
** Moon, as always, spectacular in detail and visible to us almost any night, with various craters, mountains, and maria.<br />
** An occasional comet. Comets that are close to the Sun and bright are difficult to work with, but fainter ones with obscure names that do not make the news are almost always around.<br />
** Bright asteroids. Since there are thousands we have orbits for, there are always many we can follow as they move. Of course, no surface detail is observable with a telescope; asteroids appear as moving points of light.<br />
<br />
You might use Stellarium, for example, to see what is in the sky now and later this spring, or the on-line tool [https://in-the-sky.org in-the-sky.org]. [http://www.sky-map.org/ Sky-Map] and [http://aladin.u-strasbg.fr/AladinLite/ Aladin-Lite] will let you explore images and data from professional observatories on the web. You could also simply use Google to search for more information about your proposed target. If your request is inappropriate for what we can do, we will work with you to help refine your selection.<br />
<br />
We will make the selection of the telescope and other resources for you based on what you tell us about your request. For example, if it is well below the celestial equator and seen only in the southern hemisphere we will use a telescope at Mt. Kent, and if it is better seen from the northern hemisphere, we will use one of the telescopes at Moore Observatory in Kentucky. The Moon, Jupiter and Saturn are best recorded with the special purpose planetary telescope at Moore Observatory. Objects that cover a wide area of sky may be better seen in the wide field telescope or even the color cameras, but most requests will be directed to the 20-inch "CDK" telescopes. You are welcome to request a specific telescope and if it is available and appropriate we will try to use it for your data.<br />
<br />
<br />
* '''Step 2: Register Your Request''' <br />
<br />
Complete the simple form at the link below.<br />
<br />
[http://www.astro.louisville.edu/moorerequests Telescope Use Request Form] at http://www.astro.louisville.edu/moorerequests<br />
<br />
We use your responses to schedule our telescopes, so please complete it while you are in the lab. Make a note to yourself of your request, and keep some notes about what you responded since they will help you complete the second part later. If you are working with others we only need one on-line request. In the comments section, if you note who is working together it will simplify sorting out the requests and responses later.<br />
<br />
<br />
<br />
<br />
For this lab we need the following from each student.<br />
<br />
<br />
* '''Step 3: Respond on the Answer Sheet for This Lab''' <br />
** Object you have selected by an identification we can use with the telescopes.<br />
** A brief statement of what that object is.<br />
** Where is it in the sky? Correct celestial coordinates are an ideal response.<br />
** Is it observable now during the night from either the southern or northern hemisphere?<br />
** Which telescope do you think would be useful for this? Explain your answer, or if you do not know which telescope, tell us what would affect the choice of telescope based on your choice of object.<br />
** Provide a concise statement of what you expect to learn about your selected object from whatever data we can offer.<br />
** Your preferred email address so that we can respond to you if needed<br />
** The names of the other students working with you on the same object<br />
<br />
For this lab it will help us a lot if you put your '''preferred email address''' on the form you submit to the assistant, and list the names of those who are working with you on this. Should groups re-organize later in the semester or you find yourself working with someone else, this will ensure we can get data to you.<br />
<br />
After you have submitted the request, each student should answer the questions asked here on the usual lab response form and give them to the lab assistant. The assistant may have an immediate suggestion about your choice, and you should discuss it before you get to this last step. Remember that we need both the answers from you on the usual lab sheet and the completed web submission from your group while you are in class today.<br />
<br />
<br />
== What Happens Later ==<br />
<br />
<br />
Once your data are available we will provide a link from which you can download images and there will be another activity in class for you to use in completing the work. Expect it a few weeks before the end of the semester, or before if the weather is good to us.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Very_simple_Python&diff=2551Very simple Python2018-07-07T22:26:23Z<p>WikiSysop: </p>
<hr />
<div>In this section of our [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy short course on Python for Physics and Astronomy] we take a short path to using Python easily.<br />
<br />
<br />
<br />
== Installing Python on your computer ==<br />
<br />
Python is open source software available for free from [http://www.python.org/ www.python.org]. Version 2.7 is the aging mature version that is widely supported by other add-on modules. Python 3 is more recent, largely compatible with 2.7, and is now widely used with packages for specific disciplines. New installations should be Python 3, but there's not much loss of functionality with the older Python 2.7 if you already have it. We will use Python 3 for the examples, though some of the earlier files here may still require small changes to run under 3. <br />
<br />
<br />
'''Linux'''<br />
<br />
Python will already be installed on your computer. Typically the operating system may use 2.7 for some of its core applications, and provide a basic 3.4, 3.5 or 3.6 for newer work. You may use your package manager to update and add to the base installations, but note the distinction between Python 2 and Python 3, which may co-exist. Check that the pip you are using on the command line is the one that installs into the version of Python you want to use. For example, look at <br />
<br />
ls -l /usr/bin/python*<br />
ls -l /usr/bin/pip*<br />
<br />
to see what may already be there and what will run with the default "python" command. A trick used by Linux systems is to have a directory /etc/alternatives that contains soft links, and there you may find links such as<br />
<br />
/etc/alternatives/pip -> /usr/bin/pip3.4<br />
<br />
to tell you how "pip" will run. There may be conflicts between the operating system's requirements and what you want for your work, but with care they can be managed and you will be in control of your own destiny. If you prefer to let someone else drive, choose a Python distribution such as the popular [https://www.anaconda.com/distribution/ Anaconda] and follow the directions on their website. Keep in mind the potential conflicts with the already-installed Python on your computer.<br />
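A short Python sketch of how such a soft link resolves. The ''pip3.4'' file and ''pip'' link here are throwaway stand-ins built in a temporary directory, not the real /etc/alternatives entries; the same ''os.path.realpath'' call on, say, /usr/bin/python3 would show what runs on your own system:<br />

```python
import os
import tempfile

# /etc/alternatives entries are soft links; os.path.realpath follows
# the whole chain to the real file.
with tempfile.TemporaryDirectory() as tmp:
    target = os.path.join(tmp, "pip3.4")
    open(target, "w").close()           # stand-in for the real pip3.4
    link = os.path.join(tmp, "pip")
    os.symlink(target, link)            # like /etc/alternatives/pip -> pip3.4
    resolved = os.path.realpath(link)
    print(resolved)                     # ends with .../pip3.4
```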
<br />
<br />
<br />
'''Linux with Python from its source'''<br />
<br />
This method is for those who are comfortable with system management and want to maintain full control over their Python and its packages. It enables you to have the very latest Python, and it also minimizes the installation footprint on disk space, while adding the challenges of resolving dependency conflicts yourself and some potentially vexing conflicts with the operating system. It is my favorite method.<br />
<br />
<br />
# Download the source tar file currently Python-3.6.4.tgz and as superuser or root copy to /usr/local/src<br />
# Untar the file and assign ownership of the new directory tree to yourself as an unprivileged user<br />
# As a normal user, cd into the source directory and run ./configure <br />
# The defaults will be fine. Your new Python will go into the /usr/local/ directory. Some users prefer /opt, which can be changed as a configuration option.<br />
# make<br />
# make test<br />
# Now as root user --<br />
# make altinstall<br />
# ln -s /usr/local/lib64/python3.6/lib-dynload/ /usr/local/lib/python3.6/lib-dynload<br />
<br />
<br />
The altinstall option is necessary to avoid overwriting or interfering with the system Python. The softlink is needed because some library files in lib64 are not found without it. It is not necessary to assign either PYTHONHOME or PYTHONPATH, or to use an environment manager, to have this version work independently of the system version. However, be aware that the functions you need are explicitly in /usr/local/bin and that they refer to Python by its version, that is ''python3.6'' and ''pip3.6''. Therefore, if you later update the OS and it also has these executables, there's a potential conflict that would be resolved by the search path and could be ambiguous.<br />
<br />
Similarly, if you install Anaconda Python, it will have its own /opt directory tree to navigate, while Canopy Python may use environment variables. To run your own locally built Python, ''echo $PYTHONHOME'' and ''echo $PYTHONPATH'' should return empty strings.<br />
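A minimal check, runnable in any of the Pythons discussed here, of which interpreter you are actually getting and whether those environment variables are set:<br />

```python
import os
import sys

# Which Python is actually running, and from where?
print(sys.version.split()[0])    # version, e.g. "3.6.4"
print(sys.executable)            # full path of the interpreter binary
print(sys.prefix)                # installation root, e.g. /usr/local

# For a locally built Python these should normally be unset (None):
print(os.environ.get("PYTHONHOME"))
print(os.environ.get("PYTHONPATH"))
```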
<br />
<br />
'''Linux adding modules by pip'''<br />
<br />
<br />
When installing in the system Python, if you need to update the complex matplotlib package for Python 3 (which may lack parts you need), it must be removed first:<br />
<br />
pip uninstall matplotlib<br />
<br />
Then re-install it and specify not to use the saved source if any.<br />
<br />
pip install matplotlib --upgrade --no-cache-dir<br />
<br />
Also for a system python version you may need to do this <br />
<br />
pip uninstall six<br />
<br />
pip install six --upgrade --no-cache-dir<br />
<br />
<br />
Now, if you are building a Python for science, use the specific pip for it and add the modules you need. These may include several that were installed on the system using YaST, as well as the matplotlib ones and those below. Start with these, since pip will resolve dependencies, probably use cached source unless you tell it not to, and in the process grow the missing branches of your Python tree. Later, if you find something missing, you can add it as needed.<br />
<br />
<br />
Install numpy (pip install numpy)<br />
<br />
Install scipy (pip install scipy)<br />
<br />
Install astropy (pip install astropy) for essential astronomy utilities<br />
<br />
Install scikit-image (pip install scikit-image) for image processing<br />
<br />
Install ginga (pip install ginga) for FITS viewer and core modules<br />
<br />
Install pyastronomy (pip install pyastronomy) or from source on github [https://github.com/sczesla/PyAstronomy pyastronomy]<br />
<br />
Install pyephem (pip install pyephem) for astronomical ephemerides<br />
<br />
Install healpy (pip install healpy) for working with HEALPix sky maps<br />
<br />
Install reproject (pip install reproject) for image reprojection when doing fits conversion<br />
<br />
Install quantities (pip install quantities) to have physical constants<br />
<br />
Install emcee (pip install emcee) to have an MCMC library <br />
<br />
Lastly, install the software chain for data visualization with Python, using pip rather than the system packages because Pandas is developing rapidly:<br />
<br />
Install pandas (pip install pandas)<br />
<br />
Install scrapy (pip install scrapy)<br />
<br />
Install requests (pip install requests)<br />
<br />
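A quick way to see which of the modules listed above are already present in the Python you are running, using only the standard library (''find_spec'' locates a package without importing it):<br />

```python
import importlib.util

# Modules from the list above; extend as needed.
wanted = ["numpy", "scipy", "astropy", "pandas", "requests"]

missing = [name for name in wanted
           if importlib.util.find_spec(name) is None]

for name in wanted:
    print(name, "missing" if name in missing else "installed")
# Anything reported missing can be added with: pip install <name>
```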
<br />
<br />
'''Windows'''<br />
<br />
For Windows there are several choices.<br />
<br />
* [https://www.python.org/downloads/windows/ Python.org] provides installers for Windows. The web-based installer will update software components from the web. You may need administrator privileges to update system libraries.<br />
* [https://www.enthought.com/academic-subscriptions/ Enthought Canopy] is a commercial distribution that is free to download, and for a fee will offer support. It is intended for scientific computing and can co-exist with the system Python of Linux. <br />
* [https://www.anaconda.com/distribution/ Anaconda] is widely used in Astronomy, and will come with all the packages you will need to get started. It uses a "conda" package management system. <br />
<br />
<br />
<br />
'''Mac OSX'''<br />
<br />
* Python 2.7 comes installed with OSX. Try "python --version" from a terminal command line and see what happens. You can update this installation from Python.org (see next), or add packages with pip given administrative authority. Be aware of the potential Tcl/Tk library problem, though.<br />
* [https://www.python.org/downloads/mac-osx/ Python.org] has installers for recent Mac OS variants. However, there are problems with the Tcl/Tk libraries provided by Apple, particularly when used for graphics and in the IDLE development environment, which you should be aware of. Read the notice [https://www.python.org/download/mac/tcltk/ here].<br />
* [https://store.enthought.com/downloads/ Enthought Canopy Express] is free for Mac users too. Enthought provides all the packages in one installation process, and additional support for a fee.<br />
* [https://www.anaconda.com/distribution/ Anaconda] also has a Mac version, and is very popular.<br />
<br />
<br />
Those with an astronomical interest may benefit from [http://python4astronomers.github.com/installation/python_install.html Python4Astronomers]<br />
<br />
Most users would probably prefer running Python through the [http://docs.python.org/2/library/idle.html IDLE] integrated development environment. This provides an editor and file management, along with help and syntax highlighting. It's named after Eric Idle, who does the [http://www.youtube.com/watch?v=uo6OCxwUPPg "Galaxy Song"] in Monty Python. On the command line you would simply run "idle" to get started. <br />
<br />
<br />
Additional modules would have to be installed separately later if they are not part of the original installation. Python has its own ''pip'' (see above for Linux) for adding features which makes that easy. The ones you will need for scientific programming are <br />
<br />
* NumPy Test with "import numpy" from within interactive Python or idle.<br />
* SciPy Test with "import scipy".<br />
* AstroPy This one for astronomers. Test with "import astropy".<br />
<br />
and there are others, especially from [https://www.scipy.org/scikits.html Scikits]<br />
<br />
Anaconda and Enthought distributions will have everything you need "out of the box."<br />
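As a quick check, a short script like the sketch below can report which of these packages are importable and what versions are present. This is only an illustration; the package names are the ones mentioned above.<br />
<br />
```python
# Check which scientific packages are importable and report their versions.
# A minimal sketch; the package names listed are the ones discussed above.
import importlib

def check_packages(names):
    """Map each package name to its version string, or None if not installed."""
    found = {}
    for name in names:
        try:
            module = importlib.import_module(name)
            # Most scientific packages expose __version__; fall back otherwise
            found[name] = getattr(module, "__version__", "unknown")
        except ImportError:
            found[name] = None
    return found

if __name__ == "__main__":
    for name, version in check_packages(["numpy", "scipy", "astropy"]).items():
        print("%s: %s" % (name, version if version else "not installed"))
```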
<br />
<br />
'''AstroConda for Linux and Mac OSX'''<br />
<br />
If you are primarily interested in using Python for astronomy and have a need for the tools of the Space Telescope Science Institute, consider adding their astronomical code to an Anaconda distribution. At this site<br />
<br />
[http://astroconda.readthedocs.io/en/latest/ AstroConda] <br />
<br />
there is a guide to installation and documentation on its use. If you are unfamiliar with Python, take the time to go through our short course and try some examples first; then return and fill in your system with AstroConda, and you will be ready to analyze data from MAST, HST, and other sources. AstroConda is for Linux and OSX; Microsoft Windows is not supported. If you have a Windows computer with plenty of memory and disk space, you can add a Linux virtual machine inside your Windows operating system as a safe, simple way to have Linux features available within your preferred OS. VirtualBox is free, and installation from Oracle is only a click away:<br />
<br />
[https://www.virtualbox.org/ Oracle VirtualBox]<br />
<br />
It delivers enough of the host computer's processing power that for many applications it is as good as running a "real" machine, and it protects your own operating system while you experiment with new ones. <br />
<br />
<br />
<br />
== IDEs, Editors, and Python environments ==<br />
<br />
Keep in mind that Python itself is a programming language and system. It stands on its own, and it can be incorporated into other more complex, and potentially more useful, interfaces. At a minimum you will need a text editor. With that you can read and write program files, and run them as a program either by having "python" read the file, or by making the file itself executable (on Linux and Mac). Most editors on these operating systems will be fine, but some are cumbersome for learning. Unless you happen to have the skill, avoid "vi" and even "emacs", two common editors of Unix-like systems like OSX and Linux, and use something with a lighter interface. The java-based "jedit" is free, easy to install, and has some helpful features. Since it is based on java, it runs on Mac, Windows and Linux with the same look and feel. You can obtain it from<br />
[http://www.jedit.org/ http://www.jedit.org/] and follow their installation instructions.<br />
<br />
The integrated development environment (IDE) "idle" is also very nice to start with, and recommended. It may be present after you install Python, so try the command "idle" in a terminal window and see what happens. There are said to be problems with its use of the Tcl/Tk library and the Mac OSX installed libraries, but they should be solved in the most recent releases of Python and OSX supplied by Enthought or Anaconda.<br />
<br />
Now widely used and with great potential, the Jupyter system has been under development for a decade and is mature. You can read about it and even preview its capabilities on the web. Keep in mind, if you decide to start at that level, that the system is feature-rich, and that once you create content within it, you will need Jupyter to use that content. That is, unlike a simple Python program, the notebooks created by Jupyter are complete bodies of work that include data and analysis. They can be very useful in a lab, for example, and for data documentation and exchange.<br />
<br />
<center> [http://jupyter.org/ http://jupyter.org/] </center><br />
<br />
Spyder is an IDE intended for serious program development. It has tools for Python sensitive editing and for trace analysis, and is recommended rather than idle or Jupyter when creating code is the main task. The website is <br />
<br />
<center> [https://pythonhosted.org/spyder/ https://pythonhosted.org/spyder/] </center><br />
<br />
If you are also documenting your work, as is good practice if you intend to use it long term or share it with colleagues, then Sphinx is a powerful tool and is often used within the Python community<br />
<br />
<center> [http://www.sphinx-doc.org/en/master/ http://www.sphinx-doc.org/en/master/] </center><br />
<br />
<br />
== Using Python for computer math and instead of Matlab, Maple, or Mathematica ==<br />
<br />
Python and its applications can replace most commercial packages, though not with one-to-one code compatibility. For example, the concept of Mathematica notebooks seems to be duplicated in the free open source Jupyter platform. Mathematica offers uniquely powerful tools for computer algebra, equation solving, numerical computing, and graphics, but most of those functions are available in open source tools. As a student or member of the academic community you may have access to site or educational licensing for these commercial systems. Keep in mind that once you are out of that environment, the full cost of using them comes into play. Free is good, as in these alternatives to well-known scientific programming systems:<br />
<br />
*[http://www.sagemath.org/library-why.html http://www.sagemath.org/] SageMath built with Python - use instead of Mathematica or Maple<br />
*[https://www.gnu.org/software/octave/ https://www.gnu.org/software/octave/] GNU Octave - a Matlab-like system that is not Python<br />
*[https://github.com/gnudatalanguage/gdl https://github.com/gnudatalanguage/gdl] GNU GDL - an open source replacement for IDL<br />
<br />
<br />
<br />
<br />
<br />
== Using Python in real time ==<br />
<br />
The first step is to figure out how to start up Python on your computer after it is installed. In Linux you open a console and type "python" on the command line. You'll immediately see a prompt that looks like ">>>" after which you can type Python code and see the results.<br />
<br />
If you installed the Enthought distribution of Python on Windows or Mac, take a look at their release notes and website for additional advice on getting started. <br />
<br />
If you installed from python.org, they have some additional pages offering help.<br />
On Windows, it's not necessarily as straightforward as on Linux, but it can be. It will help to read this [http://docs.python.org/2/faq/windows.html "frequently asked questions" (FAQ) page] about Python on Windows, and also to consult the [http://docs.python.org/2/using/windows.html setup and usage guide].<br />
On a Macintosh OS X system, using Python is very similar to other Unix platforms like Linux or BSD. There are some helpful notes at the [http://docs.python.org/2/using/mac.html Using Python on a Macintosh] website. <br />
<br />
Once you have a command line prompt you have access to all of Python's capabilities. We'll show you some simple examples [http://prancer.physics.louisville.edu/astrowiki/index.php?title=Python_examples here] to test your installation and give you a quick sense of how to use it.<br />
<br />
To exit Python in the interactive mode, use "Ctrl+d" or "exit()" from the command line, or the Exit menu entry if you are running IDLE.<br />
<br />
<br />
== Using Python code as a standalone program ==<br />
<br />
You will usually edit a file that contains your Python program and then run that program by calling the Python interpreter. Therefore, the first thing is to pick an appropriate editor. <br />
One way is to use IDLE, which makes it especially easy on Windows systems and others to edit and test with a consistent interface. On Linux systems where the command line is more commonly used, an alternative is a standard graphical editor that is aware of Python syntax, like gedit. Other alternatives for Linux users are nedit and emacs, depending on your taste in interfaces. Python source files have a required indentation format, and it is generally not a good idea to embed tabs in the text, so the editor's tab function should be set to insert spaces instead. The Python website maintains a [http://wiki.python.org/moin/PythonEditors list of editors] and their features for different operating systems with links to the editor websites if you need to download one. <br />
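Because stray tabs in indentation cause hard-to-spot errors, a short check like the sketch below can report any lines whose leading whitespace contains a tab. The find_tab_lines helper here is only an illustration, not part of any standard tool.<br />
<br />
```python
# Report lines of a Python source file whose indentation contains tabs,
# since mixing tabs and spaces breaks Python's indentation rules.
# A minimal sketch; pass the file to check as a command-line argument.

def find_tab_lines(text):
    """Return the 1-based line numbers whose leading whitespace contains a tab."""
    bad = []
    for number, line in enumerate(text.splitlines(), start=1):
        # The indentation is everything before the first non-whitespace character
        indent = line[:len(line) - len(line.lstrip())]
        if "\t" in indent:
            bad.append(number)
    return bad

if __name__ == "__main__":
    import sys
    with open(sys.argv[1]) as fp:
        for number in find_tab_lines(fp.read()):
            print("Tab in indentation on line %d" % number)
```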
<br />
For example, if your program is in the file "myprogram.py" you can run it from the command line with "python myprogram.py". On Windows systems, the file extension ".py" may be associated with this command, and in that case you can start a program by clicking on the icon or name in a window. On MacOS and Linux, you would first make the file executable with a command such as <br />
<br />
chmod a+x myfile.py<br />
<br />
and also see that the first line of the file is exactly<br />
<br />
#!/usr/bin/python<br />
<br />
assuming that python is installed in /usr/bin/. With those changes, any file of Python code becomes an executable program. From its directory, simply type<br />
<br />
./myfile.py<br />
<br />
Note that programs that interact with the window manager may need to be started with pythonw instead of python. For MacOS, see [http://docs.python.org/2/using/mac.html 4.1.1 How to run a Python script].<br />
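Putting the steps above together, a minimal standalone script might look like the following sketch. The filename myfile.py, the greeting function, and its output are just placeholders for your own code.<br />
<br />
```python
#!/usr/bin/python
# myfile.py -- a minimal standalone script to test your setup.
# Save it, run "chmod a+x myfile.py", then start it with "./myfile.py"
# or with "python myfile.py". The shebang line above assumes python
# is installed in /usr/bin, as discussed in the text.

import sys

def greeting(name):
    """Return a greeting string for the given name."""
    return "Hello, %s" % name

if __name__ == "__main__":
    # Use the first command-line argument if one was given
    who = sys.argv[1] if len(sys.argv) > 1 else "world"
    print(greeting(who))
```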
<br />
<br />
== Examples of very simple Python ==<br />
<br />
<br />
For examples of Python illustrating how to use it interactively and to write very simple programs, see the section [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Python examples].<br />
<br />
<br />
<br />
== An assignment to try out very simple Python ==<br />
<br />
<br />
For the assigned homework to use very simple Python interactively and as a script, see the section [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Python assignments].</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=User_Interfaces&diff=2550User Interfaces2018-04-17T07:02:52Z<p>WikiSysop: </p>
<hr />
<div>As part of our short course on [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy Python for Physics and Astronomy] we consider how users interact with their computing environment. A programming language such as Python provides tools to build code that computes scientific models, captures data, sorts it and analyzes it largely without operator action. In effect, once you have written the program, you point it at the data or task it is to do, and wait for it to return new science to you. This is the command line, or batch, model of computing and is at the core of large data science today. Indeed, from your handheld devices to supercomputers, the work that is done is for the most part autonomous. We have seen how Python has built-in components to accept input from the command line, the operating system, the computer that is hosting the program, and the Internet or cloud. What about the other side, the user's perspective on computing?<br />
<br />
As an end user, would you prefer to move a mouse or tap a screen in order to select a file, or to type in the path and file name? What if you had to make operational decisions based on graphical output, or changing real world environments as data are collected? In modern computing, most of us interact with the machine and software through a graphical user interface or GUI. These tools create that option.<br />
<br />
'''On-Line Guides'''<br />
<br />
*[http://www.tkdocs.com/ Tk]<br />
*[https://matplotlib.org/users/index.html Matplotlib]<br />
*[https://bokeh.pydata.org/en/latest/docs/reference.html#refguide Bokeh]<br />
<br />
While the conventional Tk and Matplotlib components are foundational to Python, Bokeh is a very recent development with the design philosophy to put the web first for the end user, and it has a contemporary look. It also enables adding widgets written for javascript within the web display, which can be very effective. <br />
<br />
<br />
<br />
=== Command Line Interfacing and Access to the Operating System ===<br />
<br />
<br />
In a Unix-like environment (Linux or Mac OSX), the command line is an accessible and often preferred way to instruct a program on what to do. A typical program, as we've seen, might start like this example to interpolate a data file and plot the result:<br />
<br />
#!/usr/bin/python<br />
<br />
import sys<br />
import numpy as np<br />
from scipy.interpolate import UnivariateSpline<br />
import matplotlib.pyplot as plt<br />
<br />
sfactorflag = True<br />
<br />
if len(sys.argv) == 1:<br />
print " "<br />
print "Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]"<br />
print " "<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
elif len(sys.argv) == 4:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactorflag = False<br />
elif len(sys.argv) == 5:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactor = float(sys.argv[4]) <br />
else:<br />
print " "<br />
print "Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]"<br />
print " "<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
<br />
It uses "sys" to parse the command line arguments into text and numbers that control what the program will do. Because its first line directs the system to use the python interpreter, if the program is marked as executable to the user it will run as a single command followed by arguments. In this case it would be something like<br />
<br />
interpolate_data.py indata.dat outdata.dat nout sfactor<br />
<br />
where indata.dat is a text-based data file of x,y pairs, one pair per line, outdata.dat is the interpolated file, nout is the number of points to be interpolated, and sfactor is an optional floating point smoothing factor. When you run this it will read the files, do the interpolation without further interaction, and (as written) plot a result as well as write out a data file. The rest of the code is<br />
<br />
# Take x,y coordinates from a plain text file<br />
# Open the file with data<br />
infp = open(infile, 'r')<br />
# Read all the lines into a list<br />
intext = infp.readlines()<br />
# Split data text and parse into x,y values <br />
# Create empty lists<br />
xdata = []<br />
ydata = []<br />
i = 0 <br />
for line in intext: <br />
try:<br />
# Treat the case of a plain text comma separated entry <br />
entry = line.strip().split(",") <br />
# Get the x,y values for these fields<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except: <br />
try: <br />
# Treat the case of a plain text blank space separated entry<br />
entry = line.strip().split()<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except:<br />
pass <br />
# How many points found? <br />
nin = i<br />
if nin < 1:<br />
sys.exit('No objects found in %s' % (infile,))<br />
<br />
<br />
# Import data into a np arrays <br />
x = np.array(xdata)<br />
y = np.array(ydata)<br />
<br />
<br />
# Function to interpolate the data with a univariate cubic spline<br />
if sfactorflag:<br />
f_interpolated = UnivariateSpline(x, y, k=3, s=sfactor)<br />
else:<br />
f_interpolated = UnivariateSpline(x, y, k=3)<br />
<br />
<br />
# Values of x for sampling inside the boundaries of the original data<br />
x_interpolated = np.linspace(x.min(),x.max(), nout)<br />
# New values of y for these sample points<br />
y_interpolated = f_interpolated(x_interpolated)<br />
<br />
<br />
# Create a plot with labeled axes<br />
plt.figure().canvas.set_window_title(infile)<br />
plt.xlabel('X')<br />
plt.ylabel('Y')<br />
plt.title('Interpolation')<br />
plt.plot(x, y, color='red', linestyle='None', marker='.', markersize=10., label='Data')<br />
plt.plot(x_interpolated, y_interpolated, color='blue', linestyle='-', marker='None', label='Interpolated', linewidth=1.5)<br />
plt.legend()<br />
plt.minorticks_on()<br />
plt.show()<br />
<br />
<br />
# Open the output file<br />
outfp = open(outfile, 'w')<br />
# Write the interpolated data<br />
for i in range(nout): <br />
outline = "%f %f\n" % (x_interpolated[i], y_interpolated[i])<br />
outfp.write(outline)<br />
# Close the output file<br />
outfp.close()<br />
<br />
# Exit gracefully<br />
exit()<br />
<br />
<br />
After the fitting is done the program runs pyplot to display the results. The interactive window it opens and manages is a GUI, but it has been set up by the command line code. Of course there are many variations on command line interfacing, and the one shown here with coded argument parsing is perhaps the simplest and would serve as a template for most applications. Python offers other ways to manage the command line too. The os module is useful for access to the operating system from within a Python routine. Some examples are<br />
<br />
import os<br />
<br />
os.chdir(path) changes the current working directory (CWD) to a new one<br />
os.getcwd() returns the CWD<br />
os.getenv(varname) returns the value of the environment variable varname<br />
<br />
and there are many more, providing within the Python program many of the command line operating system tools available on the system. Here's an example of how that might be used in a program that processes many files in a directory:<br />
<br />
#!/usr/bin/python<br />
<br />
# Process images in a directory tree<br />
<br />
import os<br />
import sys<br />
import fnmatch<br />
import string<br />
import subprocess<br />
import pyfits<br />
<br />
if len(sys.argv) != 2:<br />
print " "<br />
sys.exit("Usage: process_fits.py directory\n")<br />
<br />
toplevel = sys.argv[1]<br />
<br />
# Search for files with this extension<br />
pattern = '*.fits' <br />
<br />
for dirname, dirnames, filenames in os.walk(toplevel):<br />
for filename in fnmatch.filter(filenames, pattern):<br />
fullfilename = os.path.join(dirname, filename)<br />
<br />
try: <br />
<br />
# Open a fits image file<br />
hdulist = pyfits.open(fullfilename)<br />
<br />
except IOError: <br />
print 'Error opening ', fullfilename<br />
break <br />
<br />
# Do the work on the files here ...<br />
<br />
# You can call a separate system process outside of Python this way<br />
darkfile = 'dark.fits'<br />
infilename = filename<br />
outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_d.fits'<br />
subprocess.call(["/usr/local/bin/fits_dark.py", infilename, darkfile, outfilename]) <br />
<br />
exit()<br />
<br />
Here we used the os module's routines to walk through a directory tree, parse filenames, and then perform another operation on those files as a separate command line Python program. Command line tools that leverage the operating system's built-in functions can be very powerful, and can save hours when running a program over a large collection of data.<br />
<br />
<br />
=== Graphical User Interface to Plotting ===<br />
<br />
First, read the [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python comprehensive section on Tkinter] to see how that code works, and then the one on [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python graphics with Python] to learn the basics of the plotting toolkits. In this section we combine Tk for control with interactive graphics. Our goals are to<br />
<br />
* Retain the features of the graphics display with its interactivity and style<br />
* Use tkinter to offer the user access to new features such as loading files and processing data<br />
* Allow real-time updating so that the plot can follow changing data<br />
<br />
To this end we will write a Python 3 program that uses tkinter, and add matplotlib or bokeh, to make useful tools that also serve as templates for your own development. The two resulting programs are almost identical except for the plotting functions, and you will find them on the [http://prancer.physics.louisville.edu/classes/650/python/examples/ examples page]. Look for "tk_plot.py" and "bokeh_plot.py".<br />
<br />
Before we begin, check that bokeh and tkinter are available in your version of Python 3. The version of Tk should be at least 8.6, which you can check with<br />
<br />
tkinter.TkVersion<br />
<br />
on the command line after importing tkinter. For bokeh, use<br />
<br />
bokeh.__version__<br />
<br />
that's with two underscores before and after the "version". Look for version 0.12.15 or greater to have the functionality described here.<br />
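Note that comparing version strings like "0.12.15" against "0.9" with plain string comparison gives the wrong answer, so if you want to check versions programmatically, split them into numbers first. The helper below is a small illustration of that idea, not part of tkinter or bokeh themselves.<br />
<br />
```python
# Compare dotted version strings numerically, so that "0.12.15" correctly
# counts as newer than "0.9" (plain string comparison would get this wrong).
# An illustrative sketch, not part of tkinter or bokeh.

def version_at_least(found, required):
    """True if version string 'found' is at least version string 'required'."""
    def parse(version):
        return [int(part) for part in version.split(".")]
    # Python compares lists of integers element by element
    return parse(found) >= parse(required)

# Example checks against the minimum versions mentioned above:
# import tkinter, bokeh
# version_at_least(str(tkinter.TkVersion), "8.6")
# version_at_least(bokeh.__version__, "0.12.15")
```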
<br />
<br />
'''The Tk Framework'''<br />
<br />
We begin our code as usual by requiring these libraries<br />
<br />
<br />
import tkinter as tk<br />
from tkinter import ttk<br />
from tkinter import filedialog<br />
from tkinter import messagebox<br />
<br />
so that Tk functions are called with the "tk." prefix and themed-widget functions with "ttk.". We have also included the file dialog and message widgets that were mentioned in the summary of Tk widgets.<br />
<br />
For connection to the operating system we need "os" and "sys", and for handling data we use numpy<br />
<br />
import os<br />
import sys<br />
import numpy as np<br />
<br />
There are global variables that are used to pass information from file handlers and processing to the graphics components<br />
<br />
global selected_files<br />
global x_data<br />
global y_data<br />
<br />
selected_files = []<br />
x_data = np.zeros(1024)<br />
y_data = np.zeros(1024)<br />
x_axis_label = ""<br />
y_axis_label = ""<br />
<br />
We will create a Tk window with buttons or other widgets that require callbacks when they are activated. Since these programs are templates for what can be done, look at the examples to see how the callbacks are structured. The one that reads a data file illustrates how to use Python to parse a file and save its data in numpy arrays.<br />
<br />
def read_file(infile):<br />
global x_data<br />
global y_data<br />
<br />
datafp = open(infile, 'r')<br />
datatext = datafp.readlines()<br />
datafp.close()<br />
<br />
# How many lines were there?<br />
<br />
i = 0<br />
for line in datatext:<br />
i = i + 1<br />
<br />
nlines = i<br />
<br />
# Filling fixed-size arrays is much faster than appending on the fly<br />
<br />
x_data = np.zeros((nlines))<br />
y_data = np.zeros((nlines))<br />
<br />
# Parse the lines into the data<br />
<br />
i = 0<br />
for line in datatext:<br />
<br />
# Test for a comment line<br />
<br />
if (line[0] == "#"):<br />
continue<br />
<br />
# Treat the case of a plain text comma separated entries <br />
<br />
try:<br />
<br />
entry = line.strip().split(",") <br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
<br />
i = i + 1 <br />
except: <br />
<br />
# Treat the case of space separated entries<br />
<br />
try:<br />
entry = line.strip().split()<br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
i = i + 1<br />
except:<br />
pass<br />
<br />
return()<br />
<br />
Notice how we allow for both comma separated and space delimited data. The expectation is that the file will have two values per line, the first one being "x" and the second one being "y". They may have white space between them, or be separated by a comma. Files written this way are very common, and easy to use too, but we may not know before reading one which style it was written in. Also common (in Grace, for example), a "#" at the beginning of a line indicates a comment, and the entire line should be ignored. The reader simply skips lines that begin with "#". A more advanced reader would validate the numbers as they come in to prevent errors later. This one simply assigns them to two global arrays, one for x and one for y, because that is the format required for plotting 2D data by both matplotlib and bokeh. Also, having the data in numpy offers the option of other processing based on the GUI.<br />
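The parsing logic described above can also be factored into a standalone function, which makes it easy to test apart from the GUI. This sketch follows the same rules (comma- or space-separated x,y pairs, "#" comment lines skipped) but returns plain lists instead of filling the global arrays:<br />
<br />
```python
# Parse x,y pairs from text lines that may be comma- or whitespace-separated.
# Comment lines beginning with "#" and unparseable lines are skipped, the
# same behavior as read_file() above, returned as plain lists for clarity.

def parse_xy_lines(lines):
    """Return (x_list, y_list) parsed from an iterable of text lines."""
    x_list, y_list = [], []
    for line in lines:
        line = line.strip()
        # Skip blank lines and comments
        if not line or line.startswith("#"):
            continue
        # Try comma-separated first, then fall back to whitespace-separated
        for separator in (",", None):
            entry = line.split(separator)
            try:
                xval = float(entry[0])
                yval = float(entry[1])
                x_list.append(xval)
                y_list.append(yval)
                break
            except (ValueError, IndexError):
                continue
    return x_list, y_list
```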
<br />
The file that is being read has been selected with a Tk widget that returns filenames in a global list <br />
<br />
def select_file():<br />
<br />
global selected_files<br />
<br />
# Use the tk file dialog to identify file(s)<br />
<br />
newfile = ""<br />
try:<br />
newfile, = tk.filedialog.askopenfilenames()<br />
selected_files.append(newfile)<br />
except:<br />
tk_info.set("No file selected")<br />
<br />
if newfile !="":<br />
tk_info.set("Latest file: "+newfile)<br />
<br />
return()<br />
<br />
By holding onto all the selections in a list, we retain the option of going back to them later. Here in the file selection callback, however, we take only the first file that the user selects to add to that list. We could instead keep all of them and process them one by one. The Tk function returns, leaving the selected_files list with its new entry as the last one on the list, and displays its name on the user interface.<br />
<br />
<br />
<br />
'''Matplotlib from Tk on the Desktop'''<br />
<br />
<br />
For matplotlib we need the following, selecting the Tk backend before pyplot is imported <br />
<br />
import matplotlib as mpl<br />
mpl.use('TkAgg')<br />
import matplotlib.pyplot as plt<br />
<br />
The Plot button callback uses matplotlib with its pyplot namespace to create a plot on the matplotlib canvas. The plot is not embedded in the Tk user interface, in order to invoke the matplotlib toolbar, which in version 2.2 is deprecated for Tk. This solution avoids that issue, but also means that it is not possible to update the content of the displayed data through the Tk interface.<br />
<br />
# Create the desired plot with matplotlib<br />
<br />
def make_plot(event=None):<br />
<br />
global selected_files<br />
global x_axis_label<br />
global y_axis_label<br />
<br />
nfiles = len(selected_files)<br />
this_file = selected_files[nfiles-1]<br />
<br />
read_file(this_file)<br />
<br />
# Create the plot.<br />
plt.figure(nfiles)<br />
plt.plot(x_data, y_data, lw=3)<br />
plt.title(this_file)<br />
plt.xlabel(x_axis_label)<br />
plt.ylabel(y_axis_label)<br />
plt.show()<br />
<br />
<br />
Input is handled through global variables, and the axis labels may be assigned through the Tk interface, though in tk_plot.py that is left for the next version. <br />
<br />
<br />
[[File:Humidity_tk.png]]<br />
<br />
<br />
<br />
<br />
<br />
'''Bokeh from Tk in the Browser and on the Web'''<br />
<br />
We include the bokeh modules needed for a basic plot<br />
<br />
from bokeh.plotting import figure, output_file, show<br />
<br />
For bokeh the callback is very similar<br />
<br />
# Create the desired plot with bokeh<br />
<br />
def make_plot(event=None):<br />
<br />
global selected_files<br />
global x_axis_label<br />
global y_axis_label<br />
<br />
nfiles = len(selected_files)<br />
this_file = selected_files[nfiles-1]<br />
<br />
read_file(this_file)<br />
<br />
# Create the plot using bokeh<br />
<br />
this_file_basename = os.path.basename(this_file)<br />
base, ext = os.path.splitext(this_file_basename)<br />
bokeh_file = base+".html"<br />
<br />
output_file(bokeh_file)<br />
p = figure(tools="hover,crosshair,pan,wheel_zoom,box_zoom,box_select,reset") <br />
p.line(x_data, y_data, line_width=2)<br />
show(p)<br />
<br />
<br />
The tools are explicitly requested, unlike matplotlib, which provides a fully populated toolbar.<br />
<br />
[[File:Humidity_bokeh.png]]<br />
<br />
<br />
<br />
<br />
<br />
=== Running a Bokeh Server for Live Plotting of Python Data ===<br />
<br />
Lastly, we arrive at the destination: a solution to interactive plotting where data are created and modified in Python, and presented to the user on the fly, with a customized and responsive interface. The interface components can be entirely in the browser, and thereby potentially offered to a web client, or they can be shared between the browser and a Python GUI, for desktop applications. There are three components:<br />
<br />
* Python backend, perhaps with Tk or another GUI, or responding to CGI requests from another server<br />
* Bokeh server responding to the Python and presenting information to the browser<br />
* Browser client, listening to the Bokeh server directly or through a proxy, and providing data to the Python backend if needed<br />
<br />
For details on how to set this up and its possible uses, see<br />
<br />
[https://bokeh.pydata.org/en/latest/docs/user_guide/server.html Running a Bokeh Server]<br />
<br />
Several examples are offered here<br />
<br />
[https://demo.bokehplots.com/ demo.bokehplots.com]<br />
<br />
The first live example shown there is from "sliders.py", a copy of which is in our [http://prancer.physics.louisville.edu/classes/650/python/examples examples directory]. <br />
<br />
<br />
# Load the modules<br />
<br />
import numpy as np<br />
<br />
from bokeh.io import curdoc<br />
from bokeh.layouts import row, widgetbox<br />
from bokeh.models import ColumnDataSource<br />
from bokeh.models.widgets import Slider, TextInput<br />
from bokeh.plotting import figure<br />
<br />
# Set up data using numpy<br />
<br />
N = 200<br />
x = np.linspace(0, 4*np.pi, N)<br />
y = np.sin(x)<br />
<br />
# Set up the display data source<br />
<br />
source = ColumnDataSource(data=dict(x=x, y=y))<br />
<br />
# Set up plot<br />
<br />
plot = figure(plot_height=400, plot_width=400, title="my sine wave",<br />
tools="crosshair,pan,reset,save,wheel_zoom",<br />
x_range=[0, 4*np.pi], y_range=[-2.5, 2.5])<br />
<br />
plot.line('x', 'y', source=source, line_width=3, line_alpha=0.6)<br />
<br />
# Set up widgets similar to Tk<br />
<br />
text = TextInput(title="title", value='my sine wave')<br />
offset = Slider(title="offset", value=0.0, start=-5.0, end=5.0, step=0.1)<br />
amplitude = Slider(title="amplitude", value=1.0, start=-5.0, end=5.0, step=0.1)<br />
phase = Slider(title="phase", value=0.0, start=0.0, end=2*np.pi)<br />
<br />
# Add interactive tools<br />
<br />
freq = Slider(title="frequency", value=1.0, start=0.1, end=5.1, step=0.1)<br />
<br />
# Set up callbacks to the widgets<br />
<br />
def update_title(attrname, old, new):<br />
plot.title.text = text.value<br />
<br />
text.on_change('value', update_title)<br />
<br />
def update_data(attrname, old, new):<br />
<br />
# Get the current slider values<br />
a = amplitude.value<br />
b = offset.value<br />
w = phase.value<br />
k = freq.value<br />
<br />
# Generate the new curve<br />
x = np.linspace(0, 4*np.pi, N)<br />
y = a*np.sin(k*x + w) + b<br />
<br />
source.data = dict(x=x, y=y)<br />
<br />
for w in [offset, amplitude, phase, freq]:<br />
w.on_change('value', update_data)<br />
<br />
<br />
# Set up layouts and add to document<br />
inputs = widgetbox(text, offset, amplitude, phase, freq)<br />
<br />
curdoc().add_root(row(inputs, plot, width=800))<br />
curdoc().title = "Sliders"<br />
<br />
<br />
Download the file sliders.py or copy the source shown, and on a computer that has Python 3 and Bokeh installed, use the command line<br />
<br />
 bokeh serve sliders.py<br />
<br />
to initiate a live session in the server. Once that has started, open your browser to "localhost", that is, to your own computer, by entering this in the browser's address bar<br />
<br />
http://localhost:5006/sliders<br />
<br />
The display will look like this, except the sliders will cause changes in the plot.<br />
<br />
<br />
[[File:Sliders.png]]<br />
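The arithmetic inside the update_data callback can be checked outside the Bokeh server. This numpy-only sketch recomputes y = a*sin(k*x + w) + b for one fixed set of "slider" values:<br />
<br />
```python
import numpy as np

# Recompute the curve exactly as update_data does, for fixed slider values
N = 200
x = np.linspace(0, 4*np.pi, N)
a, b, w, k = 2.0, 0.5, np.pi/2, 1.0   # amplitude, offset, phase, frequency
y = a*np.sin(k*x + w) + b

# At x = 0 this is a*sin(w) + b = 2*1 + 0.5
print(round(float(y[0]), 6))  # 2.5
```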
<br />
<br />
<br />
<br />
=== Running a Server for Javascript in a Browser Engine ===<br />
<br />
Python includes packages that enable a simple webserver which may be used to run advanced graphics operations through javascript within a browser's javascript engine. We will cover use of javascript, and Three.js in particular, as a supplement or replacement for 3D visualization in Python. In order to do this without the burden of managing a full Apache installation, we turn to Python. This shell script in Linux will start a web server in the directory that the script is run in (the CGIHTTPServer module shown is from Python 2; the Python 3 equivalent is "python3 -m http.server --cgi 8000"):<br />
<br />
python -m CGIHTTPServer 8000 1>/dev/null 2>/dev/null &<br />
echo "Use localhost:8000"<br />
echo<br />
<br />
By using port 8000 the server is distinct from the one on port 80 used for web applications. The site would appear by putting <br />
<br />
http://localhost:8000<br />
<br />
in a Google Chrome or Mozilla Firefox browser window running on the same user account on the same machine. Note that the redirects of stdout and stderr to /dev/null keep output from appearing in the console. The server may be killed by identifying its process ID in Linux with the command<br />
<br />
ps -e | grep python<br />
<br />
followed by <br />
<br />
kill -s 9 pid<br />
<br />
where "pid" is the process ID shown in the first column of the matching line. Alternatively, if it is the only python process running, you may kill it with<br />
<br />
killall python<br />
<br />
Any file in the directory tree below the starting directory is now accessible in the browser, and html files will be parsed to run the included javascript. If there is a cgi-bin directory at the top level, the server will see it and use it. One use of this low-level server is to create a virtual instrument that is accessible from the web, but not exposed to it directly. A remote web server on the same network that can access port 8000 on the instrument machine can run code and get a response from the instrument by calling cgi-bin operations. <br />
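The same trick works from inside a Python 3 program, where the old CGIHTTPServer module has become http.server. This sketch starts a throwaway server on a free port, fetches a page from it, and shuts it down; substituting CGIHTTPRequestHandler for SimpleHTTPRequestHandler would add the cgi-bin behavior described above:<br />
<br />
```python
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Port 0 asks the OS for any free port, so this cannot collide with port 8000
httpd = HTTPServer(("localhost", 0), SimpleHTTPRequestHandler)
port = httpd.server_address[1]

# serve_forever blocks, so run it in a background thread
threading.Thread(target=httpd.serve_forever, daemon=True).start()

# Fetch the directory listing of the current working directory
status = urllib.request.urlopen("http://localhost:%d/" % port).status
print(status)  # 200

httpd.shutdown()
```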
<br />
For programmers, however, this utility allows development and debugging of web software without the need for a large server.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=User_Interfaces&diff=2549User Interfaces2018-04-17T06:57:35Z<p>WikiSysop: </p>
<hr />
<div>As part of our short course on [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy Python for Physics and Astronomy] we consider how users interact with their computing environment. A programming language such as Python provides tools to build code that computes scientific models, captures data, sorts it and analyzes it largely without operator action. In effect, once you have written the program, you point it at the data or task it is to do, and wait for it to return new science to you. This is the command line, or batch, model of computing and is at the core of large data science today. Indeed, from your handheld devices to supercomputers, the work that is done is for the most part autonomous. We have seen how Python has built-in components to accept input from the command line, the operating system, the computer that is hosting the program, and the Internet or cloud. What about the other side, the user's perspective on computing?<br />
<br />
As an end user, would you prefer to move a mouse or tap a screen in order to select a file, or to type in the path and file name? What if you had to make operational decisions based on graphical output, or changing real world environments as data are collected? In modern computing, most of us interact with the machine and software through a graphical user interface or GUI. These tools create that option.<br />
<br />
'''On-Line Guides'''<br />
<br />
*[http://www.tkdocs.com/ Tk]<br />
*[https://matplotlib.org/users/index.html Matplotlib]<br />
*[https://bokeh.pydata.org/en/latest/docs/reference.html#refguide Bokeh]<br />
<br />
While the conventional Tk and Matplotlib components are foundational to Python, Bokeh is a very recent development with a design philosophy that puts the web first for the end user, and it has a contemporary look. It also enables adding widgets written for javascript within the web display, which can be very effective. <br />
<br />
<br />
<br />
=== Command Line Interfacing and Access to the Operating System ===<br />
<br />
<br />
In a Unix-like environment (Linux or macOS), the command line is an accessible and often preferred way to instruct a program on what to do. A typical program, as we've seen, might start like this example to interpolate a data file and plot the result:<br />
<br />
 #!/usr/bin/env python3<br />
<br />
import sys<br />
import numpy as np<br />
from scipy.interpolate import UnivariateSpline<br />
import matplotlib.pyplot as plt<br />
<br />
sfactorflag = True<br />
<br />
if len(sys.argv) == 1:<br />
 print(" ")<br />
 print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
 print(" ")<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
elif len(sys.argv) == 4:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactorflag = False<br />
elif len(sys.argv) == 5:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactor = float(sys.argv[4]) <br />
else:<br />
 print(" ")<br />
 print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
 print(" ")<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
<br />
It uses "sys" to parse the command line arguments into text and numbers that control what the program will do. Because its first line directs the system to use the python interpreter, if the program is marked as executable to the user it will run as a single command followed by arguments. In this case it would be something like<br />
<br />
interpolate_data.py indata.dat outdata.dat nout sfactor<br />
<br />
where indata.dat is a text-based data file of x,y pairs, one pair per line, outdata.dat is the interpolated file, nout is the number of points to be interpolated, and sfactor is an optional floating point smoothing factor. When you run this it will read the files, do the interpolation without further interaction, and (as written) plot a result as well as write out a data file. The rest of the code is<br />
<br />
# Take x,y coordinates from a plain text file<br />
# Open the file with data<br />
infp = open(infile, 'r')<br />
# Read all the lines into a list<br />
intext = infp.readlines()<br />
# Split data text and parse into x,y values <br />
# Create empty lists<br />
xdata = []<br />
ydata = []<br />
i = 0 <br />
for line in intext: <br />
try:<br />
# Treat the case of a plain text comma separated entry <br />
entry = line.strip().split(",") <br />
# Get the x,y values for these fields<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except: <br />
try: <br />
 # Treat the case of a plain text blank space separated entry<br />
entry = line.strip().split()<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except:<br />
pass <br />
# How many points found? <br />
nin = i<br />
if nin < 1:<br />
sys.exit('No objects found in %s' % (infile,))<br />
<br />
<br />
 # Import the data into numpy arrays <br />
x = np.array(xdata)<br />
y = np.array(ydata)<br />
<br />
<br />
# Function to interpolate the data with a univariate cubic spline<br />
if sfactorflag:<br />
f_interpolated = UnivariateSpline(x, y, k=3, s=sfactor)<br />
else:<br />
f_interpolated = UnivariateSpline(x, y, k=3)<br />
<br />
<br />
# Values of x for sampling inside the boundaries of the original data<br />
x_interpolated = np.linspace(x.min(),x.max(), nout)<br />
# New values of y for these sample points<br />
y_interpolated = f_interpolated(x_interpolated)<br />
<br />
<br />
 # Create a plot with labeled axes<br />
plt.figure().canvas.set_window_title(infile)<br />
plt.xlabel('X')<br />
plt.ylabel('Y')<br />
plt.title('Interpolation')<br />
plt.plot(x, y, color='red', linestyle='None', marker='.', markersize=10., label='Data')<br />
plt.plot(x_interpolated, y_interpolated, color='blue', linestyle='-', marker='None', label='Interpolated', linewidth=1.5)<br />
plt.legend()<br />
plt.minorticks_on()<br />
plt.show()<br />
<br />
<br />
# Open the output file<br />
outfp = open(outfile, 'w')<br />
# Write the interpolated data<br />
for i in range(nout): <br />
 outline = "%f %f\n" % (x_interpolated[i], y_interpolated[i])<br />
outfp.write(outline)<br />
# Close the output file<br />
outfp.close()<br />
<br />
# Exit gracefully<br />
exit()<br />
<br />
<br />
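The hand-coded sys.argv parsing at the top of this script can also be written with the standard argparse module, which generates the usage message for us. A minimal sketch with the same four arguments (here we pass a test argument list instead of reading the real command line):<br />
<br />
```python
import argparse

parser = argparse.ArgumentParser(
    description="Interpolate data with a univariate spline")
parser.add_argument("indata")
parser.add_argument("outdata")
parser.add_argument("nout", type=int, help="number of interpolated points")
parser.add_argument("sfactor", type=float, nargs="?", default=None,
                    help="optional smoothing factor")

# Normally parse_args() reads sys.argv; a list is supplied here to demonstrate
args = parser.parse_args(["indata.dat", "outdata.dat", "100"])
print(args.nout)             # 100
print(args.sfactor is None)  # True: the optional argument was omitted
```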
After the fitting is done the program runs pyplot to display the results. The interactive window it opens and manages is a GUI, but it has been set up by the command line code. Of course there are many variations on command line interfacing, and the one shown here with coded argument parsing is perhaps the simplest and would serve as a template for most applications. Python offers other ways to manage the command line too. The os module provides access to the operating system from within a Python routine. Some examples are<br />
<br />
import os<br />
<br />
os.chdir(path) changes the current working directory (CWD) to a new one<br />
 os.getcwd() returns the CWD<br />
os.getenv(varname) returns the value of the environment variable varname<br />
<br />
and there are many more, providing within the Python program many of the command line operating system tools available on the system. Here's an example of how that might be used in a program that processes many files in a directory:<br />
<br />
#!/usr/bin/python<br />
<br />
# Process images in a directory tree<br />
<br />
import os<br />
import sys<br />
import fnmatch<br />
import string<br />
import subprocess<br />
 import pyfits  # now maintained as astropy.io.fits<br />
<br />
if len(sys.argv) != 2:<br />
 print(" ")<br />
sys.exit("Usage: process_fits.py directory\n")<br />
<br />
toplevel = sys.argv[1]<br />
<br />
# Search for files with this extension<br />
pattern = '*.fits' <br />
<br />
for dirname, dirnames, filenames in os.walk(toplevel):<br />
for filename in fnmatch.filter(filenames, pattern):<br />
fullfilename = os.path.join(dirname, filename)<br />
<br />
try: <br />
<br />
# Open a fits image file<br />
hdulist = pyfits.open(fullfilename)<br />
<br />
except IOError: <br />
 print('Error opening ' + fullfilename)<br />
break <br />
<br />
# Do the work on the files here ...<br />
<br />
# You can call a separate system process outside of Python this way<br />
darkfile = 'dark.fits'<br />
infilename = filename<br />
outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_d.fits'<br />
subprocess.call(["/usr/local/bin/fits_dark.py", infilename, darkfile, outfilename]) <br />
<br />
exit()<br />
<br />
Here we used the os module's routines to walk through a directory tree, parse filenames, and then perform another operation on those files that is a separate command line Python program. Command line tools used to leverage the operating system's built-in functions can be very powerful, and take hours out of actually running a program on a large database.<br />
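The os.walk/fnmatch pattern above predates pathlib; in Python 3 the same directory sweep can be written with Path.rglob. A sketch using a throwaway directory tree (the file names are invented for the demonstration):<br />
<br />
```python
import tempfile
from pathlib import Path

# Build a small directory tree to sweep
top = Path(tempfile.mkdtemp())
(top / "night1").mkdir()
(top / "night1" / "m51.fits").touch()
(top / "night1" / "notes.txt").touch()
(top / "flat.fits").touch()

# rglob recurses the tree, combining os.walk and fnmatch.filter in one call
fits_files = sorted(p.name for p in top.rglob("*.fits"))
print(fits_files)  # ['flat.fits', 'm51.fits']
```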
<br />
<br />
=== Graphical User Interface to Plotting ===<br />
<br />
First, read the [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python comprehensive section on Tkinter] to see how that code works, and then the one on [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python graphics with Python] to learn the basics of the plotting toolkits. In this section we combine Tk for control with interactive graphics. Our goals are to<br />
<br />
* Retain the features of the graphics display with its interactivity and style<br />
* Use tkinter to offer the user access to new features such as loading files and processing data<br />
* Allow real-time updating so that the plot can follow changing data<br />
<br />
To this end we will write a Python 3 program that uses tkinter, adding matplotlib or bokeh to make useful tools that also serve as templates for your own development. The two resulting programs are almost identical except for the plotting functions, and you will find them on the [http://prancer.physics.louisville.edu/classes/650/python/examples/ examples page]. Look for "tk_plot.py" and "bokeh_plot.py".<br />
<br />
Before we begin, check that bokeh and tkinter are available in your version of Python 3. The version of Tk should be at least 8.6, which you can check with<br />
<br />
tkinter.TkVersion<br />
<br />
on the command line after importing tkinter. For bokeh, use<br />
<br />
bokeh.__version__<br />
<br />
that is, with two underscores before and after "version". Look for version 0.12.15 or greater to get the functionality described here.<br />
<br />
<br />
'''The Tk Framework'''<br />
<br />
We begin our code as usual by requiring these libraries<br />
<br />
<br />
import tkinter as tk<br />
from tkinter import ttk<br />
from tkinter import filedialog<br />
from tkinter import messagebox<br />
<br />
such that Tk functions require the "tk." and ttk functions use "ttk". We have also included file dialog and message widgets that were mentioned in the summary of Tk widgets.<br />
<br />
For connection to the operating system we need "os" and "sys", and for handling data we use numpy<br />
<br />
import os<br />
import sys<br />
import numpy as np<br />
<br />
There are global variables that are used to pass information from file handlers and processing to the graphics components<br />
<br />
global selected_files<br />
global x_data<br />
global y_data<br />
<br />
selected_files = []<br />
x_data = np.zeros(1024)<br />
y_data = np.zeros(1024)<br />
x_axis_label = ""<br />
y_axis_label = ""<br />
<br />
We will create a Tk window with button or other widgets that require call backs when they are activated. Since these programs are templates for what can be done, look at the examples to see how the call backs are structured. The one to read a data file illustrates how to use Python to parse a file and save its data in numpy arrays.<br />
<br />
def read_file(infile):<br />
global x_data<br />
global y_data<br />
<br />
datafp = open(infile, 'r')<br />
datatext = datafp.readlines()<br />
datafp.close()<br />
<br />
 # How many lines were there?<br />
 <br />
 nlines = len(datatext)<br />
<br />
 # Filling fixed-size arrays is much faster than appending on the fly<br />
<br />
x_data = np.zeros((nlines))<br />
y_data = np.zeros((nlines))<br />
<br />
# Parse the lines into the data<br />
<br />
i = 0<br />
for line in datatext:<br />
<br />
# Test for a comment line<br />
<br />
if (line[0] == "#"):<br />
 continue<br />
<br />
# Treat the case of a plain text comma separated entries <br />
<br />
try:<br />
<br />
entry = line.strip().split(",") <br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
<br />
i = i + 1 <br />
except: <br />
<br />
# Treat the case of space separated entries<br />
<br />
try:<br />
entry = line.strip().split()<br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
i = i + 1<br />
except:<br />
pass<br />
<br />
return()<br />
<br />
Notice how we allow for both comma-separated and space-delimited data. The expectation is that the file will have two values per line, the first being "x" and the second "y". They may have white space between them, or be separated by a comma. Files written this way are very common, and easy to use too, but we may not know before reading one which style it was written in. Also common (in Grace, for example), a "#" at the beginning of a line indicates a comment, and the reader simply skips such lines. A more advanced reader would validate the numbers as they come in to prevent errors later. This one simply assigns them to two global arrays, one for x and one for y, because that is the format required for plotting 2D data by both matplotlib and bokeh. Also, having the data in numpy offers the option of further processing based on the GUI.<br />
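The same parsing rules can be packaged as a small function for testing. This sketch (the helper name is ours, not from tk_plot.py) accepts either delimiter and skips "#" comments:<br />
<br />
```python
import numpy as np

def parse_xy(lines):
    """Parse x,y pairs from text lines, comma- or space-separated, skipping # comments."""
    xs, ys = [], []
    for line in lines:
        if line.startswith("#"):
            continue
        # Prefer comma splitting when a comma is present, otherwise split on whitespace
        fields = line.strip().split(",") if "," in line else line.strip().split()
        try:
            xs.append(float(fields[0]))
            ys.append(float(fields[1]))
        except (IndexError, ValueError):
            pass  # ignore blank or malformed lines
    return np.array(xs), np.array(ys)

x, y = parse_xy(["# humidity log", "1, 2.5", "3 4.5", ""])
print(x.tolist())  # [1.0, 3.0]
print(y.tolist())  # [2.5, 4.5]
```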
<br />
The file that is being read has been selected with a Tk widget that returns filenames in a global list <br />
<br />
def select_file():<br />
<br />
global selected_files<br />
<br />
# Use the tk file dialog to identify file(s)<br />
<br />
newfile = ""<br />
try:<br />
 newfile = tk.filedialog.askopenfilenames()[0]<br />
selected_files.append(newfile)<br />
except:<br />
tk_info.set("No file selected")<br />
<br />
if newfile !="":<br />
tk_info.set("Latest file: "+newfile)<br />
<br />
return()<br />
<br />
By holding onto all the selections in a list, we retain the option of going back to them later. However, here in the file selection callback, we take only the first file that the user selects to add to that list. Of course we could take all of them and process them one by one. The Tk function will return, leaving the selected_files list with its new entry as the last one on the list, and display its name on the user interface.<br />
<br />
<br />
<br />
'''Matplotlib from Tk on the Desktop'''<br />
<br />
<br />
For matplotlib we need <br />
<br />
 import matplotlib as mpl<br />
 mpl.use('TkAgg')<br />
 import matplotlib.pyplot as plt<br />
<br />
The Plot button call back uses matplotlib with its pyplot namespace to create a plot on the matplotlib canvas. The plot is not embedded in the Tk user interface so that the matplotlib toolbar remains available, since the Tk toolbar-embedding interface is deprecated as of matplotlib 2.2. This solution avoids that issue, but also means that it is not possible to update the content of the displayed data through the Tk interface.<br />
<br />
# Create the desired plot with matplotlib<br />
<br />
def make_plot(event=None):<br />
<br />
global selected_files<br />
global x_axis_label<br />
global y_axis_label<br />
<br />
nfiles = len(selected_files)<br />
this_file = selected_files[nfiles-1]<br />
<br />
read_file(this_file)<br />
<br />
# Create the plot.<br />
plt.figure(nfiles)<br />
plt.plot(x_data, y_data, lw=3)<br />
plt.title(this_file)<br />
plt.xlabel(x_axis_label)<br />
plt.ylabel(y_axis_label)<br />
plt.show()<br />
<br />
<br />
Input is handled through global variables, and the axis labels may be assigned through the Tk interface, though in tk_plot.py that is left for the next version. <br />
<br />
<br />
[[File:Humidity_tk.png]]<br />
<br />
<br />
<br />
<br />
<br />
'''Bokeh from Tk in the Browser and on the Web'''<br />
<br />
We include the bokeh modules needed for a basic plot<br />
<br />
from bokeh.plotting import figure, output_file, show<br />
<br />
For bokeh the call back is very similar<br />
<br />
# Create the desired plot with bokeh<br />
<br />
def make_plot(event=None):<br />
<br />
global selected_files<br />
global x_axis_label<br />
global y_axis_label<br />
<br />
nfiles = len(selected_files)<br />
this_file = selected_files[nfiles-1]<br />
<br />
read_file(this_file)<br />
<br />
# Create the plot using bokeh<br />
<br />
this_file_basename = os.path.basename(this_file)<br />
base, ext = os.path.splitext(this_file_basename)<br />
bokeh_file = base+".html"<br />
<br />
output_file(bokeh_file)<br />
p = figure(tools="hover,crosshair,pan,wheel_zoom,box_zoom,box_select,reset") <br />
p.line(x_data, y_data, line_width=2)<br />
show(p)<br />
<br />
<br />
The tools are explicitly requested, unlike matplotlib, which provides a fully populated toolbar.<br />
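The output file name in make_plot comes from the data file name. This standalone sketch shows the os.path steps with an invented input path:<br />
<br />
```python
import os

# Derive "humidity.html" from a hypothetical data file path, as make_plot does
this_file = "/home/observer/data/humidity.dat"
this_file_basename = os.path.basename(this_file)   # 'humidity.dat'
base, ext = os.path.splitext(this_file_basename)   # ('humidity', '.dat')
bokeh_file = base + ".html"
print(bokeh_file)  # humidity.html
```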
<br />
[[File:Humidity_bokeh.png]]<br />
<br />
<br />
<br />
<br />
<br />
=== Running a Bokeh Server for Live Plotting of Python Data ===<br />
<br />
Lastly, we arrive at the destination: a solution to interactive plotting where data are created and modified in Python, and presented to the user on the fly, with a customized and responsive interface. The interface components can be entirely in the browser, and thereby potentially offered to a web client, or they can be shared between the browser and a Python GUI, for desktop applications. There are three components:<br />
<br />
* Python backend, perhaps with Tk or another GUI, or responding to CGI requests from another server<br />
* Bokeh server responding to the Python and presenting information to the browser<br />
* Browser client, listening to the Bokeh server directly or through a proxy, and providing data to the Python backend if needed<br />
<br />
For details on how to set this up and its possible uses, see<br />
<br />
[https://bokeh.pydata.org/en/latest/docs/user_guide/server.html Running a Bokeh Server]<br />
<br />
Several examples are offered here<br />
<br />
[https://demo.bokehplots.com/ demo.bokehplots.com]<br />
<br />
The first live example shown there is from "sliders.py", a copy of which is in our [http://prancer.physics.louisville.edu/classes/650/python/examples examples directory]. <br />
<br />
import numpy as np<br />
<br />
from bokeh.io import curdoc<br />
from bokeh.layouts import row, widgetbox<br />
from bokeh.models import ColumnDataSource<br />
from bokeh.models.widgets import Slider, TextInput<br />
from bokeh.plotting import figure<br />
<br />
# Set up data<br />
N = 200<br />
x = np.linspace(0, 4*np.pi, N)<br />
y = np.sin(x)<br />
source = ColumnDataSource(data=dict(x=x, y=y))<br />
<br />
<br />
# Set up plot<br />
plot = figure(plot_height=400, plot_width=400, title="my sine wave",<br />
tools="crosshair,pan,reset,save,wheel_zoom",<br />
x_range=[0, 4*np.pi], y_range=[-2.5, 2.5])<br />
<br />
plot.line('x', 'y', source=source, line_width=3, line_alpha=0.6)<br />
<br />
<br />
# Set up widgets<br />
text = TextInput(title="title", value='my sine wave')<br />
offset = Slider(title="offset", value=0.0, start=-5.0, end=5.0, step=0.1)<br />
amplitude = Slider(title="amplitude", value=1.0, start=-5.0, end=5.0, step=0.1)<br />
phase = Slider(title="phase", value=0.0, start=0.0, end=2*np.pi)<br />
freq = Slider(title="frequency", value=1.0, start=0.1, end=5.1, step=0.1)<br />
<br />
<br />
# Set up callbacks<br />
def update_title(attrname, old, new):<br />
plot.title.text = text.value<br />
<br />
text.on_change('value', update_title)<br />
<br />
def update_data(attrname, old, new):<br />
<br />
# Get the current slider values<br />
a = amplitude.value<br />
b = offset.value<br />
w = phase.value<br />
k = freq.value<br />
<br />
# Generate the new curve<br />
x = np.linspace(0, 4*np.pi, N)<br />
y = a*np.sin(k*x + w) + b<br />
<br />
source.data = dict(x=x, y=y)<br />
<br />
for w in [offset, amplitude, phase, freq]:<br />
w.on_change('value', update_data)<br />
<br />
<br />
# Set up layouts and add to document<br />
inputs = widgetbox(text, offset, amplitude, phase, freq)<br />
<br />
curdoc().add_root(row(inputs, plot, width=800))<br />
curdoc().title = "Sliders"<br />
<br />
<br />
Download the file sliders.py or copy the source shown, and on a computer that has Python 3 and Bokeh installed, use the command line<br />
<br />
 bokeh serve sliders.py<br />
<br />
to initiate a live session in the server. Once that has started, open your browser to "localhost", that is, to your own computer, by entering this in the browser's address bar<br />
<br />
http://localhost:5006/sliders<br />
<br />
The display will look like this, except the sliders will cause changes in the plot.<br />
<br />
<br />
[[File:Sliders.png]]<br />
<br />
<br />
<br />
<br />
=== Running a Server for Javascript in a Browser Engine ===<br />
<br />
Python includes packages that enable a simple webserver which may be used to run advanced graphics operations through javascript within a browser's javascript engine. We will cover use of javascript, and Three.js in particular, as a supplement or replacement for 3D visualization in Python. In order to do this without the burden of managing a full Apache installation, we turn to Python. This shell script in Linux will start a web server in the directory that the script is run in (the CGIHTTPServer module shown is from Python 2; the Python 3 equivalent is "python3 -m http.server --cgi 8000"):<br />
<br />
python -m CGIHTTPServer 8000 1>/dev/null 2>/dev/null &<br />
echo "Use localhost:8000"<br />
echo<br />
<br />
By using port 8000 the server is distinct from the one on port 80 used for web applications. The site would appear by putting <br />
<br />
http://localhost:8000<br />
<br />
in a Google Chrome or Mozilla Firefox browser window running on the same user account on the same machine. Note that the redirects of stdout and stderr to /dev/null keep output from appearing in the console. The server may be killed by identifying its process ID in Linux with the command<br />
<br />
ps -e | grep python<br />
<br />
followed by <br />
<br />
kill -s 9 pid<br />
<br />
where "pid" is the process ID shown in the first column of the matching line. Alternatively, if it is the only python process running, you may kill it with<br />
<br />
killall python<br />
<br />
Any file in the directory tree below the starting directory is now accessible in the browser, and html files will be parsed to run the included javascript. If there is a cgi-bin directory at the top level, the server will see it and use it. One use of this low-level server is to create a virtual instrument that is accessible from the web, but not exposed to it directly. A remote web server on the same network that can access port 8000 on the instrument machine can run code and get a response from the instrument by calling cgi-bin operations. <br />
<br />
For programmers, however, this utility allows development and debugging of web software without the need for a large server.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=User_Interfaces&diff=2548User Interfaces2018-04-17T06:42:34Z<p>WikiSysop: </p>
<hr />
<div>As part of our short course on [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy Python for Physics and Astronomy] we consider how users interact with their computing environment. A programming language such as Python provides tools to build code that computes scientific models, captures data, sorts it and analyzes it largely without operator action. In effect, once you have written the program, you point it at the data or task it is to do, and wait for it to return new science to you. This is the command line, or batch, model of computing and is at the core of large data science today. Indeed, from your handheld devices to supercomputers, the work that is done is for the most part autonomous. We have seen how Python has built-in components to accept input from the command line, the operating system, the computer that is hosting the program, and the Internet or cloud. What about the other side, the user's perspective on computing?<br />
<br />
As an end user, would you prefer to move a mouse or tap a screen in order to select a file, or to type in the path and file name? What if you had to make operational decisions based on graphical output, or changing real world environments as data are collected? In modern computing, most of us interact with the machine and software through a graphical user interface or GUI. These tools create that option.<br />
<br />
'''On-Line Guides'''<br />
<br />
*[http://www.tkdocs.com/ Tk]<br />
*[https://matplotlib.org/users/index.html Matplotlib]<br />
*[https://bokeh.pydata.org/en/latest/docs/reference.html#refguide Bokeh]<br />
<br />
While the conventional Tk and Matplotlib components are foundational to Python, Bokeh is a very recent development with a design philosophy that puts the web first for the end user, and it has a contemporary look. It also enables adding widgets written for javascript within the web display, which can be very effective. <br />
<br />
<br />
<br />
=== Command Line Interfacing and Access to the Operating System ===<br />
<br />
<br />
In a Unix-like environment (Linux or macOS), the command line is an accessible and often preferred way to instruct a program on what to do. A typical program, as we've seen, might start like this example to interpolate a data file and plot the result:<br />
<br />
 #!/usr/bin/env python3<br />
<br />
import sys<br />
import numpy as np<br />
from scipy.interpolate import UnivariateSpline<br />
import matplotlib.pyplot as plt<br />
<br />
sfactorflag = True<br />
<br />
if len(sys.argv) == 1:<br />
 print(" ")<br />
 print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
 print(" ")<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
elif len(sys.argv) == 4:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactorflag = False<br />
elif len(sys.argv) == 5:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactor = float(sys.argv[4]) <br />
else:<br />
 print(" ")<br />
 print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
 print(" ")<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
<br />
It uses "sys" to parse the command line arguments into text and numbers that control what the program will do. Because its first line directs the system to use the python interpreter, if the program is marked as executable to the user it will run as a single command followed by arguments. In this case it would be something like<br />
<br />
interpolate_data.py indata.dat outdata.dat nout sfactor<br />
<br />
where indata.dat is a text-based data file of x,y pairs, one pair per line, outdata.dat is the interpolated file, nout is the number of points to be interpolated, and sfactor is an optional floating point smoothing factor. When you run this it will read the files, do the interpolation without further interaction, and (as written) plot a result as well as write out a data file. The rest of the code is<br />
<br />
# Take x,y coordinates from a plain text file<br />
# Open the file with data<br />
infp = open(infile, 'r')<br />
# Read all the lines into a list<br />
intext = infp.readlines()<br />
# Split data text and parse into x,y values <br />
# Create empty lists<br />
xdata = []<br />
ydata = []<br />
i = 0 <br />
for line in intext: <br />
try:<br />
# Treat the case of a plain text comma separated entry <br />
entry = line.strip().split(",") <br />
# Get the x,y values for these fields<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except: <br />
try: <br />
 # Treat the case of a plain text blank space separated entry<br />
entry = line.strip().split()<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except:<br />
pass <br />
# How many points found? <br />
nin = i<br />
if nin < 1:<br />
sys.exit('No objects found in %s' % (infile,))<br />
<br />
<br />
 # Import the data into numpy arrays <br />
x = np.array(xdata)<br />
y = np.array(ydata)<br />
<br />
<br />
# Function to interpolate the data with a univariate cubic spline<br />
if sfactorflag:<br />
f_interpolated = UnivariateSpline(x, y, k=3, s=sfactor)<br />
else:<br />
f_interpolated = UnivariateSpline(x, y, k=3)<br />
<br />
<br />
# Values of x for sampling inside the boundaries of the original data<br />
x_interpolated = np.linspace(x.min(),x.max(), nout)<br />
# New values of y for these sample points<br />
y_interpolated = f_interpolated(x_interpolated)<br />
<br />
<br />
# Create an plot with labeled axes<br />
plt.figure().canvas.manager.set_window_title(infile)<br />
plt.xlabel('X')<br />
plt.ylabel('Y')<br />
plt.title('Interpolation')<br />
plt.plot(x, y, color='red', linestyle='None', marker='.', markersize=10., label='Data')<br />
plt.plot(x_interpolated, y_interpolated, color='blue', linestyle='-', marker='None', label='Interpolated', linewidth=1.5)<br />
plt.legend()<br />
plt.minorticks_on()<br />
plt.show()<br />
<br />
<br />
# Open the output file<br />
outfp = open(outfile, 'w')<br />
# Write the interpolated data<br />
for i in range(nout): <br />
outline = "%f %f\n" % (x_interpolated[i],y_interpolated[i])<br />
outfp.write(outline)<br />
# Close the output file<br />
outfp.close()<br />
<br />
# Exit gracefully<br />
exit()<br />
<br />
<br />
After the fitting is done the program runs pyplot to display the results. The interactive window it opens and manages is a GUI, but it has been set up by the command line code. Of course there are many variations on command line interfacing; the hand-coded argument parsing shown here is perhaps the simplest, and would serve as a template for most applications. Python offers other ways to manage the command line too, such as the standard-library argparse module. The os module is useful to have access to the operating system from within a Python routine. Some examples are<br />
<br />
import os<br />
<br />
os.chdir(path) changes the current working directory (CWD) to a new one<br />
os.getcwd() returns the CWD<br />
os.getenv(varname) returns the value of the environment variable varname<br />
<br />
and there are many more, providing within the Python program many of the command line operating system tools available on the system. Here's an example of how that might be used in a program that processes many files in a directory:<br />
<br />
#!/usr/bin/python<br />
<br />
# Process images in a directory tree<br />
<br />
import os<br />
import sys<br />
import fnmatch<br />
import string<br />
import subprocess<br />
import pyfits   # in newer installations use: from astropy.io import fits as pyfits<br />
<br />
if len(sys.argv) != 2:<br />
print(" ")<br />
sys.exit("Usage: process_fits.py directory\n")<br />
<br />
toplevel = sys.argv[1]<br />
<br />
# Search for files with this extension<br />
pattern = '*.fits' <br />
<br />
for dirname, dirnames, filenames in os.walk(toplevel):<br />
for filename in fnmatch.filter(filenames, pattern):<br />
fullfilename = os.path.join(dirname, filename)<br />
<br />
try: <br />
<br />
# Open a fits image file<br />
hdulist = pyfits.open(fullfilename)<br />
<br />
except IOError: <br />
print('Error opening ', fullfilename)<br />
break <br />
<br />
# Do the work on the files here ...<br />
<br />
# You can call a separate system process outside of Python this way<br />
darkfile = 'dark.fits'<br />
infilename = filename<br />
outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_d.fits'<br />
subprocess.call(["/usr/local/bin/fits_dark.py", infilename, darkfile, outfilename]) <br />
<br />
exit()<br />
<br />
Here we used the os module's routines to walk through a directory tree and parse filenames, and then ran another command line Python program on each file as a separate process. Command line tools that leverage the operating system's built-in functions can be very powerful, and can save hours when running a program over a large collection of files.<br />
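Before moving on, note that the hand-rolled sys.argv parsing in interpolate_data.py can also be written with the standard-library argparse module mentioned above. The sketch below mirrors the argument names of that example; the explicit list passed to parse_args is only for demonstration, and in a real program you would call parser.parse_args() with no arguments so it reads sys.argv:<br />

```python
import argparse

# A sketch of the interpolate_data.py interface using argparse instead
# of hand-coded sys.argv parsing.  Argument names mirror the example above.
parser = argparse.ArgumentParser(
    description="Interpolate data with a univariate spline")
parser.add_argument("indata", help="input text file of x,y pairs")
parser.add_argument("outdata", help="output file of interpolated pairs")
parser.add_argument("nout", type=int, help="number of interpolated points")
parser.add_argument("sfactor", nargs="?", type=float, default=None,
                    help="optional floating point smoothing factor")

# For demonstration only; a real program would use parser.parse_args()
args = parser.parse_args(["indata.dat", "outdata.dat", "100", "0.5"])
sfactorflag = args.sfactor is not None
```

With argparse, the usage message and the error handling for a wrong number of arguments come for free, replacing the if/elif chain over len(sys.argv).<br />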
<br />
<br />
=== Graphical User Interface to Plotting ===<br />
<br />
First, read the [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python comprehensive section on Tkinter] to see how that code works, and then the one on [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python graphics with Python] to learn the basics of the plotting toolkits. In this section we combine Tk for control with interactive graphics. Our goals are to<br />
<br />
* Retain the features of the graphics display with its interactivity and style<br />
* Use tkinter to offer the user access to new features such as loading files and processing data<br />
* Allow real-time updating so that the plot can follow changing data<br />
<br />
To this end we will write a Python 3 program that uses tkinter and adds matplotlib or bokeh to make useful tools that also serve as templates for your own development. The two resulting programs are almost identical except for the plotting functions, and you will find them on the [http://prancer.physics.louisville.edu/classes/650/python/examples/ examples page]. Look for "tk_plot.py" and "bokeh_plot.py".<br />
<br />
Before we begin, check that bokeh and tkinter are available in your version of Python 3. The version of Tk should be at least 8.6, which you can check with<br />
<br />
tkinter.TkVersion<br />
<br />
on the command line after importing tkinter. For bokeh, use<br />
<br />
bokeh.__version__<br />
<br />
that's with two underscores before and after the "version". Look for version 0.12.15 or greater to have the functionality described here.<br />
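If you prefer to automate these checks, dotted version strings can be compared numerically field by field. Here is a minimal helper for doing so; the function name is our own, not part of Tk or bokeh:<br />

```python
def version_at_least(version, minimum):
    """Return True if a dotted version string (e.g. "8.6" or "0.12.15")
    is at least the given minimum.  Non-numeric fields such as "rc1"
    are ignored in the comparison."""
    def parse(v):
        # Split on dots and keep only the purely numeric fields
        return [int(field) for field in str(v).split(".") if field.isdigit()]
    return parse(version) >= parse(minimum)

# Check the Tk and bokeh minimums quoted above
print(version_at_least("8.6", "8.6"))         # True
print(version_at_least("0.12.4", "0.12.15"))  # False
```

This could be applied to tkinter.TkVersion and bokeh.__version__ at program start to fail early with a clear message when a requirement is not met.<br />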
<br />
<br />
'''The Tk Framework'''<br />
<br />
We begin our code as usual by requiring these libraries<br />
<br />
<br />
import tkinter as tk<br />
from tkinter import ttk<br />
from tkinter import filedialog<br />
from tkinter import messagebox<br />
<br />
such that Tk functions are called with the "tk." prefix and themed-widget functions with the "ttk." prefix. We have also included the file dialog and message box widgets that were mentioned in the summary of Tk widgets.<br />
<br />
For connection to the operating system we need "os" and "sys", and for handling data we use numpy<br />
<br />
import os<br />
import sys<br />
import numpy as np<br />
<br />
There are global variables that are used to pass information from file handlers and processing to the graphics components<br />
<br />
global selected_files<br />
global x_data<br />
global y_data<br />
<br />
selected_files = []<br />
x_data = np.zeros(1024)<br />
y_data = np.zeros(1024)<br />
x_axis_label = ""<br />
y_axis_label = ""<br />
<br />
We will create a Tk window with button or other widgets that require call backs when they are activated. Since these programs are templates for what can be done, look at the examples to see how the call backs are structured. The one to read a data file illustrates how to use Python to parse a file and save its data in numpy arrays.<br />
<br />
def read_file(infile):<br />
global x_data<br />
global y_data<br />
<br />
datafp = open(infile, 'r')<br />
datatext = datafp.readlines()<br />
datafp.close()<br />
<br />
# How many lines were there?<br />
<br />
nlines = len(datatext)<br />
<br />
# Filling fixed-size arrays is much faster than appending on the fly<br />
<br />
x_data = np.zeros((nlines))<br />
y_data = np.zeros((nlines))<br />
<br />
# Parse the lines into the data<br />
<br />
i = 0<br />
for line in datatext:<br />
<br />
# Test for a comment line<br />
<br />
if (line[0] == "#"):<br />
continue<br />
<br />
# Treat the case of a plain text comma separated entries <br />
<br />
try:<br />
<br />
entry = line.strip().split(",") <br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
<br />
i = i + 1 <br />
except: <br />
<br />
# Treat the case of space separated entries<br />
<br />
try:<br />
entry = line.strip().split()<br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
i = i + 1<br />
except:<br />
pass<br />
<br />
return()<br />
<br />
Notice how we allow for both comma separated and space delimited data. The expectation is that the file will have two values per line, the first being "x" and the second "y". They may have white space between them, or be separated by a comma. Files written this way are very common and easy to use, but we may not know before reading one which style it was written in. Also common (in Grace, for example), a "#" at the beginning of a line indicates a comment, and the entire line is to be ignored; the reader simply skips lines that begin with "#". A more advanced reader would validate the numbers as they come in to prevent errors later. This one simply assigns them to two global arrays, one for x and one for y, because that is the format required for plotting 2D data by both matplotlib and bokeh. Also, having the data in numpy offers the option of other processing based on the GUI.<br />
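As a sketch of that more defensive approach, a reader that validates each line before accepting it might look like the following. The function name and structure are our own, not taken from tk_plot.py, and the lists it returns can be handed to np.array() afterwards as in read_file above:<br />

```python
def parse_xy(text):
    # Accept "x,y" or "x y" pairs; skip blank lines, comments, and
    # any line that does not hold two numeric fields.
    xdata, ydata = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        fields = line.split(",") if "," in line else line.split()
        if len(fields) < 2:
            continue
        try:
            x, y = float(fields[0]), float(fields[1])
        except ValueError:
            continue   # reject lines whose fields are not numbers
        xdata.append(x)
        ydata.append(y)
    return xdata, ydata

x, y = parse_xy("# comment\n1, 2\n3 4\nnot data\n")
```

Because malformed lines are rejected up front, the arrays never hold partially filled entries, avoiding the index bookkeeping of the try/except version.<br />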
<br />
The file that is being read has been selected with a Tk widget that returns filenames in a global list <br />
<br />
def select_file():<br />
<br />
global selected_files<br />
<br />
# Use the tk file dialog to identify file(s)<br />
<br />
newfile = ""<br />
try:<br />
newfile = tk.filedialog.askopenfilenames()[0]<br />
selected_files.append(newfile)<br />
except:<br />
tk_info.set("No file selected")<br />
<br />
if newfile !="":<br />
tk_info.set("Latest file: "+newfile)<br />
<br />
return()<br />
<br />
By holding onto all of the selections in a list, we retain the option of going back to them later. Here in the file-selection call back we add only the first file that the user selects to that list, though we could just as easily take all of them and process them one by one. The Tk function returns leaving the selected_files list with its new entry as the last one on the list, and displays the file's name on the user interface.<br />
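If you do want to keep every file from a multiple selection, the bookkeeping is a short loop. A sketch follows; the function name is our own, and it operates on the tuple of paths that askopenfilenames() returns, so it can be tested without a running Tk window:<br />

```python
def add_selections(selected_files, new_files):
    # Append every file from a Tk askopenfilenames() result (a tuple of
    # paths) and return the most recent selection, or None if nothing
    # has ever been chosen.
    for name in new_files:
        selected_files.append(name)
    return selected_files[-1] if selected_files else None

files = []
latest = add_selections(files, ("a.dat", "b.dat"))
```

The callback would then display the returned name with tk_info.set() just as select_file does for a single file.<br />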
<br />
<br />
<br />
'''Matplotlib from Tk on the Desktop'''<br />
<br />
<br />
For matplotlib we need <br />
<br />
import matplotlib as mpl<br />
import matplotlib.pyplot as plt<br />
mpl.use('TkAgg')<br />
<br />
The Plot button call back uses matplotlib with its pyplot namespace to create a plot on the matplotlib canvas. The plot is not embedded in the Tk user interface; it opens in its own window so that the matplotlib toolbar remains available, since embedding the toolbar in a Tk window was deprecated in matplotlib 2.2. This solution avoids that issue, but it also means that the content of the displayed data cannot be updated through the Tk interface.<br />
<br />
# Create the desired plot with matplotlib<br />
<br />
def make_plot(event=None):<br />
<br />
global selected_files<br />
global x_axis_label<br />
global y_axis_label<br />
<br />
nfiles = len(selected_files)<br />
this_file = selected_files[nfiles-1]<br />
<br />
read_file(this_file)<br />
<br />
# Create the plot.<br />
plt.figure(nfiles)<br />
plt.plot(x_data, y_data, lw=3)<br />
plt.title(this_file)<br />
plt.xlabel(x_axis_label)<br />
plt.ylabel(y_axis_label)<br />
plt.show()<br />
<br />
<br />
Input is handled through global variables, and the axis labels may be assigned through the Tk interface, though in tk_plot.py that is left for the next version. <br />
<br />
<br />
[[File:Humidity_tk.png]]<br />
<br />
<br />
<br />
<br />
<br />
'''Bokeh from Tk in the Browser and on the Web'''<br />
<br />
We include the bokeh modules needed for a basic plot<br />
<br />
from bokeh.plotting import figure, output_file, show<br />
<br />
For bokeh the call back is very similar<br />
<br />
# Create the desired plot with bokeh<br />
<br />
def make_plot(event=None):<br />
<br />
global selected_files<br />
global x_axis_label<br />
global y_axis_label<br />
<br />
nfiles = len(selected_files)<br />
this_file = selected_files[nfiles-1]<br />
<br />
read_file(this_file)<br />
<br />
# Create the plot using bokeh<br />
<br />
this_file_basename = os.path.basename(this_file)<br />
base, ext = os.path.splitext(this_file_basename)<br />
bokeh_file = base+".html"<br />
<br />
output_file(bokeh_file)<br />
p = figure(tools="hover,crosshair,pan,wheel_zoom,box_zoom,box_select,reset") <br />
p.line(x_data, y_data, line_width=2)<br />
show(p)<br />
<br />
<br />
The tools are explicitly requested, unlike matplotlib, which provides a fully populated toolbar by default.<br />
<br />
[[File:Humidity_bokeh.png]]<br />
<br />
<br />
<br />
<br />
<br />
=== Running a Bokeh Server for Live Plotting of Python Data ===<br />
<br />
Lastly, we arrive at the destination: a solution to interactive plotting where data are created and modified in Python, and presented to the user on the fly, with a customized and responsive interface. The interface components can be entirely in the browser, and thereby potentially offered to a web client, or they can be shared between the browser and a Python GUI, for desktop applications. There are three components:<br />
<br />
* Python backend, perhaps with Tk or another GUI, or responding to CGI requests from another server<br />
* Bokeh server responding to the Python and presenting information to the browser<br />
* Browser client, listening to the Bokeh server directly or through a proxy, and providing data to the Python backend if needed<br />
<br />
For details on how to set this up and its possible uses, see<br />
<br />
[https://bokeh.pydata.org/en/latest/docs/user_guide/server.html Running a Bokeh Server]<br />
<br />
Several examples are offered here<br />
<br />
[https://demo.bokehplots.com/ demo.bokehplots.com]<br />
<br />
The first live example shown there is from "sliders.py", a copy of which is in our [http://prancer.physics.louisville.edu/classes/650/python/examples examples directory]. Download the file, and on a computer that has Python 3 and Bokeh installed, use the command line<br />
<br />
bokeh serve sliders.py<br />
<br />
to initiate a live session in the server. Once that has started, open your browser to "localhost", that is to your own computer, by entering this in the browser's address bar<br />
<br />
http://localhost:5006/sliders<br />
<br />
The display will look like this, except the sliders will cause changes in the plot.<br />
<br />
<br />
[[File:Sliders.png]]<br />
<br />
<br />
<br />
<br />
=== Running a Server for Javascript in a Browser Engine ===<br />
<br />
Python includes packages that enable a simple webserver, which may be used to run advanced graphics operations through javascript within a browser's javascript engine. We will cover the use of javascript, and Three.js in particular, as a supplement or replacement for 3D visualization in Python. In order to do this without the burden of managing a full Apache installation, we turn to Python. This shell script in Linux will start a web server in the directory that the script is run in (the CGIHTTPServer module is from Python 2; the Python 3 equivalent is "python3 -m http.server --cgi 8000"):<br />
<br />
python -m CGIHTTPServer 8000 1>/dev/null 2>/dev/null &<br />
echo "Use localhost:8000"<br />
echo<br />
<br />
By using port 8000 the server is distinct from the one on port 80 used for web applications. The site would appear by putting <br />
<br />
http://localhost:8000<br />
<br />
in a Google Chrome or Mozilla Firefox browser window running on the same user account on the same machine. Note that the redirects of stdout and stderr to /dev/null keep output from appearing in the console. The server may be killed by identifying its process ID in Linux with the command<br />
<br />
ps -e | grep python<br />
<br />
followed by <br />
<br />
kill -s 9 pid<br />
<br />
where "pid" is the ID number found in the first line. Alternatively, if it is the only python process running you may kill it with<br />
<br />
killall python<br />
<br />
Any file in the directory tree below the starting directory is now accessible in the browser, and html files will be parsed to run the included javascript. If there is a cgi-bin directory at the top level, the server will see it and use it. One use of this low level server is to create a virtual instrument that is accessible from the web, but not exposed to it directly. A remote web server on the same network that can access port 8000 on the instrument machine can run code and get responses from the instrument by calling cgi-bin operations. <br />
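A remote machine could invoke such a cgi-bin operation with a plain HTTP request from the standard library. The sketch below is our own illustration, not part of the server above; the host, port, and script name (cgi-bin/status.py) are hypothetical placeholders for whatever the instrument actually provides:<br />

```python
from urllib.request import urlopen

def cgi_url(host, port, script):
    # Build the address of a cgi-bin operation served on the given port
    return "http://%s:%d/%s" % (host, port, script)

def call_instrument(host="localhost", port=8000, script="cgi-bin/status.py"):
    # Request the (hypothetical) cgi-bin script and return its text reply
    with urlopen(cgi_url(host, port, script), timeout=10) as reply:
        return reply.read().decode("utf-8")
```

A caller on the same network would use call_instrument() wherever it needed the instrument's latest reading, without exposing the instrument machine itself to the web.<br />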
<br />
For programmers, however, this utility allows development and debugging of web software without the need for a large server.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Python_for_Physics_and_Astronomy&diff=2547Python for Physics and Astronomy2018-04-17T06:41:11Z<p>WikiSysop: </p>
<hr />
<div>The Python programming language is a widely used tool for basic research and engineering. Its rapid rise in popularity is supported by comprehensive, largely open-source, contributions from scientists who use it for their own work. This short course offers an introduction to Python with examples drawn from physics and astronomy. <br />
<br />
<br />
This resource was developed as a component of a<br />
[http://prancer.physics.louisville.edu/astrowiki/index.php/Research_Methods Research Methods] class. Various examples that may be useful for<br />
developing small Python programs are collected [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples here]. They are the basis for a few exercises that were assigned during the course and are available [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments here].<br />
<br />
This resource is maintained so that it is reasonably current with the latest releases of Python 3 and component modules. The code discussed has been tested in Python 3.6; though some pieces of older code may still be lurking, they should be recognizable and easily modified if errors occur.<br />
<br />
<br />
The topics and examples covered --<br />
<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/Programming_for_Physics_and_Astronomy Why program? Choosing a language.] <br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/Very_simple_Python Very simple Python]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Very_simple_Python#Installing_Python_on_your_computer Installing it on your computer]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Very_simple_Python#IDE.27s_and_Editors_and_Python_environments Editors and environments]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Very_simple_Python#Using_Python_in_real_time Using it in real time]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Very_simple_Python#Using_Python_code_as_a_standalone_program Using code as a standalone program]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Examples] <br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Assignments]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/Elements_of_Python_programming Elements of Python programming]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Elements_of_Python_programming#Input_and_output Input and output]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Elements_of_Python_programming#Numbers.2C_text.2C_and_data_types Data types: numbers and strings]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Elements_of_Python_programming#Lists.2C_tuples.2C_dictionaries.2C_and_statements Lists, tuples, dictionaries, and statements]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Examples] <br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Assignments]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/Solving_problems_with_Python Solving problems with Python]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Solving_problems_with_Python#Flow.2C_control Flow control]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Solving_problems_with_Python#Functions Functions]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Solving_problems_with_Python#Iteration Iteration]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Examples]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Assignments]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python Graphical User Interfaces with Python]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python#A_Tk_Tutorial A Tk Tutorial]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python#Building_a_Program Building a Program]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python#Interfacing_an_Instrument:_Phidgets Interfacing an Instrument]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python#Events_and_Control_Within_the_User_Interface Events and Control]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python#Widgets Widgets]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python Graphics with Python]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python#Installation_of_matplotlib Matplotlib]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python#Learning_the_basics_of_2D_data_and_function_plotting Learning the basics of 2D data and function plotting]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python#Interactive_plotting Interactive plotting]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python#A_little_3D_plotting A little 3D plotting]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python#Bokeh Bokeh]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Examples]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Assignments]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits NumPy, SciPy and SciKits]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#NumPy NumPy]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Arrays Arrays]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Indexing Indexing]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Functions_Broadcasting_Over_an_Array Functions and Broadcasting]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Matrix_and_Vector_Math_in_NumPy Matrix and Vector Math]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Fourier_Transforms_in_NumPy Fourier Transforms in NumPy]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#SciPy_and_SciKits SciPy and SciKits]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Interpolation Interpolation]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Integration Integration]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Differentiation Differentiation]<br />
###[http://prancer.physics.louisville.edu/astrowiki/index.php/NumPy,_SciPy_and_SciKits#Statistics Statistics]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Examples]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Assignments]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy Image processing with Python and SciPy]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy#Python_Imaging_Library_-_PIL Python Imaging Library - PIL]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy#Images_with_NumPy_and_SciPy Images with NumPy and SciPy]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy#Astronomical_FITS_Files Astronomical FITS Files]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy#Other_Processing Other Processing]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy#SciKits SciKits]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy#AstroImageJ_and_Alsvid AstroImageJ and Alsvid]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Examples]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Assignments]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/User_Interfaces User Interfaces]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/User_Interfaces#Command_Line_Interfacing_and_Access_to_the_Operating_System Command Line Interfacing and Access to the Operating System] <br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/User_Interfaces#Graphical_User_Interface_to_Plotting Graphical User Interface to Plotting]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/User_Interfaces#Running_a_Bokeh_Server_for_Live_Plotting_of_Python_Data Running a Bokeh Server for Live Plotting of Python Data]<br />
##[http://prancer.physics.louisville.edu/astrowiki/index.php/User_Interfaces#Running_a_Server_for_Javascript_in_a_Browser_Engine Running a Server for Javascript in a Browser Engine]<br />
#[http://prancer.physics.louisville.edu/astrowiki/index.php/How_to_Create_a_Javascript_Program How to create a javascript program]<br />
<br />
<br />
These topics may be added to a longer version of this course or as working notes when time allows.<br />
<br />
#Bayesian methods with Python and Markov Chain Monte Carlo (MCMC) analyses<br />
#Real world interfacing<br />
##Instrumentation and communication<br />
##Serial ports<br />
##USB<br />
##Ethernet and TCPIP<br />
#Parallel processing <br />
##Using all the processors (CPUs) in your computer<br />
##Using graphical processing units (GPUs)<br />
##Artificial intelligence computing with tensor processing units (TPUs)<br />
#Working with the web<br />
##HTTP servers<br />
##Getting data from servers<br />
##Sending data to servers<br />
##Using Python with the Common Gateway Interface (CGI) <br />
##Programming for server-side processing<br />
#Python and other languages<br />
##Bash scripting in Unix-like systems<br />
##Gnu Data Language (GDL) as a replacement for IDL or bridge to Python<br />
##Very simple C<br />
##Connecting Python to the browser engine<br />
##Chrome and Firefox for web development<br />
##Java for astronomical calculations: AstroCC and AstroImageJ</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=File:Sliders.png&diff=2546File:Sliders.png2018-04-17T06:35:24Z<p>WikiSysop: An example of a Bokeh server Python-generated interactive plot with sliders.</p>
<hr />
<div>An example of a Bokeh server Python-generated interactive plot with sliders.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=User_Interfaces&diff=2545User Interfaces2018-04-17T06:34:16Z<p>WikiSysop: </p>
<hr />
<div>As part of our short course on [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy Python for Physics and Astronomy] we consider how users interact with their computing environment. A programming language such as Python provides tools to build code that computes scientific models, captures data, sorts it and analyzes it largely without operator action. In effect, once you have written the program, you point it at the data or task it is to do, and wait for it to return new science to you. This is the command line, or batch, model of computing and is at the core of large data science today. Indeed, from your handheld devices to supercomputers, the work that is done is for the most part autonomous. We have seen how Python has built-in components to accept input from the command line, the operating system, the computer that is hosting the program, and the Internet or cloud. What about the other side, the user's perspective on computing?<br />
<br />
As an end user, would you prefer to move a mouse or tap a screen in order to select a file, or to type in the path and file name? What if you had to make operational decisions based on graphical output, or changing real world environments as data are collected? In modern computing, most of us interact with the machine and software through a graphical user interface or GUI. These tools create that option.<br />
<br />
'''On-Line Guides'''<br />
<br />
*[http://www.tkdocs.com/ Tk]<br />
*[https://matplotlib.org/users/index.html Matplotlib]<br />
*[https://bokeh.pydata.org/en/latest/docs/reference.html#refguide Bokeh]<br />
<br />
While the conventional Tk and Matplotlib components are foundational to Python, Bokeh is a more recent development with a design philosophy that puts the web first for the end user, and it has a contemporary look. It also enables adding widgets written for javascript within the web display, which can be very effective. <br />
<br />
<br />
<br />
=== Command Line Interfacing and Access to the Operating System ===<br />
<br />
<br />
In a Unix-like environment (Linux or macOS), the command line is an accessible and often preferred way to instruct a program on what to do. A typical program, as we've seen, might start like this example to interpolate a data file and plot the result:<br />
<br />
#!/usr/bin/python<br />
<br />
import sys<br />
import numpy as np<br />
from scipy.interpolate import UnivariateSpline<br />
import matplotlib.pyplot as plt<br />
<br />
sfactorflag = True<br />
<br />
if len(sys.argv) == 1:<br />
print(" ")<br />
print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
print(" ")<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
elif len(sys.argv) == 4:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactorflag = False<br />
elif len(sys.argv) == 5:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactor = float(sys.argv[4]) <br />
else:<br />
print(" ")<br />
print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
print(" ")<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
<br />
It uses "sys" to parse the command line arguments into text and numbers that control what the program will do. Because its first line directs the system to use the python interpreter, if the program is marked as executable to the user it will run as a single command followed by arguments. In this case it would be something like<br />
<br />
interpolate_data.py indata.dat outdata.dat nout sfactor<br />
<br />
where indata.dat is a text-based data file of x,y pairs, one pair per line, outdata.dat is the interpolated file, nout is the number of points to be interpolated, and sfactor is an optional floating point smoothing factor. When you run this it will read the files, do the interpolation without further interaction, and (as written) plot a result as well as write out a data file. The rest of the code is<br />
<br />
# Take x,y coordinates from a plain text file<br />
# Open the file with data<br />
infp = open(infile, 'r')<br />
# Read all the lines into a list<br />
intext = infp.readlines()<br />
infp.close()<br />
# Split data text and parse into x,y values<br />
# Create empty lists<br />
xdata = []<br />
ydata = []<br />
i = 0<br />
for line in intext:<br />
    try:<br />
        # Treat the case of a plain text comma separated entry<br />
        entry = line.strip().split(",")<br />
        # Get the x,y values for these fields<br />
        xval = float(entry[0])<br />
        yval = float(entry[1])<br />
        xdata.append(xval)<br />
        ydata.append(yval)<br />
        i = i + 1<br />
    except (ValueError, IndexError):<br />
        try:<br />
            # Treat the case of a plain text blank space separated entry<br />
            entry = line.strip().split()<br />
            xval = float(entry[0])<br />
            yval = float(entry[1])<br />
            xdata.append(xval)<br />
            ydata.append(yval)<br />
            i = i + 1<br />
        except (ValueError, IndexError):<br />
            pass<br />
# How many points found?<br />
nin = i<br />
if nin < 1:<br />
    sys.exit('No objects found in %s' % (infile,))<br />
<br />
<br />
# Import the data into numpy arrays<br />
x = np.array(xdata)<br />
y = np.array(ydata)<br />
<br />
<br />
# Function to interpolate the data with a univariate cubic spline<br />
if sfactorflag:<br />
    f_interpolated = UnivariateSpline(x, y, k=3, s=sfactor)<br />
else:<br />
    f_interpolated = UnivariateSpline(x, y, k=3)<br />
<br />
<br />
# Values of x for sampling inside the boundaries of the original data<br />
x_interpolated = np.linspace(x.min(),x.max(), nout)<br />
# New values of y for these sample points<br />
y_interpolated = f_interpolated(x_interpolated)<br />
<br />
<br />
# Create a plot with labeled axes<br />
plt.figure().canvas.set_window_title(infile)<br />
plt.xlabel('X')<br />
plt.ylabel('Y')<br />
plt.title('Interpolation')<br />
plt.plot(x, y, color='red', linestyle='None', marker='.', markersize=10., label='Data')<br />
plt.plot(x_interpolated, y_interpolated, color='blue', linestyle='-', marker='None', label='Interpolated', linewidth=1.5)<br />
plt.legend()<br />
plt.minorticks_on()<br />
plt.show()<br />
<br />
<br />
# Open the output file<br />
outfp = open(outfile, 'w')<br />
# Write the interpolated data<br />
for i in range(nout):<br />
    outline = "%f %f\n" % (x_interpolated[i], y_interpolated[i])<br />
    outfp.write(outline)<br />
# Close the output file<br />
outfp.close()<br />
<br />
# Exit gracefully<br />
sys.exit()<br />
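<br />
The UnivariateSpline call at the heart of this program can be tried on its own. A minimal sketch with synthetic data (the sine samples here are illustrative, not from the program above):<br />

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Twenty samples of sin(x) over one full period
x = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(x)

# k=3 selects a cubic spline; s=0 forces the curve through every point
spline = UnivariateSpline(x, y, k=3, s=0)

# Resample on a finer grid, as the program does with nout points
x_fine = np.linspace(x.min(), x.max(), 200)
y_fine = spline(x_fine)
```

Raising s trades fidelity to the individual points for smoothness, which is what the optional sfactor argument controls.<br />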
<br />
<br />
After the fitting is done the program runs pyplot to display the results. The interactive window it opens and manages is a GUI, but it has been set up by the command line code. Of course there are many variations on command line interfacing; the one shown here, with hand-coded argument parsing, is perhaps the simplest and would serve as a template for most applications. Python offers other ways to manage the command line too. The os module gives access to the operating system from within a Python routine. Some examples are<br />
<br />
import os<br />
<br />
os.chdir(path) changes the current working directory (CWD) to a new one<br />
os.getcwd() returns the CWD<br />
os.getenv(varname) returns the value of the environment variable varname<br />
<br />
and there are many more, providing within the Python program many of the command line operating system tools available on the system. Here's an example of how that might be used in a program that processes many files in a directory:<br />
<br />
#!/usr/bin/python<br />
<br />
# Process images in a directory tree<br />
<br />
import os<br />
import sys<br />
import fnmatch<br />
import string<br />
import subprocess<br />
import pyfits<br />
<br />
if len(sys.argv) != 2:<br />
    print(" ")<br />
    sys.exit("Usage: process_fits.py directory\n")<br />
<br />
toplevel = sys.argv[1]<br />
<br />
# Search for files with this extension<br />
pattern = '*.fits' <br />
<br />
for dirname, dirnames, filenames in os.walk(toplevel):<br />
    for filename in fnmatch.filter(filenames, pattern):<br />
        fullfilename = os.path.join(dirname, filename)<br />
<br />
        try:<br />
<br />
            # Open a fits image file<br />
            hdulist = pyfits.open(fullfilename)<br />
<br />
        except IOError:<br />
            print('Error opening ', fullfilename)<br />
            break<br />
<br />
        # Do the work on the files here ...<br />
<br />
        # You can call a separate system process outside of Python this way<br />
        darkfile = 'dark.fits'<br />
        infilename = filename<br />
        outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_d.fits'<br />
        subprocess.call(["/usr/local/bin/fits_dark.py", infilename, darkfile, outfilename])<br />
<br />
sys.exit()<br />
<br />
Here we used the os module's routines to walk through a directory tree and parse filenames, then ran a separate command line Python program on each file. Command line tools that leverage the operating system's built-in functions can be very powerful, and can take hours out of running a program over a large data set.<br />
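<br />
The hand-coded sys.argv parsing shown earlier can also be written with the standard argparse module, which builds the usage message and type conversions for us. A sketch of an equivalent parser (not the program's actual code):<br />

```python
import argparse

def parse_args(argv=None):
    # Two file names, an integer count, and an optional smoothing factor,
    # mirroring: interpolate_data.py indata.dat outdata.dat nout [sfactor]
    parser = argparse.ArgumentParser(
        description="Interpolate data with a univariate spline")
    parser.add_argument("infile", help="input file of x,y pairs")
    parser.add_argument("outfile", help="interpolated output file")
    parser.add_argument("nout", type=int, help="number of interpolated points")
    parser.add_argument("sfactor", type=float, nargs="?", default=None,
                        help="optional smoothing factor")
    return parser.parse_args(argv)
```

Running with -h prints the usage summary automatically, and a wrong argument count exits with an error message rather than falling through silently.<br />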
<br />
<br />
=== Graphical User Interface to Plotting ===<br />
<br />
First, read the [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python comprehensive section on Tkinter] to see how that code works, and then the one on [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python graphics with Python] to learn the basics of the plotting toolkits. In this section we combine Tk for control with interactive graphics. Our goals are to<br />
<br />
* Retain the features of the graphics display with its interactivity and style<br />
* Use tkinter to offer the user access to new features such as loading files and processing data<br />
* Allow real-time updating so that the plot can follow changing data<br />
<br />
To this end we will write a Python 3 program that uses tkinter, and add matplotlib or bokeh, to make useful tools that also serve as templates for your own development. The two resulting programs are almost identical except for the plotting functions; you will find them on the [http://prancer.physics.louisville.edu/classes/650/python/examples/ examples page]. Look for "tk_plot.py" and "bokeh_plot.py".<br />
<br />
Before we begin, check that bokeh and tkinter are available in your version of Python 3. The version of Tk should be at least 8.6, which you can check with<br />
<br />
tkinter.TkVersion<br />
<br />
on the command line after importing tkinter. For bokeh, use<br />
<br />
bokeh.__version__<br />
<br />
that is, with two underscores before and after "version". Look for version 0.12.15 or greater to have the functionality described here.<br />
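<br />
These checks can be gathered into a short helper that reports whichever packages are present. A sketch (the function name here is ours, not part of tkinter or bokeh):<br />

```python
import importlib

def package_versions(names):
    # Return {name: version, or None if the module cannot be imported}
    found = {}
    for name in names:
        try:
            mod = importlib.import_module(name)
        except ImportError:
            found[name] = None
        else:
            # Most packages use __version__; tkinter exposes TkVersion instead
            found[name] = getattr(mod, "__version__",
                                  getattr(mod, "TkVersion", None))
    return found

print(package_versions(["tkinter", "bokeh"]))
```
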
<br />
<br />
'''The Tk Framework'''<br />
<br />
We begin our code as usual by requiring these libraries<br />
<br />
<br />
import tkinter as tk<br />
from tkinter import ttk<br />
from tkinter import filedialog<br />
from tkinter import messagebox<br />
<br />
such that Tk functions take the "tk." prefix and ttk functions use "ttk.". We have also included the file dialog and message box widgets that were mentioned in the summary of Tk widgets.<br />
<br />
For connection to the operating system we need "os" and "sys", and for handling data we use numpy<br />
<br />
import os<br />
import sys<br />
import numpy as np<br />
<br />
There are global variables that are used to pass information from file handlers and processing to the graphics components<br />
<br />
global selected_files<br />
global x_data<br />
global y_data<br />
<br />
selected_files = []<br />
x_data = np.zeros(1024)<br />
y_data = np.zeros(1024)<br />
x_axis_label = ""<br />
y_axis_label = ""<br />
<br />
We will create a Tk window with button or other widgets that require call backs when they are activated. Since these programs are templates for what can be done, look at the examples to see how the call backs are structured. The one to read a data file illustrates how to use Python to parse a file and save its data in numpy arrays.<br />
<br />
def read_file(infile):<br />
<br />
    global x_data<br />
    global y_data<br />
<br />
    datafp = open(infile, 'r')<br />
    datatext = datafp.readlines()<br />
    datafp.close()<br />
<br />
    # How many lines were there?<br />
<br />
    nlines = len(datatext)<br />
<br />
    # Filling fixed-size arrays is much faster than appending on the fly<br />
<br />
    x_data = np.zeros((nlines))<br />
    y_data = np.zeros((nlines))<br />
<br />
    # Parse the lines into the data<br />
<br />
    i = 0<br />
    for line in datatext:<br />
<br />
        # Skip a comment line<br />
<br />
        if line[0] == "#":<br />
            continue<br />
<br />
        # Treat the case of plain text comma separated entries<br />
<br />
        try:<br />
            entry = line.strip().split(",")<br />
            x_data[i] = float(entry[0])<br />
            y_data[i] = float(entry[1])<br />
            i = i + 1<br />
        except (ValueError, IndexError):<br />
<br />
            # Treat the case of space separated entries<br />
<br />
            try:<br />
                entry = line.strip().split()<br />
                x_data[i] = float(entry[0])<br />
                y_data[i] = float(entry[1])<br />
                i = i + 1<br />
            except (ValueError, IndexError):<br />
                pass<br />
<br />
    # Trim the arrays to the number of points actually parsed<br />
<br />
    x_data = x_data[:i]<br />
    y_data = y_data[:i]<br />
<br />
    return<br />
<br />
Notice how we allow for both comma separated and space delimited data. The expectation is that the file has two values per line, the first being "x" and the second "y"; they may be separated by white space or by a comma. Files written this way are very common and easy to use, but we may not know before reading one which style it was written in. Also, as is common (in Grace, for example), a "#" at the beginning of a line indicates a comment, and the reader simply skips such lines. A more advanced reader would validate the numbers as they come in to prevent errors later. This one simply assigns them to two global arrays, one for x and one for y, because that is the format both matplotlib and bokeh require for plotting 2D data. Having the data in numpy also opens options for further processing driven from the GUI.<br />
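<br />
Much of this parsing can also be delegated to numpy. As a sketch, here is a reader that tries comma-delimited parsing and falls back to whitespace, with loadtxt skipping the "#" comment lines in both cases (this helper is an alternative, not the code in tk_plot.py):<br />

```python
import io
import numpy as np

def read_xy(text):
    # delimiter="," handles comma separated pairs; None means any whitespace
    for delimiter in (",", None):
        try:
            data = np.loadtxt(io.StringIO(text), delimiter=delimiter,
                              comments="#", ndmin=2)
            return data[:, 0], data[:, 1]
        except ValueError:
            continue
    raise ValueError("unrecognized data format")

x, y = read_xy("# a comment\n1, 2\n3, 4\n")
```

A comma file fails the whitespace pass and vice versa, so whichever delimiter succeeds first is the one used.<br />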
<br />
The file that is being read has been selected with a Tk widget that returns filenames in a global list <br />
<br />
def select_file():<br />
<br />
    global selected_files<br />
<br />
    # Use the tk file dialog to identify file(s)<br />
<br />
    newfile = ""<br />
    try:<br />
        newfile, = tk.filedialog.askopenfilenames()<br />
        selected_files.append(newfile)<br />
    except ValueError:<br />
        tk_info.set("No file selected")<br />
<br />
    if newfile != "":<br />
        tk_info.set("Latest file: " + newfile)<br />
<br />
    return<br />
<br />
By holding onto all the selections in a list, we retain the option of going back to them later. Here in the file selection call back, however, we take only a single selected file to add to that list; we could, of course, take all of them and process them one by one. The Tk function returns leaving the selected_files list with the new entry last, and displays its name on the user interface.<br />
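<br />
Extending the call back to work through every selection rather than just the newest one needs only a loop over that list. A sketch, where process_one stands in for whatever read-and-plot routine the program uses (both names here are hypothetical):<br />

```python
def process_all(selected_files, process_one):
    # Visit each accumulated selection once, oldest first,
    # skipping any file the user picked more than once
    seen = set()
    for path in selected_files:
        if path not in seen:
            seen.add(path)
            process_one(path)
    return len(seen)
```
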
<br />
<br />
<br />
'''Matplotlib from Tk on the Desktop'''<br />
<br />
<br />
For matplotlib we need <br />
<br />
import matplotlib as mpl<br />
mpl.use('TkAgg')<br />
import matplotlib.pyplot as plt<br />
<br />
The Plot button call back uses matplotlib's pyplot interface to create a plot on the matplotlib canvas. Note that mpl.use('TkAgg') must come before pyplot is imported. The plot is deliberately not embedded in the Tk user interface, so that the matplotlib toolbar, whose Tk embedding support is deprecated in version 2.2, can still be used. This avoids that issue, but it also means the content of the displayed data cannot be updated through the Tk interface.<br />
<br />
# Create the desired plot with matplotlib<br />
<br />
def make_plot(event=None):<br />
<br />
    global selected_files<br />
    global x_axis_label<br />
    global y_axis_label<br />
<br />
    nfiles = len(selected_files)<br />
    this_file = selected_files[nfiles-1]<br />
<br />
    read_file(this_file)<br />
<br />
    # Create the plot.<br />
    plt.figure(nfiles)<br />
    plt.plot(x_data, y_data, lw=3)<br />
    plt.title(this_file)<br />
    plt.xlabel(x_axis_label)<br />
    plt.ylabel(y_axis_label)<br />
    plt.show()<br />
<br />
<br />
Input is handled through global variables, and the axis labels may be assigned through the Tk interface, though in tk_plot.py that is left for the next version. <br />
<br />
<br />
[[File:Humidity_tk.png]]<br />
<br />
<br />
<br />
<br />
<br />
'''Bokeh from Tk in the Browser and on the Web'''<br />
<br />
We include the bokeh modules needed for a basic plot<br />
<br />
from bokeh.plotting import figure, output_file, show<br />
<br />
For bokeh the call back is very similar<br />
<br />
# Create the desired plot with bokeh<br />
<br />
def make_plot(event=None):<br />
<br />
    global selected_files<br />
    global x_axis_label<br />
    global y_axis_label<br />
<br />
    nfiles = len(selected_files)<br />
    this_file = selected_files[nfiles-1]<br />
<br />
    read_file(this_file)<br />
<br />
    # Create the plot using bokeh<br />
<br />
    this_file_basename = os.path.basename(this_file)<br />
    base, ext = os.path.splitext(this_file_basename)<br />
    bokeh_file = base + ".html"<br />
<br />
    output_file(bokeh_file)<br />
    p = figure(tools="hover,crosshair,pan,wheel_zoom,box_zoom,box_select,reset")<br />
    p.line(x_data, y_data, line_width=2)<br />
    show(p)<br />
<br />
<br />
The tools are explicitly requested here, unlike matplotlib, which provides a fully populated toolbar.<br />
<br />
[[File:Humidity_bokeh.png]]<br />
<br />
<br />
<br />
<br />
<br />
=== Running a Server for Javascript in a Browser Engine ===<br />
<br />
Python includes packages that implement a simple web server, which may be used to run advanced graphics through javascript within a browser's javascript engine. We will cover the use of javascript, and Three.js in particular, as a supplement or replacement for 3D visualization in Python. In order to do this without the burden of managing a full Apache installation, we turn to Python. This Linux shell script starts a web server in the directory the script is run from; note that CGIHTTPServer is the Python 2 module, and with Python 3 the equivalent command is "python3 -m http.server --cgi 8000":<br />
<br />
python -m CGIHTTPServer 8000 1>/dev/null 2>/dev/null &<br />
echo "Use localhost:8000"<br />
echo<br />
<br />
By using port 8000 the server is distinct from the one on port 80 used for web applications. The site would appear by putting <br />
<br />
http://localhost:8000<br />
<br />
in a Google Chrome or Mozilla Firefox browser window running under the same user account on the same machine. Note that the redirects of stdout and stderr to /dev/null keep the server's output from appearing in the console. The server may be killed by identifying its process ID in Linux with the command<br />
<br />
ps -e | grep python<br />
<br />
followed by <br />
<br />
kill -s 9 pid<br />
<br />
where "pid" is the process ID number found in the first column of that output. Alternatively, if it is the only python process running you may kill it with<br />
<br />
killall python<br />
<br />
Any file in the directory tree below the starting directory is now accessible in the browser, and html files will be parsed to run the included javascript. If there is a cgi-bin directory at the top level, the server will see it and use it. One use of this low-level server is to create a virtual instrument that is accessible from the web but not exposed to it directly: a remote web server on the same network that can reach port 8000 on the instrument machine can run code and get responses from the instrument through cgi-bin requests. <br />
<br />
For programmers, however, this utility allows development and debugging of web software without the need for a large server.<br />
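<br />
The same server can also be started from inside Python 3 rather than from the shell. A minimal sketch using the standard http.server module (port 8000, matching the shell script above):<br />

```python
from http.server import HTTPServer, CGIHTTPRequestHandler

def run(port=8000):
    # Serve the current directory; scripts under ./cgi-bin are executed
    httpd = HTTPServer(("localhost", port), CGIHTTPRequestHandler)
    print("Use localhost:%d" % port)
    httpd.serve_forever()
```

Calling run() blocks, exactly like the shell version, until the process is killed.<br />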
<br />
<br />
=== Running a Bokeh Server for Live Plotting of Python Data ===<br />
<br />
Lastly, we arrive at the destination: a solution to interactive plotting where data are created and modified in Python, and presented to the user on the fly, with a customized and responsive interface. The interface components can be entirely in the browser, and thereby potentially offered to a web client, or they can be shared between the browser and a Python GUI, for desktop applications. There are three components:<br />
<br />
* Python backend, perhaps with Tk or another GUI, or responding to CGI requests from another server<br />
* Bokeh server responding to the Python and presenting information to the browser<br />
* Browser client, listening to the Bokeh server directly or through a proxy, and providing data to the Python backend if needed<br />
<br />
For details on how to set this up and its possible uses, see<br />
<br />
[https://bokeh.pydata.org/en/latest/docs/user_guide/server.html Running a Bokeh Server]<br />
<br />
Several examples are offered here<br />
<br />
[https://demo.bokehplots.com/ demo.bokehplots.com]<br />
<br />
The first live example shown there is from "sliders.py", a copy of which is in our [http://prancer.physics.louisville.edu/classes/650/python/examples examples directory]. Download the file, and on a computer that has Python 3 and Bokeh installed, use the command line<br />
<br />
bokeh serve sliders.py<br />
<br />
to initiate a live session in the server. Once it has started, open your browser to "localhost", that is, to your own computer, by entering this in the browser's address bar<br />
<br />
http://localhost:5006/sliders<br />
<br />
The display will look like this, except the sliders will cause changes in the plot.<br />
<br />
<br />
[[File:Sliders.png]]</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=User_Interfaces&diff=2544User Interfaces2018-04-17T06:21:31Z<p>WikiSysop: </p>
<hr />
<div>As part of our short course on [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy Python for Physics and Astronomy] we consider how users interact with their computing environment. A programming language such as Python provides tools to build code that computes scientific models, captures data, sorts it and analyzes it largely without operator action. In effect, once you have written the program, you point it at the data or task it is to do, and wait for it to return new science to you. This is the command line, or batch, model of computing and is at the core of large data science today. Indeed, from your handheld devices to supercomputers, the work that is done is for the most part autonomous. We have seen how Python has built-in components to accept input from the command line, the operating system, the computer that is hosting the program, and the Internet or cloud. What about the other side, the user's perspective on computing?<br />
<br />
As an end user, would you prefer to move a mouse or tap a screen in order to select a file, or to type in the path and file name? What if you had to make operational decisions based on graphical output, or changing real world environments as data are collected? In modern computing, most of us interact with the machine and software through a graphical user interface or GUI. These tools create that option.<br />
<br />
'''On-Line Guides'''<br />
<br />
*[http://www.tkdocs.com/ Tk]<br />
*[https://matplotlib.org/users/index.html Matplotlib]<br />
*[https://bokeh.pydata.org/en/latest/docs/reference.html#refguide Bokeh]<br />
<br />
While the conventional Tk and Matplotlib components are foundational to Python, Bokeh is a very recent development with the design philosophy of putting the web first for the end user, and it has a contemporary look. It also enables adding widgets written for javascript within the web display, which can be very effective. <br />
<br />
<br />
<br />
=== Command Line Interfacing and Access to the Operating System ===<br />
<br />
<br />
In a Unix-like enviroment (Linux or MacOSX), the command line is an accessible and often preferred way to instruct a program on what to do. A typical program, as we've seen, might start like this example to interpolate a data file and plot the result:<br />
<br />
#!/usr/bin/python<br />
<br />
import sys<br />
import numpy as np<br />
from scipy.interpolate import UnivariateSpline<br />
import matplotlib.pyplot as plt<br />
<br />
sfactorflag = True<br />
<br />
if len(sys.argv) == 1:<br />
print " "<br />
print "Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]"<br />
print " "<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
elif len(sys.argv) == 4:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactorflag = False<br />
elif len(sys.argv) == 5:<br />
infile = sys.argv[1]<br />
outfile = sys.argv[2]<br />
nout = int(sys.argv[3])<br />
sfactor = float(sys.argv[4]) <br />
else:<br />
print " "<br />
print "Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]"<br />
print " "<br />
sys.exit("Interpolate data with a univariate spline\n")<br />
<br />
It uses "sys" to parse the command line arguments into text and numbers that control what the program will do. Because its first line directs the system to use the python interpreter, if the program is marked as executable to the user it will run as a single command followed by arguments. In this case it would be something like<br />
<br />
interpolate_data.py indata.dat outdata.dat nout sfactor<br />
<br />
where indata.dat is a text-based data file of x,y pairs, one pair per line, outdata.dat is the interpolated file, nout is the number of points to be interpolated, and sfactor is an optional floating point smoothing factor. When you run this it will read the files, do the interpolation without further interaction, and (as written) plot a result as well as write out a data file. The rest of the code is<br />
<br />
# Take x,y coordinates from a plain text file<br />
# Open the file with data<br />
infp = open(infile, 'r')<br />
# Read all the lines into a list<br />
intext = infp.readlines()<br />
# Split data text and parse into x,y values <br />
# Create empty lists<br />
xdata = []<br />
ydata = []<br />
i = 0 <br />
for line in intext: <br />
try:<br />
# Treat the case of a plain text comma separated entry <br />
entry = line.strip().split(",") <br />
# Get the x,y values for these fields<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except: <br />
try: <br />
# Treat the case of a plane text blank space separated entry<br />
entry = line.strip().split()<br />
xval = float(entry[0])<br />
yval = float(entry[1])<br />
xdata.append(xval)<br />
ydata.append(yval)<br />
i = i + 1 <br />
except:<br />
pass <br />
# How many points found? <br />
nin = i<br />
if nin < 1:<br />
sys.exit('No objects found in %s' % (infile,))<br />
<br />
<br />
# Import data into a np arrays <br />
x = np.array(xdata)<br />
y = np.array(ydata)<br />
<br />
<br />
# Function to interpolate the data with a univariate cubic spline<br />
if sfactorflag:<br />
f_interpolated = UnivariateSpline(x, y, k=3, s=sfactor)<br />
else:<br />
f_interpolated = UnivariateSpline(x, y, k=3)<br />
<br />
<br />
# Values of x for sampling inside the boundaries of the original data<br />
x_interpolated = np.linspace(x.min(),x.max(), nout)<br />
# New values of y for these sample points<br />
y_interpolated = f_interpolated(x_interpolated)<br />
<br />
<br />
# Create an plot with labeled axes<br />
plt.figure().canvas.set_window_title(infile)<br />
plt.xlabel('X')<br />
plt.ylabel('Y')<br />
plt.title('Interpolation')<br />
plt.plot(x, y, color='red', linestyle='None', marker='.', markersize=10., label='Data')<br />
plt.plot(x_interpolated, y_interpolated, color='blue', linestyle='-', marker='None', label='Interpolated', linewidth=1.5)<br />
plt.legend()<br />
plt.minorticks_on()<br />
plt.show()<br />
<br />
<br />
# Open the output file<br />
outfp = open(outfile, 'w')<br />
# Write the interpolated data<br />
for i in range(nout): <br />
outline = "%f %f\n" % (x[i],y[i])<br />
outfp.write(outline)<br />
# Close the output file<br />
outfp.close()<br />
<br />
# Exit gracefully<br />
exit()<br />
<br />
<br />
Aftet the fitting is done the program runs pyplot to display the results. The interactive window it opens and manages is a GUI, but it has been set up by the command line code. Of course there are many variations on command line interfacing, and the one shown here with coded argument parsing is perhaps the simplest and would serve as a template for most applications. Python offers other ways to manage the command line too. The os module is useful to have access to the operating system from within a Python routine. Some examples are<br />
<br />
import os<br />
<br />
os.chdir(path) changes the current working directory (CWD) to a new one<br />
os.getcdw() returns the CWD<br />
os.getenv(varname) returns the value of the environment variable varname<br />
<br />
and there are many more, providing within the Python program many of the command line operating system tools available on the system. Here's an example of how that might be used in a program that processes many files in a directory:<br />
<br />
#!/usr/bin/python<br />
<br />
# Process images in a directory tree<br />
<br />
import os<br />
import sys<br />
import fnmatch<br />
import string<br />
import subprocess<br />
import pyfits<br />
<br />
if len(sys.argv) != 2:<br />
print " "<br />
sys.exit("Usage: process_fits.py directory\n")<br />
<br />
toplevel = sys.argv[1]<br />
<br />
# Search for files with this extension<br />
pattern = '*.fits' <br />
<br />
for dirname, dirnames, filenames in os.walk(toplevel):<br />
for filename in fnmatch.filter(filenames, pattern):<br />
fullfilename = os.path.join(dirname, filename)<br />
<br />
try: <br />
<br />
# Open a fits image file<br />
hdulist = pyfits.open(fullfilename)<br />
<br />
except IOError: <br />
print 'Error opening ', fullfilename<br />
break <br />
<br />
# Do the work on the files here ...<br />
<br />
# You can call a separate system process outside of Python this way<br />
darkfile = 'dark.fits'<br />
infilename = filename<br />
outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_d.fits'<br />
subprocess.call(["/usr/local/bin/fits_dark.py", infilename, darkfile, outfilename]) <br />
<br />
exit()<br />
<br />
Here we used the os module's routines to walk through a directory tree, parse filenames, and then perform another operation on those files that is a separate command line Python program. Command line tools used to leverage the operating system's built-in functions can be very powerful, and take hours out of actually running a program on a large database.<br />
<br />
<br />
=== Graphical User Interface to Plotting ===<br />
<br />
First, read the [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python comprehensive section on Tkinter] to see how that code works, and then the one on [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python graphics with Python] to learn the basics of the plotting toolkits. In this section we combine Tk for control with interactive graphics. Our goals are to<br />
<br />
* Retain the features of the graphics display with its interactivity and style<br />
* Use tkinter to offer the user access to new features such loading files and processing data<br />
* Allow real-time updating so that the plot can follow changing data<br />
<br />
To this end we will write a Python 3 program that uses tkinter and add matplotlib or bokeh to make useful tools that also serve as templates of your own development. The two resulting programs are almost identical except for the plotting functions, and you will find them on the [http://prancer.physics.louisville.edu/classes/650/python/examples/ examples page]. Look for "tk_plot.py" and "bokeh_plot.py".<br />
<br />
Before we begin, check that bokeh and tkinter are available in your version of Python 3. The version of Tk should be at least 8.6, which you can check with<br />
<br />
tkinter.TkVersion<br />
<br />
on the command line after importing tkinter. For bokeh, use<br />
<br />
bokeh.__version__<br />
<br />
that's with two underscores before and after the "version". Look for version 0.12.15 or greater to have the functionality described here.<br />
<br />
<br />
'''The Tk Framework'''<br />
<br />
We begin our code as usual by requiring these libraries<br />
<br />
<br />
import tkinter as tk<br />
from tkinter import ttk<br />
from tkinter import filedialog<br />
from tkinter import messagebox<br />
<br />
such that Tk functions require the "tk." and ttk functions use "ttk". We have also included file dialog and message widgets that were mentioned in the summary of Tk widgets.<br />
<br />
For connection to the operating system we need "os" and "sys", and for handling data we use numpy<br />
<br />
import os<br />
import sys<br />
import numpy as np<br />
<br />
There are global variables that are used to pass information from file handlers and processing to the graphics components<br />
<br />
global selected_files<br />
global x_data<br />
global y_data<br />
<br />
selected_files = []<br />
x_data = np.zeros(1024)<br />
y_data = np.zeros(1024)<br />
x_axis_label = ""<br />
y_axis_label = ""<br />
<br />
We will create a Tk window with button or other widgets that require call backs when they are activated. Since these programs are templates for what can be done, look at the examples to see how the call backs are structured. The one to read a data file illustrates how to use Python to parse a file and save its data in numpy arrays.<br />
<br />
def read_file(infile):<br />
global x_data<br />
global y_data<br />
<br />
datafp = open(infile, 'r')<br />
datatext = datafp.readlines()<br />
datafp.close()<br />
<br />
# How many lines were there?<br />
<br />
i = 0<br />
for line in datatext:<br />
i = i + 1<br />
<br />
nlines = i<br />
<br />
# Fill the arrays for fixed size is much faster than appending on the fly<br />
<br />
x_data = np.zeros((nlines))<br />
y_data = np.zeros((nlines))<br />
<br />
# Parse the lines into the data<br />
<br />
i = 0<br />
for line in datatext:<br />
<br />
# Test for a comment line<br />
<br />
if (line[0] == "#"):<br />
pass<br />
<br />
# Treat the case of a plain text comma separated entries <br />
<br />
try:<br />
<br />
entry = line.strip().split(",") <br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
<br />
i = i + 1 <br />
except: <br />
<br />
# Treat the case of space separated entries<br />
<br />
try:<br />
entry = line.strip().split()<br />
x_data[i] = float(entry[0])<br />
y_data[i] = float(entry[1])<br />
i = i + 1<br />
except:<br />
pass<br />
<br />
return()<br />
<br />
Notice how we allow for both comma separated and space delimited data. The expectation is that the file will have two values per line, the first one being "x" and the second one being "y". They may have white space between them, or be separated by a comma. Files written this way are very common, and easy to use too, but we may not know before reading one which style it was written in. Also common (in Grace, for example), a "#" at the beginning of a line indicates a comment and implies to ignore the entire line. The reader simply skips lines that begin with "#". A more advanced reader would validate the numbers as they come in to prevent errors later. This one simply assigns them to two global arrays, one for x and one for y, because that is the format required for plotting 2D data by both matplotlib and bokeh. Also, having the data in numpy offers the options of other processing based on the GUI.<br />
<br />
The file that is being read has been selected with a Tk widget that returns filenames in a global list <br />
<br />
def select_file():

    global selected_files

    # Use the tk file dialog to identify file(s)

    newfile = ""
    try:
        newfile, = tk.filedialog.askopenfilenames()
        selected_files.append(newfile)
    except:
        tk_info.set("No file selected")

    if newfile != "":
        tk_info.set("Latest file: " + newfile)

    return()
<br />
By holding onto every selection in a list, we retain the option of going back to earlier files later. In this file selection callback, however, we keep only a single file: the unpacking "newfile, = ..." takes one name from the tuple that Tk returns. A variant could of course keep all of the selections and process them one by one. The function returns with the new entry last in the selected_files list and displays its name on the user interface.
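Since askopenfilenames returns a tuple, the variant that keeps every selection is straightforward. The helper below, "extend_selection", is a hypothetical name introduced here so the list handling can be shown separately from Tk:

```python
def extend_selection(selected_files, new_selection):
    # new_selection is the tuple tk.filedialog.askopenfilenames() returns;
    # append each chosen file so the most recent is last in the list
    for name in new_selection:
        if name:
            selected_files.append(name)
    return selected_files

# Inside the Tk callback this would be used as:
#   extend_selection(selected_files, tk.filedialog.askopenfilenames())
```

Because the most recent file is always last, the plotting callback's selected_files[nfiles-1] lookup continues to work unchanged.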
<br />
<br />
<br />
'''Matplotlib from Tk on the Desktop'''<br />
<br />
<br />
For matplotlib we need <br />
<br />
import matplotlib as mpl
mpl.use('TkAgg')
import matplotlib.pyplot as plt
<br />
The Plot button callback uses matplotlib's pyplot interface to create a plot on the matplotlib canvas. The plot is deliberately not embedded in the Tk user interface: embedding would require the matplotlib toolbar, whose Tk interface is deprecated in matplotlib 2.2. Keeping the plot in its own window avoids that issue, but it also means the displayed data cannot be updated through the Tk interface.
<br />
# Create the desired plot with matplotlib

def make_plot(event=None):

    global selected_files
    global x_axis_label
    global y_axis_label

    nfiles = len(selected_files)
    this_file = selected_files[nfiles-1]

    read_file(this_file)

    # Create the plot.

    plt.figure(nfiles)
    plt.plot(x_data, y_data, lw=3)
    plt.title(this_file)
    plt.xlabel(x_axis_label)
    plt.ylabel(y_axis_label)
    plt.show()
<br />
<br />
Input is handled through global variables, and the axis labels may be assigned through the Tk interface, though in tk_plot.py that feature is left for a future version. 
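Live updating, which the separate pyplot window does not allow, amounts to replacing a line's data in place and redrawing. A minimal sketch follows; it is shown with the non-interactive Agg backend so it runs without a display, whereas a Tk GUI would select 'TkAgg' instead:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch; "TkAgg" with a Tk GUI
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(0.0, 2.0 * np.pi, 100)
fig, ax = plt.subplots()
line, = ax.plot(x, np.sin(x), lw=2)

# Replace the line's data in place and redraw -- this is what a live
# plot does on each update instead of creating a new figure
line.set_data(x, np.cos(x))
ax.relim()
ax.autoscale_view()
fig.canvas.draw()
```

With an embedded canvas the same set_data/draw pair would run inside a Tk callback or timer, letting the plot follow changing data.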
<br />
<br />
[[File:Humidity_tk.png]]<br />
<br />
<br />
<br />
<br />
<br />
'''Bokeh from Tk in the Browser and on the Web'''<br />
<br />
We include the bokeh modules needed for a basic plot<br />
<br />
from bokeh.plotting import figure, output_file, show<br />
<br />
For bokeh the call back is very similar<br />
<br />
# Create the desired plot with bokeh

def make_plot(event=None):

    global selected_files
    global x_axis_label
    global y_axis_label

    nfiles = len(selected_files)
    this_file = selected_files[nfiles-1]

    read_file(this_file)

    # Name the output html file after the data file

    this_file_basename = os.path.basename(this_file)
    base, ext = os.path.splitext(this_file_basename)
    bokeh_file = base + ".html"

    output_file(bokeh_file)
    p = figure(tools="hover,crosshair,pan,wheel_zoom,box_zoom,box_select,reset")
    p.line(x_data, y_data, line_width=2)
    show(p)
<br />
<br />
The tools must be explicitly requested here; matplotlib, by contrast, provides a fully populated toolbar by default.
<br />
[[File:Humidity_bokeh.png]]<br />
<br />
<br />
<br />
<br />
<br />
=== Running a Server for Javascript in a Browser Engine ===<br />
<br />
Python's standard library includes a simple web server (the http.server module in Python 3; CGIHTTPServer in Python 2) which may be used to run advanced graphics operations through javascript within a browser's javascript engine. We will cover use of javascript, and Three.js in particular, as a supplement or replacement for 3D visualization in Python. To do this without the burden of managing a full Apache installation, we turn to Python. This shell script in Linux will start a web server in the directory the script is run from:
<br />
python3 -m http.server --cgi 8000 1>/dev/null 2>/dev/null &
echo "Use localhost:8000"<br />
echo<br />
<br />
By using port 8000 the server is distinct from the one on port 80 used for web applications. The site would appear by putting <br />
<br />
http://localhost:8000<br />
<br />
in a Google Chrome or Mozilla Firefox browser window running under the same user account on the same machine. Note that the redirects of stdout and stderr to /dev/null keep output from appearing in the console. The server may be killed by identifying its process ID in Linux with the command
<br />
ps -e | grep python<br />
<br />
followed by <br />
<br />
kill -s 9 pid<br />
<br />
where "pid" is the ID number found in the first column of the matching line. Alternatively, if it is the only python process running you may kill it with
<br />
killall python<br />
<br />
Any file in the directory tree below the starting directory is now accessible in the browser, and html files will be parsed to run the included javascript. If there is a cgi-bin directory at the top level, the server will see it and use it. One use of this low-level server is to create a virtual instrument that is accessible from the web but not exposed to it directly. A remote web server on the same network that can reach port 8000 on the instrument machine can run code and get responses from the instrument by calling cgi-bin operations. 
<br />
For programmers, however, this utility allows development and debugging of web software without the need for a large server.<br />
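A request handled through that cgi-bin directory is simply a program that writes an HTTP response to stdout. The sketch below is a hypothetical cgi-bin/status.py for the virtual instrument idea; the humidity reading is a stand-in for a real hardware query:

```python
#!/usr/bin/env python3
# Hypothetical cgi-bin/status.py: the CGI-enabled server runs this
# when a client requests /cgi-bin/status.py

import json
import time

def status_report():
    # A real virtual instrument would query hardware here;
    # this canned reading is only a placeholder
    return {"time": time.time(), "humidity": 41.2}

# A CGI response is plain text on stdout: a header, a blank line, a body
print("Content-Type: application/json")
print()
print(json.dumps(status_report()))
```

A remote machine allowed to reach port 8000 could then fetch http://instrument:8000/cgi-bin/status.py and parse the JSON reply.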
<br />
<br />
=== Running a Bokeh Server for Live Plotting of Python Data ===<br />
<br />
Lastly, we arrive at the destination: a solution to interactive plotting where data are created and modified in Python, and presented to the user on the fly, with a customized and responsive interface. The interface components can be entirely in the browser, and thereby potentially offered to a web client, or they can be shared between the browser and a Python GUI, for desktop applications. There are three components:<br />
<br />
* Python backend, perhaps with Tk or another GUI, or responding to CGI requests from another server<br />
* Bokeh server responding to the Python backend and presenting information to the browser<br />
* Browser client, listening to the Bokeh server directly or through a proxy, and providing data to the Python backend if needed<br />
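As a sketch of the middle component, a Bokeh server application is an ordinary script run with "bokeh serve"; a periodic callback streams new points so the browser plot follows the data. The random-walk data source below is purely illustrative, not part of tk_plot.py or bokeh_plot.py:

```python
# app.py -- run with: bokeh serve --show app.py
import numpy as np
from bokeh.models import ColumnDataSource
from bokeh.plotting import figure, curdoc

source = ColumnDataSource(data=dict(x=[0.0], y=[0.0]))

p = figure(tools="pan,wheel_zoom,box_zoom,reset", title="Live data")
p.line('x', 'y', source=source, line_width=2)

def update():
    # Append one new point; stream() sends only the new data to the browser
    new_x = source.data['x'][-1] + 1.0
    new_y = source.data['y'][-1] + np.random.normal()
    source.stream(dict(x=[new_x], y=[new_y]), rollover=500)

curdoc().add_root(p)
curdoc().add_periodic_callback(update, 100)  # update every 100 ms
```

In place of the random walk, update() could read the global x_data and y_data arrays filled by a Python backend like the one above.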
<br />
For details on how to set this up and its possible uses, see<br />
<br />
[https://bokeh.pydata.org/en/latest/docs/user_guide/server.html Running a Bokeh Server]<br />
<br />
Several examples are offered here<br />
<br />
[https://demo.bokehplots.com/ demo.bokehplots.com]</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=User_Interfaces&diff=2543User Interfaces2018-04-17T06:14:31Z<p>WikiSysop: </p>
<hr />
<div>As part of our short course on [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy Python for Physics and Astronomy] we consider how users interact with their computing environment. A programming language such as Python provides tools to build code that computes scientific models, captures data, sorts it and analyzes it largely without operator action. In effect, once you have written the program, you point it at the data or task it is to do, and wait for it to return new science to you. This is the command line, or batch, model of computing and is at the core of large data science today. Indeed, from your handheld devices to supercomputers, the work that is done is for the most part autonomous. We have seen how Python has built-in components to accept input from the command line, the operating system, the computer that is hosting the program, and the Internet or cloud. What about the other side, the user's perspective on computing?<br />
<br />
As an end user, would you prefer to move a mouse or tap a screen in order to select a file, or to type in the path and file name? What if you had to make operational decisions based on graphical output, or changing real world environments as data are collected? In modern computing, most of us interact with the machine and software through a graphical user interface or GUI. These tools create that option.<br />
<br />
'''On-Line Guides'''<br />
<br />
*[http://www.tkdocs.com/ Tk]<br />
*[https://matplotlib.org/users/index.html Matplotlib]<br />
*[https://bokeh.pydata.org/en/latest/docs/reference.html#refguide Bokeh]<br />
<br />
While the conventional Tk and Matplotlib components are foundational to Python, Bokeh is a very recent development with the design philosophy to put the web first for the end user and it has a contemporary look. It also enables adding widgets written for javascript within the web display, which can be be very effective. <br />
<br />
<br />
<br />
=== Command Line Interfacing and Access to the Operating System ===<br />
<br />
<br />
In a Unix-like enviroment (Linux or MacOSX), the command line is an accessible and often preferred way to instruct a program on what to do. A typical program, as we've seen, might start like this example to interpolate a data file and plot the result:<br />
<br />
#!/usr/bin/python<br />
<br />
import sys<br />
import numpy as np<br />
from scipy.interpolate import UnivariateSpline<br />
import matplotlib.pyplot as plt<br />
<br />
sfactorflag = True<br />
<br />
if len(sys.argv) == 1:<br />
    print(" ")<br />
    print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
    print(" ")<br />
    sys.exit("Interpolate data with a univariate spline\n")<br />
elif len(sys.argv) == 4:<br />
    infile = sys.argv[1]<br />
    outfile = sys.argv[2]<br />
    nout = int(sys.argv[3])<br />
    sfactorflag = False<br />
elif len(sys.argv) == 5:<br />
    infile = sys.argv[1]<br />
    outfile = sys.argv[2]<br />
    nout = int(sys.argv[3])<br />
    sfactor = float(sys.argv[4])<br />
else:<br />
    print(" ")<br />
    print("Usage: interpolate_data.py indata.dat outdata.dat nout [sfactor]")<br />
    print(" ")<br />
    sys.exit("Interpolate data with a univariate spline\n")<br />
<br />
It uses "sys" to parse the command line arguments into text and numbers that control what the program will do. Because its first line directs the system to use the python interpreter, if the program is marked as executable to the user it will run as a single command followed by arguments. In this case it would be something like<br />
<br />
interpolate_data.py indata.dat outdata.dat nout sfactor<br />
<br />
where indata.dat is a text-based data file of x,y pairs, one pair per line, outdata.dat is the interpolated file, nout is the number of points to be interpolated, and sfactor is an optional floating point smoothing factor. When you run this it will read the files, do the interpolation without further interaction, and (as written) plot a result as well as write out a data file. The rest of the code is<br />
<br />
# Take x,y coordinates from a plain text file<br />
# Open the file with data<br />
infp = open(infile, 'r')<br />
# Read all the lines into a list<br />
intext = infp.readlines()<br />
# Split data text and parse into x,y values <br />
# Create empty lists<br />
xdata = []<br />
ydata = []<br />
i = 0 <br />
for line in intext:<br />
    try:<br />
        # Treat the case of a plain text comma separated entry<br />
        entry = line.strip().split(",")<br />
        # Get the x,y values for these fields<br />
        xval = float(entry[0])<br />
        yval = float(entry[1])<br />
        xdata.append(xval)<br />
        ydata.append(yval)<br />
        i = i + 1<br />
    except (ValueError, IndexError):<br />
        try:<br />
            # Treat the case of a plain text blank space separated entry<br />
            entry = line.strip().split()<br />
            xval = float(entry[0])<br />
            yval = float(entry[1])<br />
            xdata.append(xval)<br />
            ydata.append(yval)<br />
            i = i + 1<br />
        except (ValueError, IndexError):<br />
            pass<br />
# How many points found?<br />
nin = i<br />
if nin < 1:<br />
    sys.exit('No objects found in %s' % (infile,))<br />
<br />
<br />
# Import data into a np arrays <br />
x = np.array(xdata)<br />
y = np.array(ydata)<br />
<br />
<br />
# Function to interpolate the data with a univariate cubic spline<br />
if sfactorflag:<br />
    f_interpolated = UnivariateSpline(x, y, k=3, s=sfactor)<br />
else:<br />
    f_interpolated = UnivariateSpline(x, y, k=3)<br />
<br />
<br />
# Values of x for sampling inside the boundaries of the original data<br />
x_interpolated = np.linspace(x.min(),x.max(), nout)<br />
# New values of y for these sample points<br />
y_interpolated = f_interpolated(x_interpolated)<br />
<br />
<br />
# Create a plot with labeled axes<br />
plt.figure().canvas.set_window_title(infile)<br />
plt.xlabel('X')<br />
plt.ylabel('Y')<br />
plt.title('Interpolation')<br />
plt.plot(x, y, color='red', linestyle='None', marker='.', markersize=10., label='Data')<br />
plt.plot(x_interpolated, y_interpolated, color='blue', linestyle='-', marker='None', label='Interpolated', linewidth=1.5)<br />
plt.legend()<br />
plt.minorticks_on()<br />
plt.show()<br />
<br />
<br />
# Open the output file<br />
outfp = open(outfile, 'w')<br />
# Write the interpolated data<br />
for i in range(nout):<br />
    outline = "%f %f\n" % (x_interpolated[i], y_interpolated[i])<br />
    outfp.write(outline)<br />
# Close the output file<br />
outfp.close()<br />
<br />
# Exit gracefully<br />
exit()<br />
<br />
<br />
After the fitting is done the program runs pyplot to display the results. The interactive window it opens and manages is a GUI, but it has been set up by the command line code. Of course there are many variations on command line interfacing, and the one shown here with coded argument parsing is perhaps the simplest; it would serve as a template for most applications. Python offers other ways to manage the command line too. The os module gives access to the operating system from within a Python routine. Some examples are<br />
<br />
import os<br />
<br />
os.chdir(path) changes the current working directory (CWD) to a new one<br />
os.getcwd() returns the CWD<br />
os.getenv(varname) returns the value of the environment variable varname<br />
<br />
and there are many more, providing within the Python program many of the command line operating system tools available on the system. Here's an example of how that might be used in a program that processes many files in a directory:<br />
<br />
#!/usr/bin/python<br />
<br />
# Process images in a directory tree<br />
<br />
import os<br />
import sys<br />
import fnmatch<br />
import string<br />
import subprocess<br />
import pyfits<br />
<br />
if len(sys.argv) != 2:<br />
    print(" ")<br />
    sys.exit("Usage: process_fits.py directory\n")<br />
<br />
toplevel = sys.argv[1]<br />
<br />
# Search for files with this extension<br />
pattern = '*.fits' <br />
<br />
for dirname, dirnames, filenames in os.walk(toplevel):<br />
    for filename in fnmatch.filter(filenames, pattern):<br />
        fullfilename = os.path.join(dirname, filename)<br />
<br />
        try:<br />
<br />
            # Open a fits image file<br />
            hdulist = pyfits.open(fullfilename)<br />
<br />
        except IOError:<br />
            print('Error opening ', fullfilename)<br />
            break<br />
<br />
        # Do the work on the files here ...<br />
<br />
        # You can call a separate system process outside of Python this way<br />
        darkfile = 'dark.fits'<br />
        infilename = filename<br />
        outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_d.fits'<br />
        subprocess.call(["/usr/local/bin/fits_dark.py", infilename, darkfile, outfilename])<br />
<br />
exit()<br />
<br />
Here we used the os module's routines to walk through a directory tree, parse filenames, and then perform another operation on those files with a separate command line Python program. Command line tools that leverage the operating system's built-in functions can be very powerful, and can save hours when a program must be run over a large data set.<br />
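The hand-coded sys.argv parsing shown above can also be written with the standard argparse module. A minimal sketch, mirroring the arguments of the interpolation example (the helper name build_parser is illustrative, not part of the original program):<br />

```python
import argparse

def build_parser():
    # Declare the same positional arguments the elif chain handled by hand
    parser = argparse.ArgumentParser(
        description="Interpolate data with a univariate spline")
    parser.add_argument("infile", help="input x,y text file")
    parser.add_argument("outfile", help="interpolated output file")
    parser.add_argument("nout", type=int, help="number of interpolated points")
    parser.add_argument("sfactor", type=float, nargs="?", default=None,
                        help="optional smoothing factor")
    return parser

# Parsing a sample command line; in a script this would be parse_args() with no list
args = build_parser().parse_args(["indata.dat", "outdata.dat", "100"])
```

argparse generates the usage message and the type conversions automatically, so the explicit elif chain collapses to a few declarations.<br />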
<br />
<br />
=== Graphical User Interface to Plotting ===<br />
<br />
First, read the [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphical_User_Interface_with_Python comprehensive section on Tkinter] to see how that code works, and then the one on [http://prancer.physics.louisville.edu/astrowiki/index.php/Graphics_with_Python graphics with Python] to learn the basics of the plotting toolkits. In this section we combine Tk for control with interactive graphics. Our goals are to<br />
<br />
* Retain the features of the graphics display with its interactivity and style<br />
* Use tkinter to offer the user access to new features such as loading files and processing data<br />
* Allow real-time updating so that the plot can follow changing data<br />
<br />
To this end we will write a Python 3 program that uses tkinter and add matplotlib or bokeh to make useful tools that also serve as templates for your own development. The two resulting programs are almost identical except for the plotting functions, and you will find them on the [http://prancer.physics.louisville.edu/classes/650/python/examples/ examples page]. Look for "tk_plot.py" and "bokeh_plot.py".<br />
<br />
Before we begin, check that bokeh and tkinter are available in your version of Python 3. The version of Tk should be at least 8.6, which you can check with<br />
<br />
tkinter.TkVersion<br />
<br />
on the command line after importing tkinter. For bokeh, use<br />
<br />
bokeh.__version__<br />
<br />
that's with two underscores before and after the "version". Look for version 0.12.15 or greater to have the functionality described here.<br />
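Checked from a script rather than the interactive prompt, the same version tests might look like this sketch (the helper name report_versions is illustrative, and either package may simply be absent, so missing imports are tolerated):<br />

```python
def report_versions():
    # Collect version information, tolerating packages that are not installed
    info = {}
    try:
        import tkinter
        info["tk"] = tkinter.TkVersion
    except ImportError:
        info["tk"] = None
    try:
        import bokeh
        info["bokeh"] = bokeh.__version__
    except ImportError:
        info["bokeh"] = None
    return info

print(report_versions())
```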
<br />
<br />
'''The Tk Framework'''<br />
<br />
We begin our code as usual by requiring these libraries<br />
<br />
<br />
import tkinter as tk<br />
from tkinter import ttk<br />
from tkinter import filedialog<br />
from tkinter import messagebox<br />
<br />
so that Tk functions are called with the "tk." prefix and the themed ttk functions with "ttk". We have also included the file dialog and message widgets that were mentioned in the summary of Tk widgets.<br />
<br />
For connection to the operating system we need "os" and "sys", and for handling data we use numpy<br />
<br />
import os<br />
import sys<br />
import numpy as np<br />
<br />
There are global variables that are used to pass information from file handlers and processing to the graphics components<br />
<br />
global selected_files<br />
global x_data<br />
global y_data<br />
<br />
selected_files = []<br />
x_data = np.zeros(1024)<br />
y_data = np.zeros(1024)<br />
x_axis_label = ""<br />
y_axis_label = ""<br />
<br />
We will create a Tk window with buttons or other widgets that require callbacks when they are activated. Since these programs are templates for what can be done, look at the examples to see how the callbacks are structured. The one that reads a data file illustrates how to use Python to parse a file and save its data in numpy arrays.<br />
<br />
def read_file(infile):<br />
    global x_data<br />
    global y_data<br />
<br />
    datafp = open(infile, 'r')<br />
    datatext = datafp.readlines()<br />
    datafp.close()<br />
<br />
    # How many lines were there?<br />
<br />
    nlines = len(datatext)<br />
<br />
    # Filling arrays of fixed size is much faster than appending on the fly<br />
<br />
    x_data = np.zeros((nlines))<br />
    y_data = np.zeros((nlines))<br />
<br />
    # Parse the lines into the data<br />
<br />
    i = 0<br />
    for line in datatext:<br />
<br />
        # Skip a comment line<br />
<br />
        if (line[0] == "#"):<br />
            continue<br />
<br />
        # Treat the case of plain text comma separated entries<br />
<br />
        try:<br />
            entry = line.strip().split(",")<br />
            x_data[i] = float(entry[0])<br />
            y_data[i] = float(entry[1])<br />
            i = i + 1<br />
        except (ValueError, IndexError):<br />
<br />
            # Treat the case of space separated entries<br />
<br />
            try:<br />
                entry = line.strip().split()<br />
                x_data[i] = float(entry[0])<br />
                y_data[i] = float(entry[1])<br />
                i = i + 1<br />
            except (ValueError, IndexError):<br />
                pass<br />
<br />
    return()<br />
<br />
Notice how we allow for both comma separated and space delimited data. The expectation is that the file will have two values per line, the first one being "x" and the second one being "y". They may have white space between them, or be separated by a comma. Files written this way are very common, and easy to use too, but we may not know before reading one which style it was written in. Also common (in Grace, for example), a "#" at the beginning of a line indicates a comment, and the entire line is to be ignored. The reader simply skips lines that begin with "#". A more advanced reader would validate the numbers as they come in to prevent errors later. This one simply assigns them to two global arrays, one for x and one for y, because that is the format required for plotting 2D data by both matplotlib and bokeh. Also, having the data in numpy offers the option of other processing based on the GUI.<br />
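The same tolerant parsing can be factored into a small function that works on any sequence of lines. This stand-alone sketch (parse_xy is a hypothetical name, not taken from tk_plot.py) returns plain lists rather than filling the global arrays:<br />

```python
def parse_xy(lines):
    # Accept comma- or space-separated x,y pairs; skip comments and unparseable lines
    xs, ys = [], []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        entry = line.split(",") if "," in line else line.split()
        try:
            xs.append(float(entry[0]))
            ys.append(float(entry[1]))
        except (ValueError, IndexError):
            pass
    return xs, ys

x, y = parse_xy(["# comment", "1, 2.5", "3 4.5", "bad line"])
```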
<br />
The file that is being read has been selected with a Tk widget that returns filenames in a global list <br />
<br />
def select_file():<br />
<br />
    global selected_files<br />
<br />
    # Use the tk file dialog to identify file(s)<br />
<br />
    newfile = ""<br />
    try:<br />
        newfile, = tk.filedialog.askopenfilenames()<br />
        selected_files.append(newfile)<br />
    except:<br />
        tk_info.set("No file selected")<br />
<br />
    if newfile != "":<br />
        tk_info.set("Latest file: " + newfile)<br />
<br />
    return()<br />
<br />
By holding onto all the selections in a list, we retain the option of going back to them later. Here in the file selection callback, however, we take only the first file that the user selects to add to that list; we could instead take all of them and process them one by one. The Tk function returns leaving the selected_files list with its new entry as the last one on the list, and displays its name on the user interface.<br />
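If we did want to act on every file chosen so far, the callback could hand the whole list to a helper like this sketch (process_all and its handler argument are hypothetical, not part of tk_plot.py):<br />

```python
def process_all(selected_files, handler):
    # Apply a handler (e.g. read_file followed by plotting) to each file in turn
    results = []
    for name in selected_files:
        results.append(handler(name))
    return results

# A trivial handler stands in for real processing here
processed = process_all(["a.dat", "b.dat"], lambda name: name.upper())
```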
<br />
<br />
<br />
'''Matplotlib from Tk on the Desktop'''<br />
<br />
<br />
For matplotlib we need <br />
<br />
import matplotlib as mpl<br />
import matplotlib.pyplot as plt<br />
mpl.use('TkAgg')<br />
<br />
The Plot button callback uses matplotlib with its pyplot namespace to create a plot on the matplotlib canvas. The plot is not embedded in the Tk user interface, in order to keep the matplotlib toolbar, whose Tk embedding is deprecated as of matplotlib 2.2. This solution avoids that issue, but it also means that it is not possible to update the content of the displayed data through the Tk interface.<br />
<br />
# Create the desired plot with matplotlib<br />
<br />
def make_plot(event=None):<br />
<br />
    global selected_files<br />
    global x_axis_label<br />
    global y_axis_label<br />
<br />
    nfiles = len(selected_files)<br />
    this_file = selected_files[nfiles-1]<br />
<br />
    read_file(this_file)<br />
<br />
    # Create the plot.<br />
    plt.figure(nfiles)<br />
    plt.plot(x_data, y_data, lw=3)<br />
    plt.title(this_file)<br />
    plt.xlabel(x_axis_label)<br />
    plt.ylabel(y_axis_label)<br />
    plt.show()<br />
<br />
<br />
Input is handled through global variables, and the axis labels may be assigned through the Tk interface, though in tk_plot.py that is left for the next version. <br />
<br />
<br />
[[File:Humidity_tk.png]]<br />
<br />
<br />
<br />
<br />
<br />
'''Bokeh from Tk in the Browser and on the Web'''<br />
<br />
We include the bokeh modules needed for a basic plot<br />
<br />
from bokeh.plotting import figure, output_file, show<br />
<br />
For bokeh the callback is very similar<br />
<br />
# Create the desired plot with bokeh<br />
<br />
def make_plot(event=None):<br />
<br />
    global selected_files<br />
    global x_axis_label<br />
    global y_axis_label<br />
<br />
    nfiles = len(selected_files)<br />
    this_file = selected_files[nfiles-1]<br />
<br />
    read_file(this_file)<br />
<br />
    # Create the plot using bokeh<br />
<br />
    this_file_basename = os.path.basename(this_file)<br />
    base, ext = os.path.splitext(this_file_basename)<br />
    bokeh_file = base + ".html"<br />
<br />
    output_file(bokeh_file)<br />
    p = figure(tools="hover,crosshair,pan,wheel_zoom,box_zoom,box_select,reset")<br />
    p.line(x_data, y_data, line_width=2)<br />
    show(p)<br />
<br />
<br />
The tools are explicitly requested, unlike matplotlib, which provides a fully populated toolbar by default.<br />
<br />
[[File:Humidity_bokeh.png]]<br />
<br />
<br />
<br />
<br />
<br />
=== Running a Server for Javascript in a Browser Engine ===<br />
<br />
Python includes packages that enable a simple web server, which may be used to run advanced graphics operations through JavaScript within a browser's JavaScript engine. We will cover the use of JavaScript, and Three.js in particular, as a supplement or replacement for 3D visualization in Python. To do this without the burden of managing a full Apache installation, we can let Python serve the pages. This shell script in Linux will start a web server in the directory the script is run from:<br />
<br />
python -m CGIHTTPServer 8000 1>/dev/null 2>/dev/null &<br />
echo "Use localhost:8000"<br />
echo<br />
<br />
(With Python 3 the equivalent module is http.server, started with "python3 -m http.server --cgi 8000".) By using port 8000 the server is distinct from the one on port 80 used for web applications. The site would appear by putting <br />
<br />
http://localhost:8000<br />
<br />
in a Google Chrome or Mozilla Firefox browser window running on the same user account on the same machine. Note that the redirects of stdout and stderr to /dev/null keep output from appearing in the console. The server may be killed by identifying its process ID in Linux with the command<br />
<br />
ps -e | grep python<br />
<br />
followed by <br />
<br />
kill -s 9 pid<br />
<br />
where "pid" is the ID number found in the first line. Alternatively, if it is the only python process running you may kill it with<br />
<br />
killall python<br />
<br />
Any file in the directory tree below the starting directory is now accessible in the browser, and html files will be parsed to run the included javascript. If there is a cgi-bin directory at the top level, the server will see it and use it. One use of this low level server is to create a virtual instrument that is accessible from the web, but not exposed to it directly. A remote web server on the same network that can access port 8000 on the instrument machine can run code and get responses from the instrument by calling cgi-bin operations. <br />
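The same kind of server can also be started from inside Python with the standard http.server module, which is convenient for tests. A sketch using an ephemeral port (the helper name serve_directory is illustrative; port 0 lets the OS choose a free port instead of the fixed 8000 above):<br />

```python
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

def serve_directory(port=0):
    # Serve the current working directory in a background thread;
    # the chosen port is available as httpd.server_port
    httpd = HTTPServer(("127.0.0.1", port), SimpleHTTPRequestHandler)
    thread = threading.Thread(target=httpd.serve_forever, daemon=True)
    thread.start()
    return httpd

httpd = serve_directory()
print("Use localhost:%d" % httpd.server_port)
```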
<br />
For programmers, however, this utility allows development and debugging of web software without the need for a large server.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Image_processing_with_Python_and_SciPy&diff=2541Image processing with Python and SciPy2018-04-14T17:54:48Z<p>WikiSysop: </p>
<hr />
<div>Given that NumPy provides multidimensional arrays, and that there is core support through the Python Imaging Library and Matplotlib to display and manipulate images in the Python environment, it's easy to take the next step and combine these for scientific image processing. As part of our [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy short course on Python for Physics and Astronomy] we begin by exploring how Python handles image input and output through pillow, scikit-image, and pyfits. Once loaded, an image may be processed using library routines or by mathematical operations that take advantage of the speed and conciseness of numpy and scipy. Some of the resources mentioned here require Python >3.4, and at this time Python 3.6 is the current one.<br />
<br />
<br />
== Pillow - An Imaging Library ==<br />
<br />
The Python Imaging Library (PIL) was developed for Python 2.x and provided functions to manipulate images, including reading, modifying and saving in various standard image formats in a package called "PIL". With the coming of age of Python 3.x, a fork of the older version has evolved that is more suited for the new technologies and is in a package called "Pillow". It continues to improve, and the features described here are tested with "Pillow 5.1" and Python 3.6 as of April 2018. Pillow will probably be on any packaged distribution of Python 3, or it may be installed with (note the capital "P")<br />
<br />
pip install Pillow<br />
<br />
Pillow includes the basics of image processing, with functions that are documented by the developers in a [https://pillow.readthedocs.io/en/5.1.x/handbook/index.html handbook] describing the methods and giving some examples. Pillow uses the same "namespace" as PIL and older code should work, perhaps with a few modifications to allow for recent developments. Most important for us, Pillow has routines to read and write conventional image formats. Once an image has been read into a numpy array, the full power of Python is available to process it, and we can turn to Pillow again to save a processed image in png or jpg or another format. Flexible Image Transport System (FITS) files used for astronomy should be managed with astropy or pyfits.<br />
<br />
As a simple starting example, suppose you have an image that was taken with the camera turned so that "up" is to the side when the image is displayed. Here's how you would read it, rotate it 90 degrees, and write it out again using Pillow.<br />
<br />
import os<br />
import sys<br />
from PIL import Image as pil<br />
<br />
if len(sys.argv) == 2:<br />
    infilename = sys.argv[1]<br />
else:<br />
    sys.exit("Usage: png_image_rotate file.png ")<br />
<br />
myimage = pil.open(infilename)<br />
<br />
mirror = myimage.transpose(pil.ROTATE_90)<br />
outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_r90.png'<br />
mirror.save(outfilename)<br />
<br />
<br />
The first part of this is standard form to get the image name on the command line and make it available to the program. The PIL is imported with Image, and appears in the code as<br />
"pil". This is an amazingly short program, because in opening the image the library handles all the conversions in formatting and stores the image internally so that you refer to it only by the name assigned when it is loaded. We operate on the image with the transpose function, which has an argument that controls what it does. Here we rotate the image 90 degrees, and then save it to a file with a new name. The saving operation converts the internal data back to the file format set by the extension used in the file name.<br />
<br />
You can transpose an image left-right with<br />
<br />
mirror = myimage.transpose(pil.FLIP_LEFT_RIGHT)<br />
<br />
or do both in one step with<br />
<br />
mirror = myimage.transpose(pil.FLIP_LEFT_RIGHT).transpose(pil.ROTATE_90)<br />
<br />
Processing is not limited to "PNG" files, though that file type is preferred because it is not a lossy storage option. Python reads and writes "jpg" files too. While PIL provides some essential functionality, for more practical uses in astronomy we need to read Flexible Image Transport System or "FITS" files, and to enable numerical work on images in SciPy. Many of the processing functions you will find in the Python Imaging Library (PIL) are also available in SciPy, where we have precise mathematical control over their definitions and operation. Some more advanced techniques are available in SciPy too, courtesy of researchers who have contributed to SciKits, as we will see.<br />
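The size bookkeeping Pillow does during a transpose is easy to verify on a synthetic image built in memory rather than read from a file. A sketch (the getattr lookup is there only to tolerate both old module-level constants and the newer Image.Transpose enum):<br />

```python
from PIL import Image

# Use whichever constant namespace this Pillow provides
T = getattr(Image, "Transpose", Image)

# A 20x10 test image; ROTATE_90 swaps the axes, FLIP_LEFT_RIGHT does not
img = Image.new("RGB", (20, 10), color=(255, 0, 0))
rotated = img.transpose(T.ROTATE_90)
flipped = img.transpose(T.FLIP_LEFT_RIGHT)
print(img.size, rotated.size, flipped.size)
```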
<br />
Python's core routines that depend on matplotlib may be used to display an image, but these are designed for graphics and are limited by the constraints of the matplotlib interface. With a little effort there are better choices.<br />
<br />
<br />
== SciKit Image ==<br />
<br />
We've mentioned that [https://www.scipy.org/scikits.html SciKits] is a searchable index of highly specialized tools that are built on SciPy and NumPy. Among them, [http://scikit-image.org/ scikit-image] is for image processing in Python. It is oriented toward extracting physical information from images, and has routines for reading, writing, and modifying images that are powerful, and fast. Scikit-image is often compared to OpenCV, a collection of programs for computer vision that include live video. Both are actively maintained and in many ways complementary, but for physics and astronomy scikit-image is more powerful at this time.<br />
<br />
The scikit-image package is part of Anaconda and Enthought Python, and that would be a recommended platform for Windows. However for Linux and Mac OSX, <br />
<br />
pip install scikit-image<br />
<br />
should work. The package usually requires local compilation of the code when installed this way. Qt bindings are provided by PyQt5, PySide or by qtpy, depending on licensing and software requirements. PySide is less restrictive than PyQt and qtpy is an abstraction that draws on both. <br />
<br />
pip install PySide<br />
<br />
<br />
While SciPy has included an image reader and writer, as of April 2018 those functions are deprecated in the base code, and rather than use Pillow we can turn to scikit-image. The module to read and write images is skimage.io<br />
<br />
import skimage.io<br />
import numpy as np<br />
<br />
and the command<br />
<br />
skimage.io.find_available_plugins()<br />
<br />
will provide a dictionary of the libraries that may be used to read various file types. For example<br />
<br />
imlibs = skimage.io.find_available_plugins()<br />
<br />
and imlibs['pil'] will list the functions that the Python imaging library provides. The package tries the libraries in order until it finds one that works:<br />
<br />
myimage = skimage.io.imread(filename).astype( np.float32)<br />
<br />
will read an image and return a numpy array, which by default will be an RGB image if the file is a png file, for example. A greyscale image may be specified by including as_grey=True as an argument. A numpy image has a shape that for color has 3 values in each pixel<br />
<br />
print(myimage.shape)<br />
(498, 680, 3)<br />
<br />
as an example for an RGB image, or <br />
<br />
(498,680)<br />
<br />
for a gray scale image. Since numpy by default would store into a 64-bit float and matplotlib (the default display for skimage) requires 32-bit, we specify loading into a 32-bit array, planning ahead for displaying the result.<br />
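These shape conventions are easy to check on a synthetic array with numpy alone (the averaging used here to collapse color to a single plane is a simple illustration, not skimage's luminance weighting):<br />

```python
import numpy as np

# A fake 498x680 RGB image, then a single-plane version by averaging the colors
rgb = np.zeros((498, 680, 3), dtype=np.float32)
gray = rgb.mean(axis=2)
print(rgb.shape, gray.shape)
```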
<br />
<br />
Images may be saved:<br />
<br />
skimage.io.imsave(filename, nparray)<br />
<br />
and the file type is determined by file name extension. <br />
<br />
Images may be displayed, but it takes two steps<br />
<br />
skimage.io.imshow(myimage)<br />
skimage.io.show()<br />
<br />
when invoking the default matplotlib plugin. The display will look like one created by pyplot. There is a simpler viewer module too, without the pyplot toolbar.<br />
<br />
import skimage.viewer<br />
<br />
viewer = skimage.viewer.viewers.ImageViewer(myimage)<br />
viewer.show()<br />
<br />
<br />
We will use routines from the scikit-image package to identify stars in an image, adapting a program given by Eli Bressert in his book SciPy and NumPy to handle an image from one of our telescopes. The complete program is given in the Examples section below.<br />
<br />
It begins with the requisite imports including the ones from SciKit <br />
<br />
import os<br />
import sys<br />
import argparse<br />
import matplotlib.pyplot as mpl<br />
import numpy as np<br />
import astropy.io.fits as pyfits<br />
import skimage.morphology as morph<br />
import skimage.exposure as skie<br />
<br />
It opens a FITS file and loads only part of the image<br />
<br />
img = pyfits.getdata(infits)[1000:3000, 1000:3000]<br />
<br />
to ensure a square sample area and to limit the size of the search region. This is an alternative to reading a FITS image and also getting the header.<br />
<br />
With this data the program creates a new image using an arcsinh mapping that captures the full dynamic range effectively. It locates lower and upper bounds that should include only stars. The parameters would probably have to be refined to optimize the extraction of stars from the background.<br />
<br />
limg = np.arcsinh(img)<br />
limg = limg / limg.max()<br />
low = np.percentile(limg, 0.25)<br />
high = np.percentile(limg, 99.5)<br />
opt_img = skie.rescale_intensity(limg, in_range=(low,high))<br />
<br />
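The normalization and percentile steps above can be checked on synthetic data with numpy alone; in this sketch the intensity rescaling is done by hand rather than with skimage, but follows the same in_range idea:<br />

```python
import numpy as np

# Synthetic "image" spanning a large dynamic range
img = np.linspace(1.0, 10000.0, 10000).reshape(100, 100)
limg = np.arcsinh(img)
limg = limg / limg.max()
low = np.percentile(limg, 0.25)
high = np.percentile(limg, 99.5)
# Hand-rolled equivalent of rescale_intensity for the (low, high) window
opt = np.clip((limg - low) / (high - low), 0.0, 1.0)
```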
With the useful range determined, we create a new image that is scaled between the lower and upper limits, which will be used for displaying the star map. We search the arcsinh-stretched original image for local maxima and catalog those brighter than a threshold that is adjusted based on the image.<br />
<br />
lm = morph.is_local_maximum(limg)<br />
x1, y1 = np.where(lm.T == True)<br />
v = limg[(y1,x1)]<br />
lim = 0.7<br />
x2, y2 = x1[v > lim], y1[v > lim]<br />
<br />
The list x2,y2 has the stars that were found. The rest is image display code to draw circles around the stars and create an image that shows where they are.<br />
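The is_local_maximum step can be mimicked with plain numpy. This simplified stand-in (local_maxima_2d is a hypothetical helper, and unlike skimage it compares only the four nearest neighbors) shows the idea of masking peak pixels and recovering their coordinates with np.where:<br />

```python
import numpy as np

def local_maxima_2d(img):
    # True where an interior pixel strictly exceeds its four nearest neighbors
    core = img[1:-1, 1:-1]
    mask = ((core > img[:-2, 1:-1]) & (core > img[2:, 1:-1]) &
            (core > img[1:-1, :-2]) & (core > img[1:-1, 2:]))
    out = np.zeros(img.shape, dtype=bool)
    out[1:-1, 1:-1] = mask
    return out

field = np.zeros((9, 9))
field[4, 4] = 1.0          # a lone "star"
lm = local_maxima_2d(field)
y1, x1 = np.where(lm)
```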
<br />
<br />
[[File:m34_100s_log1000_rgb_20121104.png | center | 400px]]<br />
<br />
When the program processes a 100 second image of the open cluster M34, shown above, it identifies the stars in the right panel below.<br />
<br />
[[File:test_m34_stars.png | center | 600px]]<br />
<br />
== Images with NumPy and SciPy ==<br />
<br />
With scikit-image we can read jpg and png images directly, without using PIL. The images are stored in numpy arrays, and we have direct access to the data for uses other than visualization.<br />
<br />
import numpy as np<br />
import matplotlib.pyplot as plt<br />
from skimage.io import imread, imsave<br />
# from skimage.io import imshow is an alternative for display<br />
<br />
image_data = imread('test.jpg').astype(np.float32)<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
<br />
scaled_image_data = image_data / 255.<br />
<br />
# Save the modified image if you want to<br />
# imsave('test_out.png', scaled_image_data)<br />
<br />
plt.imshow(scaled_image_data)<br />
plt.show()<br />
<br />
exit()<br />
<br />
For our 512x512 color test image, this returns<br />
<br />
Size: 786432<br />
Shape: (512, 512, 3)<br />
<br />
because the image is 512x512 pixels and has 3 planes -- red, green, and blue. When SciPy reads a jpg or png image it will separate the colors for you. The "image" is a data cube. In the imread line we control the data type within the Python environment. Of course the initial data is typically 8 bits for color images from a cell phone camera, 16 bits for scientific images from a CCD, and perhaps 32 bits for processed images that require the additional dynamic range.<br />
<br />
We can display images with matplotlib.pyplot using [http://matplotlib.org/api/pyplot_api.html#matplotlib.pyplot.imshow imshow()] --<br />
<br />
imshow(X, cmap=None, norm=None, aspect=None, interpolation=None,<br />
alpha=None, vmin=None, vmax=None, origin=None, extent=None, **kwargs)<br />
<br />
where "X" is the image array. If X is 3-dimensional, imshow will display a color image.<br />
Matplotlib has a [http://matplotlib.org/users/image_tutorial.html tutorial] on how to manage images. Here, we linearly scale the image data because for floating point imshow requires values between 0. and 1., and we know beforehand that the image is 8-bits with maximum values of 255. <br />
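A small helper makes that scaling safe for arbitrary input ranges (normalize_for_imshow is an illustrative name; matplotlib can also do this for you through the vmin and vmax arguments):<br />

```python
import numpy as np

def normalize_for_imshow(image):
    # Map any numeric array linearly onto [0., 1.] for float display
    image = image.astype(np.float64)
    lo, hi = image.min(), image.max()
    if hi == lo:
        return np.zeros_like(image)
    return (image - lo) / (hi - lo)

scaled = normalize_for_imshow(np.array([[0, 128], [255, 64]]))
```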
<br />
Here's what a single test image displayed from this program looks like in Python.<br />
<br />
<br />
[[File:Single_image.png | center | 600px]]<br />
<br />
<br />
We can slice the image data to see each color plane by creating new arrays indexed from the original one. <br />
<br />
import numpy as np<br />
from skimage.io import imread, imsave<br />
import matplotlib.pyplot as plt<br />
<br />
image_data = imread('test.jpg').astype(np.float32)<br />
scaled_image_data = image_data / 255.<br />
<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
<br />
image_slice_red = scaled_image_data[:,:,0]<br />
image_slice_green = scaled_image_data[:,:,1]<br />
image_slice_blue = scaled_image_data[:,:,2]<br />
<br />
print('Size: ', image_slice_red.size)<br />
print('Shape: ', image_slice_red.shape)<br />
<br />
plt.subplot(221)<br />
plt.imshow(image_slice_red, cmap=plt.cm.Reds_r)<br />
<br />
plt.subplot(222)<br />
plt.imshow(image_slice_green, cmap=plt.cm.Greens_r)<br />
<br />
plt.subplot(223)<br />
plt.imshow(image_slice_blue, cmap=plt.cm.Blues_r)<br />
<br />
plt.subplot(224)<br />
plt.imshow(scaled_image_data)<br />
<br />
plt.show()<br />
<br />
<br />
For a colorful image you will see the differences between each slice --<br />
<br />
[[File:single_image_slice.png | center | 600px ]]<br />
<br />
<br />
This approach offers a template for displaying multidimensional computed or experimental data as an image created with Python. Consider this short program that creates and displays an image with Gaussian noise:<br />
<br />
# Import the packages you need<br />
import numpy as np<br />
import matplotlib.pyplot as plt<br />
from skimage.io import imsave<br />
<br />
# Create the image data<br />
image_data = np.zeros(512*512, dtype=np.float32).reshape(512,512)<br />
random_data = np.random.randn(512,512)<br />
image_data = image_data + 100.*random_data<br />
<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
scaled_image_data = image_data / 255.<br />
<br />
# Save and display the image<br />
imsave('noise.png', scaled_image_data)<br />
plt.imshow(scaled_image_data, cmap='gray')<br />
plt.show()<br />
<br />
exit()<br />
<br />
<br />
Instead of random noise, you could create any functional data you wanted to display. For images with color, the NumPy array would have red, green, and blue planes.<br />
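As a minimal numpy-only sketch of that idea (the gradient pattern here is just an illustration), a color image is a 3-d array whose third axis holds the red, green, and blue planes:<br />

```python
import numpy as np

# A color image is a 3-d array: rows x columns x (r, g, b).
# Build a 256x256 RGB test pattern with float values in 0..1,
# the range imshow expects for floating-point color data.
ny, nx = 256, 256
rgb = np.zeros((ny, nx, 3), dtype=np.float32)
rgb[:, :, 0] = np.linspace(0., 1., nx)            # red ramps left to right
rgb[:, :, 1] = np.linspace(0., 1., ny)[:, None]   # green ramps top to bottom
rgb[:, :, 2] = 0.5                                # constant blue everywhere

print('Shape: ', rgb.shape)
# To display it, use pylab as in the examples above:
#   plt.imshow(rgb); plt.show()
```

Displayed with imshow, this shows red increasing left to right and green increasing top to bottom, exactly the planes the slicing example above would separate.<br />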
<br />
<br />
== Astronomical FITS Files == <br />
<br />
Astronomical image data are potentially complex and rich, and quantitative structures have been developed to standardize lossless storage of the data along with the metadata that describe its origin and previous processing. While photographic images are often only 8 bits deep, mixed with red, green and blue in a single image, and compressed to reduce file size, astronomical images are 16 or 32 bits deep in a single color. Until recently, the file sizes needed for astronomy were unrivaled by commodity cameras, but in today's market of megapixel imaging on cell phones, the camera in your pocket also produces very large, rich images. The difference in handling them is that for science we want to preserve the data without loss, quantitatively calibrate and measure the flux from the source, and map that back to a specific angle on the sky, while for art, or even for some academic uses, it is the beauty, color, and highlights shown in the image display that are important. Commodity images are usually saved in compressed formats such as JPG, in uncompressed TIFF, or in proprietary binary formats. For astronomy and other quantitative imaging work, the [https://fits.gsfc.nasa.gov/fits_primer.html Flexible Image Transport System (or FITS)] format is almost universal. It includes the image data and a header describing the data. FITS files may also be tables of data, or a cube of images in sequence. The standards for creating these files are slowly evolving as the needs of big data in astronomy have grown. Programmers and scientists at NASA, the Space Telescope Science Institute, and the academic community at large contribute to libraries that enable reading, processing, and saving FITS files in Python, as well as in C and Fortran. <br />
<br />
Once a FITS file has been read, the header is accessible as a Python dictionary of the data contents, and the image data are in a NumPy array. With Python, using NumPy and SciPy, you can read, extract information, modify, display, create, and save image data. <br />
<br />
<br />
=== Reading and Writing a FITS File in Python ===<br />
<br />
There are many image display tools for astronomy, and perhaps the most widely used is [http://hea-www.harvard.edu/RD/ds9/site/Home.html ds9], which is available for Linux, MacOS, and Windows, as well as in source code. It is always useful to have a version on your computer when you are working with FITS images. AstroImageJ, a more versatile Java platform for astronomical image viewing that also does processing, is now widely used for precision astronomical photometry where interactive analysis is needed. AstroImageJ is free and simple to install on most computers, and because it is also a powerful processor, for many purposes it is an all-in-one tool. If you are working with FITS images, it is highly recommended: <br />
<br />
[http://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ] website and program source<br />
<br />
[http://iopscience.iop.org/article/10.3847/1538-3881/153/2/77 Astronomical Journal] article describing AstroImageJ<br />
<br />
<br />
However, sometimes we want to perform specialized work on image data, and to view it while processing it in Python. This routine demonstrates how to read a FITS file, inspect its header, and show the image on the computer display. It depends on the "PyFITS" library developed at the Space Telescope Science Institute and incorporated into a larger package by the [http://www.astropy.org/ AstroPy Project]. AstroPy's library is part of the Enthought and Canopy distributions of Python. If you are using a system version of Python, you may need to install it with pip<br />
<br />
pip install astropy<br />
<br />
should do it. The package has several dependencies: on Python (it requires version 3.5 or higher at this time, April 2018), and on NumPy and SciPy. It is also frequently updated, and while stable it may push the "cutting edge" of distributions from conservative operating system distributors like OpenSuse. That's a good reason, if you need this capability, to have a version of Python built with current sources, or to use a complete distribution such as Anaconda or Enthought.<br />
<br />
A program to work with FITS files would begin by importing the packages that it needs<br />
<br />
<br />
import os<br />
import sys<br />
import argparse<br />
import numpy as np<br />
import astropy.io.fits as pyfits<br />
import matplotlib <br />
import matplotlib.pyplot<br />
<br />
Importing the FITS modules in this way makes the code backward compatible with the earlier versions of PyFITS. Also, since sometimes submodules are not loaded with the larger package, we explicitly ask for the io.fits components, and for pyplot.<br />
<br />
# Define a function for making a linear gray scale<br />
def lingray(x, a=None, b=None):<br />
"""<br />
Auxiliary function that specifies the linear gray scale.<br />
a and b are the cutoffs : if not specified, min and max are used<br />
"""<br />
if a is None:<br />
a = np.min(x)<br />
if b is None:<br />
b = np.max(x)<br />
return 255.0 * (x-float(a))/(b-a)<br />
<br />
# Define a function for making a logarithmic gray scale<br />
def loggray(x, a=None, b=None):<br />
"""<br />
Auxiliary function that specifies the logarithmic gray scale.<br />
a and b are the cutoffs : if not specified, min and max are used<br />
"""<br />
if a is None:<br />
a = np.min(x)<br />
if b is None:<br />
b = np.max(x)<br />
linval = 10.0 + 990.0 * (x-float(a))/(b-a)<br />
return (np.log10(linval)-1.0)*0.5 * 255.0<br />
<br />
These functions may be used to rescale the data for display. Scaling can be done with a colormap, or by modifying the data before displaying. Here, we modify the data and save it in a temporary buffer for display. In a more elaborate program with a full user interface, this scaling could be done interactively.<br />
<br />
# Provide information to the argparse routine if we need it<br />
parser= argparse.ArgumentParser(description = 'Display a fits image')<br />
<br />
The argparse function offers more flexibility than the system routines, though here we only include this for future additions. A simpler method is to parse the command line itself using the system utilities:<br />
<br />
# Test for command line arguments<br />
if len(sys.argv) == 2:<br />
infits = sys.argv[1]<br />
else:<br />
sys.exit("Usage: display_fits infile.fits ")<br />
<br />
# Open the fits file and create an hdulist<br />
inhdulist = pyfits.open(infits) <br />
<br />
# Assign the input header in case it is needed later<br />
inhdr = inhdulist[0].header<br />
<br />
# Assign image data to a numpy array<br />
image_data = inhdulist[0].data<br />
<br />
The header and data are now available. We'll look at header information later. For now, all we need are the values in the numpy data array. It will be indexed from [0,0] at the upper left of the data space, which would be the upper left of the displayed image.<br />
<br />
# Print information about the image<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
<br />
For this example we use linear scaling.<br />
<br />
# Show the image<br />
new_image_data = lingray(image_data)<br />
new_image_min = 0.<br />
new_image_max = np.max(new_image_data)<br />
matplotlib.pyplot.imshow(new_image_data, vmin = new_image_min, vmax = new_image_max, cmap ='gray') <br />
matplotlib.pyplot.show() <br />
<br />
# Close the input image file and exit<br />
inhdulist.close()<br />
exit()<br />
<br />
It would be a straightforward exercise to add an interactive scale control and to read out the value of each pixel in this program. With that, it has the basic functionality of ds9 or the AstroImageJ viewer.<br />
<br />
<br />
=== Correcting and Combining Images ===<br />
<br />
There are processing operations done on all "raw" astronomical images taken with [http://prancer.physics.louisville.edu/astrowiki/index.php/Use_a_CCD_Camera charge coupled device cameras]. Since the voltage that is digitized is proportional to the number of photons that arrived at each pixel during the exposure, the data for that pixel should be proportional to the photon count, that is, to the irradiance in photons/area-time, times the area of the pixel, times the exposure time. An absolute conversion to the flux from the sources that are imaged requires correcting for<br />
<br />
*Signal at each pixel with no light present -- the "dark" image<br />
*Signal at each pixel for the same irradiance/pixel -- the "flat" field<br />
*Non-linear responses<br />
*Absolute calibration to an energy or photon flux based on spectral response<br />
<br />
We can do all of these in NumPy using its built-in array math. By taking an image with no light and subtracting it from an image, we have corrected for the dark response (as well as for electronic "bias"). Or, if needed, we can scale a dark image taken at a different exposure time from the image we are measuring, and then subtract that. We can divide an image by a reference flat to correct for pixel-to-pixel variations in response, for vignetting in an optical system (the non-linear fall-off of transmission across the field of view). We can scale images non-linearly to correct for non-linear amplifier responses and saturation or charge loss from each pixel.<br />
<br />
For dark subtraction we take the difference of two images:<br />
<br />
dark_corrected_image = raw_image - reference_dark_image<br />
<br />
and for flat field correction we divide<br />
<br />
final_image = dark_corrected_image / reference_flat_image<br />
<br />
In NumPy there is no array indexing needed, and the operations are one-liners. Similarly, a non-linear second order correction or a scaling to physical units may be done on the entire array with<br />
<br />
corrected_image = a * (final_image) + b * (final_image**2) <br />
<br />
The reference dark and flat images must be obtained beforehand. For example, a reference dark image may be a median average of many images taken with the same exposure time as the science image, but with the shutter closed. To perform the median operation on the arrays rather than sequentially on the elements, we stack all of the original individual dark images to make a 3-d stack of 2-d arrays. Using numpy arrays we would have<br />
<br />
dark_stack = np.array([dark_1, dark_2, dark_3])<br />
<br />
where dark_1, dark_2, and dark_3 are the original dark images. We need at least 3, and an odd number works best, in the dark stack. If the images are n rows of m columns, and if we have k images in the stack, the stack will have a shape (k,n,m): k images, each of n rows, each of m columns. A median on the first axis of this stack returns the median value for each pixel in the stack --<br />
<br />
dark_median = np.median(dark_stack, axis=0)<br />
<br />
and has a shape that is (n,m) with the elements that we wanted. <br />
<br />
Median operations on an image stack remove random noise more effectively than averaging, because one source of noise in CCD images is cosmic ray events that produce an occasional large signal at a pixel. If we mean-averaged with an outlier in one pixel, the result would be too large because of the one singular event in the stack, but by median-averaging we discard that event without adversely affecting the new reference frame.<br />
<br />
A median of a stack of flat frames, all normalized so that they should be identical, will remove stars from the reference image as long as each contributing flat image in the stack is taken of a different star field. Typically, these "sky flats" are images taken at twilight, processed to remove the dark signal, normalized to unity, and then median averaged to remove stars and reduce random noise. <br />
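Putting those steps together, here is a sketch of building a master sky flat from synthetic data (all array names and values are hypothetical stand-ins for real dark-subtracted twilight frames):<br />

```python
import numpy as np

rng = np.random.default_rng(0)

# Three synthetic, dark-subtracted twilight flats: a uniform response
# with noise, plus one "star" in a different place in each frame
flats = []
for i in range(3):
    flat = np.full((64, 64), 1000.)
    flat += rng.normal(0., 5., (64, 64))   # photon and readout noise
    flat[10 + 15*i, 20] += 5000.           # a star, moved between frames
    flats.append(flat / np.mean(flat))     # normalize to unity mean

# Median through the stack rejects the stars and keeps the response
flat_stack = np.array(flats)
master_flat = np.median(flat_stack, axis=0)

print('Stack shape: ', flat_stack.shape)
print('Master flat mean: ', np.mean(master_flat))
```

The star pixel survives in only one member of the stack, so the median discards it; the master flat comes out smooth with a mean very close to unity.<br />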
<br />
Fortunately, normalizing an image is very simple because<br />
<br />
image_mean = np.mean(image_data)<br />
<br />
returns the mean value of all elements in the array. Divide an image by its mean to create a normalized image with unity mean<br />
<br />
normalized_image = image_data / image_mean<br />
<br />
<br />
As long as the images have the same size you can sum them by simple addition<br />
<br />
sum_image = image1 + image2 + image3 ...<br />
<br />
We would do this to create a final image that is effectively one long exposure, the sum of all the contributing image exposure times. Because of guiding errors, cosmic rays, and weather, one very long exposure is often not possible, but 10's or 100's of shorter exposures can be "co-added" after selecting the best ones and aligning them so that each pixel in the contributing image corresponds to the same place in the sky.<br />
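As a sketch with synthetic frames (a hypothetical faint source on a noisy background; real data would first be selected and aligned), co-addition is one line of NumPy:<br />

```python
import numpy as np

rng = np.random.default_rng(1)

# Ten short, already-aligned exposures of the same synthetic field:
# background of 100 counts with noise, and a faint 5-count source
exposures = [rng.normal(100., 10., (32, 32)) for _ in range(10)]
for frame in exposures:
    frame[16, 16] += 5.                  # the faint source in each frame

# Co-add the stack: effectively one long exposure
coadd = np.sum(np.array(exposures), axis=0)

print('Background level: ', np.median(coadd))
print('Source pixel:     ', coadd[16, 16])
```

The source signal grows with the number of frames while the random background noise grows only as its square root, which is why co-adding many short exposures works.<br />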
<br />
<br />
=== Masked Image Operations ===<br />
<br />
A [http://docs.scipy.org/doc/numpy/reference/maskedarray.generic.html Numpy array mask] is a boolean array that determines whether or not an operation is to be performed. If you have an image in an array, the mask allows you to work on only part of the image, ignoring the other part. This is useful for finding the mean of a selected region, or for computing a function that fits part of an image but ignores another part. For example, consider an array<br />
<br />
x = np.arange(4*4).reshape(4,4)<br />
<br />
which is a simple 4x4 image with values from 0 to 15:<br />
<br />
x<br />
array([[ 0, 1, 2, 3],<br />
[ 4, 5, 6, 7],<br />
[ 8, 9, 10, 11],<br />
[12, 13, 14, 15]])<br />
<br />
Make a copy of this image as a boolean mask<br />
<br />
xmask = np.ma.make_mask(x, copy=True, shrink=True, dtype=bool)<br />
<br />
and it will have all True values except for the first entry, 0, which is interpreted as False:<br />
<br />
xmask<br />
array([[False, True, True, True],<br />
[ True, True, True, True],<br />
[ True, True, True, True],<br />
[ True, True, True, True]], dtype=bool)<br />
<br />
You can set all values (or individual ones) to the state you need<br />
<br />
xmask[:,:] = True<br />
<br />
will make them all True, or <br />
<br />
xmask[0,:] = False<br />
xmask[3,:] = False<br />
xmask[:,0] = False<br />
xmask[:,3] = False<br />
<br />
will set the values around the perimeter False and leave the others True<br />
<br />
xmask<br />
array([[False, False, False, False],<br />
[False, True, True, False],<br />
[False, True, True, False],<br />
[False, False, False, False]], dtype=bool)<br />
<br />
<br />
We apply the mask to the original data<br />
<br />
mx = np.ma.masked_array(x, mask=xmask)<br />
print(mx)<br />
<br />
[[0 1 2 3]<br />
[4 -- -- 7]<br />
[8 -- -- 11]<br />
[12 13 14 15]]<br />
<br />
and see that the values that are masked by "True" are not included in the new masked array. If we want the sum of the elements in the masked array, then <br />
<br />
np.sum(mx)<br />
90<br />
<br />
while<br />
<br />
np.sum(x)<br />
120<br />
<br />
There are many ways to create a mask, and to operate on masked arrays, described in the [http://docs.scipy.org/doc/numpy/reference/maskedarray.generic.html Numpy documentation].<br />
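One of those ways, np.ma.masked_where, builds the mask from a condition on the data itself. A small sketch (the saturation threshold here is hypothetical):<br />

```python
import numpy as np

# Ignore saturated pixels above a threshold when taking statistics
image = np.array([[10., 20., 30.],
                  [40., 65000., 60.],
                  [70., 80., 90.]])

clean = np.ma.masked_where(image > 60000., image)

print('Mean of good pixels: ', np.ma.mean(clean))
print('Good pixel count: ', np.ma.count(clean))
```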
<br />
<br />
=== FITS Headers ===<br />
<br />
FITS files contain a "header" that tells us what is in the file, and then data in a format that is defined by the header. As we have seen, when a FITS file is read in NumPy with PyFITS, the data and the header are separate entities. Below is the complete header from an image taken with one of our telescopes.<br />
<br />
<br />
<br />
<br />
We will open an image ''v1701_00146_i.fits'' in interactive Python and look at the header.<br />
<br />
import numpy as np<br />
import astropy.io.fits as pyfits<br />
infits='v1701_00146_i.fits'<br />
inhdulist = pyfits.open(infits)<br />
inhdr = inhdulist[0].header<br />
inhdr<br />
<br />
<br />
SIMPLE = T / file does conform to FITS standard<br />
BITPIX = 16 / number of bits per data pixel<br />
NAXIS = 2 / number of data axes<br />
NAXIS1 = 4096 / length of data axis 1<br />
NAXIS2 = 4096 / length of data axis 2<br />
EXTEND = T / FITS dataset may contain extensions<br />
COMMENT FITS (Flexible Image Transport System) format is defined in 'Astronomy<br />
COMMENT and Astrophysics', volume 376, page 359; bibcode: 2001A&A...376..359H<br />
BZERO = 32768 / offset data range to that of unsigned short<br />
BSCALE = 1 / default scaling factor<br />
EXPTIME = 100. / exposure time (seconds)<br />
DATE-OBS= '2013-01-19T04:18:09.140' / date of observation (UT)<br />
IMAGETYP= 'Light Frame' / image type<br />
TARGET = 'V1701 ' / target<br />
INSTRUME= 'griz ' / instrument<br />
CCD-TEMP= -9.921 / temperature (C)<br />
FILTER = ' (3) i (700-825)' / filter<br />
TELESCOP= 'CDK20N ' / telescope<br />
DATE = '2013-01-19T04:20:08' / file creation date (YYYY-MM-DDThh:mm:ss UT)<br />
<br />
The first entries tell us it is a simple image file, 4096x4096 pixels (16 megapixels) written with 16 integer data bits per pixel. The other entries provide information about the image data. Therefore in dealing with FITS data we may need to change the first entries if the file is modified, and append new entries that annotate what has been done, or add to the archival notes.<br />
<br />
The header in pyfits behaves like a Python dictionary indexed by keyword. If you want to know the exposure time, ask<br />
<br />
inhdr['EXPTIME']<br />
<br />
and Python responds<br />
<br />
100.<br />
<br />
or <br />
<br />
inhdr['DATE-OBS']<br />
<br />
gets<br />
<br />
'2013-01-19T04:18:09.140'<br />
<br />
This means that you can sort through the contents of headers in a Python program to find the exposures you need, identify the filters used, and see what processing has been done. Most of the KEYWORDS shown above are standard, and those that are not can be easily added to specialized Python code.<br />
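A sketch of that idea: build a couple of small FITS files (stand-ins for a night's data; the file names and keyword values here are hypothetical) and select exposures by their header keywords:<br />

```python
import glob
import os
import tempfile

import numpy as np
import astropy.io.fits as pyfits

# Create two small FITS files to stand in for real data
tmpdir = tempfile.mkdtemp()
for name, exptime, filt in (('a.fits', 100., 'i'), ('b.fits', 30., 'g')):
    hdu = pyfits.PrimaryHDU(np.zeros((4, 4), dtype=np.float32))
    hdu.header['EXPTIME'] = exptime
    hdu.header['FILTER'] = filt
    hdu.writeto(os.path.join(tmpdir, name))

# Scan the headers and keep only the long i-band exposures
selected = []
for path in sorted(glob.glob(os.path.join(tmpdir, '*.fits'))):
    hdr = pyfits.getheader(path)
    if hdr['FILTER'] == 'i' and hdr['EXPTIME'] >= 60.:
        selected.append(os.path.basename(path))

print(selected)
```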
<br />
When a new FITS image is written with pyfits, it contains only the bare necessities in the header -- the data type, some reference values for zero and scaling if needed, and the size of the array. However, you can copy items from the header of an image into images you create, and annotate the headers of your work to maintain a record of what has been done.<br />
<br />
An output FITS image file is created in steps from a NumPy data array outimage with<br />
<br />
outhdu = pyfits.PrimaryHDU(outimage)<br />
<br />
which encapsulates an image in an "HDU" object. The line<br />
<br />
outhdulist = pyfits.HDUList([outhdu])<br />
<br />
creates a list that contains the primary HDU which will have a default header<br />
<br />
outhdr = outhdulist[0].header<br />
<br />
Now you can append to this header or modify it<br />
<br />
history = 'This is what I did to the file.'<br />
outhdr.append(('HISTORY',history))<br />
more_history = 'I did this today.'<br />
outhdr.append(('HISTORY',more_history))<br />
<br />
since the header behaves like a standard Python dictionary.<br />
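Putting the steps above together in one sketch (the keyword value copied here is hypothetical):<br />

```python
import numpy as np
import astropy.io.fits as pyfits

# Encapsulate a result array in an HDU with a default header
outimage = np.zeros((16, 16), dtype=np.float32)
outhdu = pyfits.PrimaryHDU(outimage)
outhdr = outhdu.header

# Copy a keyword from the input header and annotate the processing
outhdr['EXPTIME'] = (100., 'exposure time (seconds)')
outhdr.append(('HISTORY', 'Dark subtracted'))
outhdr.append(('HISTORY', 'Flat fielded'))

print(outhdr['EXPTIME'])
print(list(outhdr['HISTORY']))
# pyfits.HDUList([outhdu]).writeto('result.fits') would save it
```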
<br />
<br />
For astronomical images, headers may also include information that maps the image to the sky, a World Coordinate System or WCS header within the FITS header. These keywords apply only if the spatial mapping of the image is unchanged, and in processing that shifts or distorts images the WCS header should not be copied. Setting the WCS header and interpreting it to obtain the celestial coordinates requires -- what else -- PyWCS, also developed at the Space Telescope Science Institute and now included in AstroPy. <br />
<br />
For example, to access the world coordinate system in a fits file, we would import the module<br />
<br />
from astropy.wcs import WCS<br />
<br />
to have the same namespace as the original PyWCS and library functions for conversion to and from celestial coordinates and pixel coordinates in the image. There are many examples of this in a package of utilities we have developed here<br />
<br />
[http://www.astro.louisville.edu/software/alsvid/index.html Alsvid] Algorithms for Visualization and Processing of Image Data<br />
<br />
<br />
== Other Processing ==<br />
<br />
We have seen that there are many useful basic operations for image processing available simply through NumPy and PyFITS. SciPy adds several others in the [http://docs.scipy.org/doc/scipy/reference/ndimage.html ndimage] package. The functions include image convolution, various averaging or filtering algorithms, Fourier processing, image interpolation, and image rotation.<br />
<br />
<br />
Since images are stored as arrays, there are some simple one-line ways to modify them. Flipping an image top to bottom or left to right is done with<br />
<br />
import numpy as np<br />
flipped_ud = np.flipud(image_data)<br />
flipped_lr = np.fliplr(image_data)<br />
<br />
While we also have to add lines to read the file, update the header, and write it out again, the program to perform these operations is remarkably short. A template program is given in the examples below.<br />
<br />
Rotating an image by 90 degrees is also easy<br />
<br />
rotated_image_data = np.rot90(image_data)<br />
<br />
Rotated and flipped images may be saved either as a FITS image (using PyFITS) or as PNG or JPG images using <br />
<br />
from scipy.misc import imsave<br />
imsave(filename,rotated_image_data)<br />
<br />
Image rotation by other angles is somewhat more complex, since it requires transforming an image in a way that does not preserve its shape, or even the number of elements. <br />
<br />
There is a simple one-line command that does all this for you<br />
<br />
from scipy.misc import imrotate<br />
angle = 22.5<br />
new_image = imrotate(image_data, angle, interp='bilinear')<br />
<br />
The angle is in degrees. The ''interp'' string determines how the interpolation will be done, with self-explanatory options<br />
<br />
*'nearest' <br />
*'bilinear' <br />
*'cubic' <br />
*'bicubic'<br />
<br />
Finally, for spectroscopy the useful data in an image may be a sum over a region. In that case we could index through the array and explicitly create the sum if needed, but in the ideal case of a spectrum we may want only the sum along a column for each element of a row. For that, the function is<br />
<br />
spectrum = np.sum(image, axis=0)<br />
<br />
which returns a numpy array that is the sum along the specified axis. If there are regions in the image that should not be included in the sum, then the image could be masked before computing the sum. The result is a 1-d array in which each element is the signal at a different wavelength. <br />
<br />
Sometimes the spectral lines are not along a column but are still straight (if curved, then we need other ways to mask the array), in which case scipy.misc.imrotate can be used before the sum to align the spectrum with a row or column.<br />
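Here is a sketch of that extraction on a synthetic spectrum image (the illuminated rows and the line position are hypothetical): mask the rows outside the spectrum, then collapse the columns.<br />

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic spectrum image: wavelength runs along axis 1, and the
# spectrum illuminates rows 10-19 with an emission line at column 40
image = rng.normal(5., 1., (32, 128))
image[10:20, :] += 50.
image[10:20, 40] += 500.

# Mask everything outside the illuminated rows
mask = np.ones_like(image, dtype=bool)
mask[10:20, :] = False
masked = np.ma.masked_array(image, mask=mask)

# Sum down the columns: one value per wavelength element
spectrum = masked.sum(axis=0)

print('Spectrum length: ', spectrum.shape[0])
print('Line peak at column ', int(np.argmax(spectrum)))
```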
<br />
<br />
<br />
<br />
== AstroImageJ and Alsvid ==<br />
<br />
We have developed a collection of Python routines to do many of the routine astronomical image processing tasks such as dark subtraction, flat fielding, co-addition, and FITS header management through PyFITS and PyWCS. The current version of the "Alsvid" package is available for download:<br />
<br />
<br />
[http://www.astro.louisville.edu/software/alsvid/index.html Alsvid]<br />
<br />
<br />
<br />
== Examples ==<br />
<br />
For examples of Python illustrating image processing, see the [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples examples] section.<br />
<br />
<br />
<br />
<br />
Alsvid is intended as a command line supplement to the powerful Java program [http://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ] which provides real-time interactivity with astronomical image processing and precision photometry. AstroImageJ is built on the original [https://imagej.nih.gov/ij/ ImageJ], an image processing program developed at the National Institutes of Health and now maintained as a public-domain, open-source resource. As such, this core component of AIJ offers many specialized tools for image analysis in the biological sciences which are equally useful in physics and astronomy. AstroImageJ alongside versatile Python desktop processing is a powerful combination for astronomical image analysis.<br />
<br />
<br />
== Assignments ==<br />
<br />
For the assigned homework to use these ideas, see the [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments assignments] section.</div>WikiSysophttps://prancer.physics.louisville.edu/astrowiki/index.php?title=Very_simple_Python&diff=2540Very simple Python2018-04-14T17:36:49Z<p>WikiSysop: </p>
<hr />
<div>In this section of our [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy short course on Python for Physics and Astronomy] we take a short path to using Python easily.<br />
<br />
<br />
<br />
== Installing Python on your computer ==<br />
<br />
Python is open source software available for free from [http://www.python.org/ www.python.org]. Version 2.7 is the aging mature version that is widely supported by other add-on modules. Python 3 is more recent, not fully backward compatible with 2.7, and is now widely used with packages for specific disciplines. New installations should be Python 3, but there's not much loss of functionality with the older Python 2.7 if you already have it. We will use Python 3 for the examples, though some of the earlier files here may still require small changes to run under 3. <br />
<br />
<br />
'''Linux'''<br />
<br />
Python will already be installed on your computer. Typically the operating system may use 2.7 for some of its core applications, and provide a basic 3.4, 3.5 or 3.6 for newer work. You may use your package manager to update and add to the base installations, but note the distinction between Python2 and Python3 which may co-exist. Check that the pip you are using on the command line is the one to add to the version you want to use. For example, look at <br />
<br />
ls -l /usr/bin/python*<br />
ls -l /usr/bin/pip*<br />
<br />
to see what may already be there and what will run with the default "python" command. A trick used by Linux systems is to have a directory /etc/alternatives that contains soft links, and there you may find links such as<br />
<br />
/etc/alternatives/pip -> /usr/bin/pip3.4<br />
<br />
to tell you how "pip" will run. There may be conflicts between the operating system's requirements and what you would want for your work, but with care they can be managed and you will be in control of your own destiny. If you prefer to let someone else drive, choose a Python distribution such as the popular [https://www.anaconda.com/distribution/ Anaconda] and follow the directions on their website. Keep in mind the potential conflicts with the already-installed python on your computer.<br />
<br />
<br />
<br />
'''Linux with Python from its source'''<br />
<br />
This method is for those who are comfortable with system management and want to maintain full control over their Python and its packages. It enables you to have the very latest Python, and it also minimizes the installation footprint on disk space, while adding the challenges of resolving conflicts between dependencies yourself, and some potentially vexing conflicts with the operating system. It is my favorite method.<br />
<br />
<br />
# Download the source tar file, currently Python-3.6.4.tgz, and as superuser or root copy it to /usr/local/src<br />
# Untar the file and assign ownership of the new directory tree to yourself as an unprivileged user<br />
# As a normal user, cd into the source directory and run ./configure <br />
# The defaults will be fine. Your new Python will go into the /usr/local/ directory. Some users prefer /opt, which can be changed as a configuration option.<br />
# make<br />
# make test<br />
# Now as root user --<br />
# make altinstall<br />
# ln -s /usr/local/lib64/python3.6/lib-dynload/ /usr/local/lib/python3.6/lib-dynload<br />
<br />
<br />
The altinstall option is necessary to avoid overwriting or interfering with the system python. The soft link is needed because some library files in lib64 are not found without it. It is not necessary to assign either PYTHONHOME or PYTHONPATH, or to use an environment manager, to have this version work independently of the system version. However, be aware that the functions you need are explicitly in /usr/local/bin and that they refer to python by its version, that is ''python3.6'' and ''pip3.6''. Therefore if you later update the OS and it also has these executables, there's a potential conflict that would be resolved by the search path and could be ambiguous.<br />
<br />
Similarly, if you install Anaconda Python, it will have its own /opt directory tree to navigate, while Canopy Python may use environment variables. To run your own locally built Python, ''echo $PYTHONHOME'' and ''echo $PYTHONPATH'' should return empty strings.<br />
<br />
<br />
'''Linux adding modules by pip'''<br />
<br />
<br />
For installing in the system python: if you need to update the complex matplotlib package for Python 3 because it lacks parts you need, it must be removed first:<br />
<br />
pip uninstall matplotlib<br />
<br />
Then re-install it and specify not to use the saved source if any.<br />
<br />
pip install matplotlib --upgrade --no-cache-dir<br />
<br />
Also for a system python version you may need to do this <br />
<br />
pip uninstall six<br />
<br />
pip install six --upgrade --no-cache-dir<br />
<br />
<br />
Now if you are building a Python for science, use the specific pip for it and add the modules you need. This may include several that were installed on the system using yast, as well as the matplotlib ones and these. Start with these, since pip will resolve dependencies, probably use cached source unless you tell it not to, and in the process grow the missing branches of your Python tree. Later, if you find something missing, you can add it as needed.<br />
<br />
<br />
Install numpy (pip install numpy)<br />
<br />
Install scipy (pip install scipy)<br />
<br />
Install astropy (pip install astropy) for essential astronomy utilities<br />
<br />
Install scikit-image (pip install scikit-image) for image processing<br />
<br />
Install ginga (pip install ginga) for FITS viewer and core modules<br />
<br />
Install pyastronomy (pip install pyastronomy) or from source on github [https://github.com/sczesla/PyAstronomy pyastronomy]<br />
<br />
Install pyephem (pip install pyephem) for astronomical ephemerides<br />
<br />
Install healpy (pip install healpy) for astronomical image processing<br />
<br />
Install reproject (pip install reproject) for image reprojection when doing fits conversion<br />
<br />
Install quantities (pip install quantities) to have physical constants<br />
<br />
Install emcee (pip install emcee) to have an MCMC library <br />
<br />
Lastly, install the software chain for data visualization with Python using pip rather than the system package because Pandas is developing rapidly<br />
<br />
Install pandas (pip install pandas)<br />
<br />
Install scrapy (pip install scrapy)<br />
<br />
Install requests (pip install requests)<br />
<br />
<br />
<br />
'''Windows'''<br />
<br />
For Windows there are several choices.<br />
<br />
* [https://www.python.org/downloads/windows/ Python.org] provides installers for Windows. The web-based installer will update software components from the web. You may need administrator privileges to update system libraries.<br />
* [https://www.enthought.com/academic-subscriptions/ Enthought Canopy] is a commercial distribution that is free to download, with support available for a fee. It is intended for scientific computing and can co-exist with an existing system Python. <br />
* [https://www.anaconda.com/distribution/ Anaconda] is widely used in Astronomy, and will come with all the packages you will need to get started. It uses a "conda" package management system. <br />
<br />
<br />
<br />
'''Mac OSX'''<br />
<br />
* Python 2.7 comes installed with OSX. Try "python --version" from a terminal command line and see what happens. You can update this installation from Python.org (see next), or add packages with pip given administrative authority. Be aware of the potential Tcl/Tk library problem, though.<br />
* [https://www.python.org/downloads/mac-osx/ Python.org] has installers for recent Mac OS variants. However, there are problems with the Tcl/Tk libraries provided by Apple, particularly when used for graphics and in the development environment IDLE, which you should be aware of. Read the notice [https://www.python.org/download/mac/tcltk/ here].<br />
* [https://store.enthought.com/downloads/ Enthought Canopy Express] is free for Mac users too. Enthought provides all the packages in one installation process, and additional support for a fee.<br />
* [https://www.anaconda.com/distribution/ Anaconda] also has a Mac version, and is very popular.<br />
<br />
<br />
Those with an astronomical interest may benefit from [http://python4astronomers.github.com/installation/python_install.html Python4Astronomers]<br />
<br />
Most users would probably prefer running Python through the [http://docs.python.org/2/library/idle.html IDLE] integrated development environment. This provides an editor and file management, along with help and syntax highlighting. It's named after Eric Idle, who does the [http://www.youtube.com/watch?v=uo6OCxwUPPg "Galaxy Song"] in Monty Python. On the command line you would simply run "idle" to get started. <br />
<br />
<br />
Additional modules would have to be installed separately later if they are not part of the original installation. Python has its own ''pip'' (see above for Linux) for adding features which makes that easy. The ones you will need for scientific programming are <br />
<br />
* NumPy Test with "import numpy" from within interactive Python or idle.<br />
* SciPy Test with "import scipy".<br />
* AstroPy This one is for astronomers. Test with "import astropy".<br />
<br />
and there are others, especially from [https://www.scipy.org/scikits.html Scikits]<br />
<br />
Anaconda and Enthought distributions will have everything you need "out of the box."<br />
<br />
<br />
'''AstroConda for Linux and Mac OSX'''<br />
<br />
If you are primarily interested in using Python for astronomy and have a need for the tools of the Space Telescope Science Institute, consider adding their astronomical code to an Anaconda distribution. At this site<br />
<br />
[http://astroconda.readthedocs.io/en/latest/ AstroConda] <br />
<br />
there is a guide to installation and the documentation on its use. If you are unfamiliar with Python, take the time to go through our short course and try some examples first, and then return and fill in your system with AstroConda, and you will be ready for analyzing data from MAST, HST and other sources. AstroConda is for Linux and OSX; Microsoft Windows is not supported. If you have a Windows computer with a lot of memory and disk space, you could add a Linux virtual machine inside your Windows operating system as a safe, simple way to have Linux features available within your preferred OS. VirtualBox is free, and installation from Oracle is only a click away:<br />
<br />
[https://www.virtualbox.org/ Oracle VirtualBox]<br />
<br />
It delivers enough of the host computer's processing power that for many applications it is as good as running a "real" machine, and it protects your own operating system while you experiment with new ones. <br />
<br />
<br />
<br />
== IDEs, Editors, and Python Environments ==<br />
<br />
Keep in mind that Python itself is a programming language and system. It stands on its own, and it can be incorporated into other more complex, and potentially more useful, interfaces. At a minimum you will need a text editor. With that you can read and write program files, and run them as a program either by having "python" read the file, or by making the file itself executable (on Linux and Mac). Most editors on these operating systems will be fine, but some are cumbersome for learning. Unless you happen to have the skill, avoid "vi" and even "emacs", two common editors of Unix-like systems like OSX and Linux, and use something with a lighter interface. The java-based "jedit" is free, easy to install, and has some helpful features. Since it is based on java, it runs on Mac, Windows and Linux with the same look and feel. You can obtain it from<br />
[http://www.jedit.org/ http://www.jedit.org/] and follow their installation instructions.<br />
<br />
The IDE "idle" is also very nice to start with, and recommended. It may be present after you install Python, so try the command "idle" in a terminal window and see what happens. There are said to be problems with its use of the Tcl/Tk library and the Mac OSX installed libraries, but they should be solved in the most recent releases of Python and OSX supplied by Enthought or Anaconda.<br />
<br />
Now widely used and with great potential, the Jupyter system has been under development for a decade and is mature. You can read about it and even preview its capabilities on the web. If you decide to start at that level, keep in mind that the system is feature-filled, and that once you create content within the system, that content will require the system to use it. That is, unlike a simple Python program, the notebooks created by Jupyter are truly bodies of work that include data and analysis. It can be very useful in a lab, for example. <br />
<br />
<center> [http://jupyter.org/ http://jupyter.org/] </center><br />
<br />
<br />
== Using Python in real time ==<br />
<br />
The first step is to figure out how to start up Python on your computer after it is installed. In Linux you open a console and type "python" on the command line. You'll immediately see a prompt that looks like ">>>" after which you can type Python code and see the results.<br />
<br />
If you installed the Enthought distribution of Python on Windows or Mac, take a look at their release notes and website for additional advice on getting started. <br />
<br />
If you installed from the python.org, then they have some additional pages to offer help.<br />
On Windows, it's not necessarily as straightforward as Linux, but it can be. It will help to read this [http://docs.python.org/2/faq/windows.html "frequently asked question" (FAQ) page] about Python on Windows to help you at first, and also consult the [http://docs.python.org/2/using/windows.html setup and usage guide].<br />
On a Macintosh OS X system using Python is very similar to other Unix platforms like Linux or BSD. There are some helpful notes at the [http://docs.python.org/2/using/mac.html Using Python on a Macintosh] website. <br />
<br />
Once you have a command line prompt you have access to all of Python's capabilities. We'll show you some simple examples [http://prancer.physics.louisville.edu/astrowiki/index.php?title=Python_examples here] to test your installation and give you a quick sense of how to use it.<br />
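For instance, a first interactive session might run a few statements like these (a trivial example of our own), typed one at a time at the ">>>" prompt:<br />
<br />
```python
import math

# Statements you might type interactively, with the results they print
print(2 + 3)            # 5
print(8 / 3)            # 2.6666666666666665
print(math.sqrt(2))     # 1.4142135623730951
```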
<br />
To exit Python in the interactive mode, use "Ctrl+d" or "exit()" from the command line, or the Exit menu entry if you are running IDLE.<br />
<br />
<br />
== Using Python code as a standalone program ==<br />
<br />
You will usually edit a file that contains your Python program and then run that program by calling the Python interpreter. Therefore, the first thing is to pick an appropriate editor. <br />
One way is to use IDLE, which makes it especially easy on Windows systems and others to edit and test with a consistent interface. On Linux systems where the command line is more commonly used, an alternative is a standard graphical editor that is aware of Python syntax, like gedit. Other alternatives for Linux users are nedit and emacs, depending on your taste in interfaces. Python text files have a required format, and it is generally not a good idea to embed tabs in the text, so the tab function has to be set to insert spaces instead. The Python website maintains a [http://wiki.python.org/moin/PythonEditors list of editors] and their features for different operating systems with links to the editor websites if you need to download one. <br />
<br />
For example, if your program is in the file "myprogram.py" you can run it from the command line with "python myprogram.py". On Windows systems, the file extension ".py" may be associated with this command, and in that case you can start a program by clicking on the icon or name in a window. On MacOS and Linux, you would first make the file executable with a command such as <br />
<br />
chmod a+x myfile.py<br />
<br />
and also see that the first line of the file is exactly<br />
<br />
#!/usr/bin/python<br />
<br />
assuming that python is installed in /usr/bin/. With those changes, any file of Python code becomes an executable program. Simply type (prefixed by "./" when the file's directory is not on your PATH)<br />
<br />
./myfile.py<br />
<br />
Note that programs that interact with the window manager may need to be started with pythonw instead of python. For MacOS, see [http://docs.python.org/2/using/mac.html 4.1.1 How to run a Python script].<br />
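Putting these pieces together, a complete minimal script might look like this (the file name myfile.py and the contents are illustrative; after "chmod a+x" it runs as its own command):<br />
<br />
```python
#!/usr/bin/python
# A minimal stand-alone script, following the steps above.
import sys

def main():
    # Report which file is running; any program logic would go here.
    print("Running:", sys.argv[0])
    return 0

if __name__ == "__main__":
    status = main()   # a script could instead call sys.exit(main())
```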
<br />
<br />
== Examples of very simple Python ==<br />
<br />
<br />
For examples of Python illustrating how to use it interactively and to write very simple programs, see the section [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples Python examples].<br />
<br />
<br />
<br />
== An assignment to try out very simple Python ==<br />
<br />
<br />
For the assigned homework to use very simple Python interactively and as a script, see the section [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments Python assignments].</div><br />
<br />
Image processing with Python and SciPy (WikiSysop, 2018-04-14)<br />
<hr />
<div>Given that NumPy provides multidimensional arrays, and that there is core support through the Python Imaging Library and Matplotlib to display and manipulate images in the Python environment, it's easy to take the next step and combine these for scientific image processing. As part of our [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_for_Physics_and_Astronomy short course on Python for Physics and Astronomy] we begin by exploring how Python handles image input and output through pillow, scikit-image, and pyfits. Once loaded, an image may be processed using library routines or by mathematical operations that take advantage of the speed and conciseness of numpy and scipy. Some of the resources mentioned here require Python 3.4 or later, and at this time Python 3.6 is current.<br />
<br />
<br />
== Pillow - An Imaging Library ==<br />
<br />
The Python Imaging Library (PIL) was developed for Python 2.x and provided functions to manipulate images, including reading, modifying and saving in various standard image formats in a package called "PIL". With the coming of age of Python 3.x, a fork of the older version has evolved that is more suited for the new technologies and is in a package called "Pillow". It continues to improve, and the features described here are tested with "Pillow 5.1" and Python 3.6 as of April 2018. Pillow will probably be on any packaged distribution of Python 3, or it may be installed with (note the capital "P")<br />
<br />
pip install Pillow<br />
<br />
Pillow includes the basics of image processing, with functions that are documented by the developers in a [https://pillow.readthedocs.io/en/5.1.x/handbook/index.html handbook] describing the methods and giving some examples. Pillow uses the same "namespace" as PIL and older code should work, perhaps with a few modifications to allow for recent developments. Most important for us, Pillow has routines to read and write conventional image formats. Once an image has been read into a numpy array, the full power of Python is available to process it, and we can turn to Pillow again to save a processed image in png or jpg or another format. Flexible Image Transport System (FITS) files used for astronomy should be managed with astropy or pyfits.<br />
<br />
As a simple starting example, suppose you have an image that was taken with the camera turned so that "up" is to the side when the image is displayed. Here's how you would read it, rotate it 90 degrees, and write it out again using Pillow.<br />
<br />
import os<br />
import sys<br />
import argparse<br />
from PIL import Image as pil<br />
<br />
parser = argparse.ArgumentParser(description = 'Rotate a png image 90 degrees')<br />
<br />
# Test for command line arguments<br />
if len(sys.argv) == 2:<br />
    infilename = sys.argv[1]<br />
else:<br />
    sys.exit("Usage: png_image_rotate file.png ")<br />
<br />
myimage = pil.open(infilename)<br />
<br />
mirror = myimage.transpose(pil.ROTATE_90)<br />
outfilename = os.path.splitext(os.path.basename(infilename))[0]+'_r90.png'<br />
mirror.save(outfilename)<br />
<br />
<br />
The first part of this is standard form to get the image name on the command line and make it available to the program. The PIL is imported with Image, and appears in the code as<br />
"pil". This is an amazingly short program, because in opening the image the library handles all the conversions in formatting and stores the image internally so that you refer to it only by the name assigned when it is loaded. We operate on the image with the transpose function, which has an argument that controls what it does. Here we rotate the image 90 degrees, and then save it to a file with a new name. The saving operation converts the internal data back to the file format set by the extension used in the file name.<br />
<br />
You can transpose an image left-right with<br />
<br />
mirror = myimage.transpose(pil.FLIP_LEFT_RIGHT)<br />
<br />
or do both in one step with<br />
<br />
mirror = myimage.transpose(pil.FLIP_LEFT_RIGHT).transpose(pil.ROTATE_90)<br />
<br />
Processing is not limited to "PNG" files, though that file type is preferred because it is not a lossy storage option. Python reads and writes "jpg" files too. While PIL provides some essential functionality, for more practical uses in astronomy we need to read Flexible Image Transport System or "FITS" files, and to enable numerical work on images in SciPy. Many of the processing functions you will find in the Python Imaging Library (PIL) are also available in SciPy, where we have precise mathematical control over their definitions and operation. Some more advanced techniques are available in SciPy too, courtesy of researchers who have contributed to SciKit, as we will see.<br />
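Many of those operations reduce to array manipulations. For instance, the same rotation and mirror can be done directly on a numpy array, with no imaging library at all:<br />
<br />
```python
import numpy as np

# A small test array standing in for image data
img = np.arange(12).reshape(3, 4)

rotated = np.rot90(img)      # rotate 90 degrees counterclockwise
mirrored = np.fliplr(img)    # mirror left-right

print(img.shape, rotated.shape)   # (3, 4) (4, 3)
```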
<br />
Python's core routines dependent on matplotlib may be used to display an image, but these are designed for graphics, and limited by the constraints of the matplotlib interface. With a little effort there are better choices.<br />
<br />
<br />
== SciKit Image ==<br />
<br />
We've mentioned that [https://www.scipy.org/scikits.html SciKits] is a searchable index of highly specialized tools that are built on SciPy and NumPy. Among them, [http://scikit-image.org/ scikit-image] is for image processing in Python. It is oriented toward extracting physical information from images, and has routines for reading, writing, and modifying images that are powerful, and fast. Scikit-image is often compared to OpenCV, a collection of programs for computer vision that include live video. Both are actively maintained and in many ways complementary, but for physics and astronomy scikit-image is more powerful at this time.<br />
<br />
The scikit-image package is part of Anaconda and Enthought Python, and that would be a recommended platform for Windows. However for Linux and Mac OSX, <br />
<br />
pip install scikit-image<br />
<br />
should work. The package usually requires local compilation of the code when installed this way. Qt bindings are provided by PySide or by PyQt, depending on licensing requirements. PySide is less restrictive.<br />
<br />
pip install PySide<br />
<br />
<br />
While scipy once included an image reader and writer, as of April 2018 those functions are deprecated in the base code, and rather than use pillow we can turn to scikit-image. The module to read and write images is skimage.io<br />
<br />
import skimage.io<br />
import numpy as np<br />
<br />
and the command<br />
<br />
skimage.io.find_available_plugins()<br />
<br />
will provide a dictionary of the libraries that may be used to read various file types. For example<br />
<br />
imlibs = skimage.io.find_available_plugins()<br />
<br />
and imlibs['pil'] will list the functions that the Python imaging library provides. The package tries the libraries in order until it finds one that works:<br />
<br />
myimage = skimage.io.imread(filename).astype(np.float32)<br />
<br />
will read an image and return a numpy array, which by default will be an RGB image if the file is a png file, for example. A greyscale image may be specified by including as_grey=True as an argument. A numpy image has a shape that, for a color image, has 3 values in each pixel<br />
<br />
print(myimage.shape)<br />
(498, 680, 3)<br />
<br />
as an example for an RGB image, or <br />
<br />
(498,680)<br />
<br />
for a gray scale image. Since numpy by default would store into a 64-bit float and matplotlib (the default display for skimage) requires 32-bit, we specify loading into a 32-bit array, planning ahead to displaying the result.<br />
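These shapes and the 32-bit choice can be checked on a synthetic array (a zero-filled stand-in, with the dimensions quoted above, for what imread returns):<br />
<br />
```python
import numpy as np

# Stand-in for the array skimage.io.imread would return for an RGB png
rgb = np.zeros((498, 680, 3), dtype=np.float32)
grey = rgb[:, :, 0]          # a single color plane has the greyscale shape

print(rgb.shape)     # (498, 680, 3)
print(grey.shape)    # (498, 680)
print(rgb.dtype)     # float32
```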
<br />
<br />
Images may be saved:<br />
<br />
skimage.io.imsave(filename, nparray)<br />
<br />
and the file type is determined by file name extension. <br />
<br />
Images may be displayed, but it takes two steps<br />
<br />
skimage.io.imshow(myimage)<br />
skimage.io.show()<br />
<br />
when invoking the default matplotlib plugin. The display will look like one created by pyplot. There is a simpler viewer module too, without the pyplot toolbar.<br />
<br />
import skimage.viewer<br />
<br />
viewer = skimage.viewer.viewers.ImageViewer(myimage)<br />
viewer.show()<br />
<br />
<br />
We will use routines from the scikit-image package to identify stars in an image, adapting a program given by Eli Bressert in his book SciPy and NumPy to handle an image from one of our telescopes. The complete program is given in the Examples section below.<br />
<br />
It begins with the requisite imports including the ones from SciKit <br />
<br />
import os<br />
import sys<br />
import argparse<br />
import matplotlib.pyplot as mpl<br />
import numpy as np<br />
import astropy.io.fits as pyfits<br />
import skimage.morphology as morph<br />
import skimage.exposure as skie<br />
<br />
It opens a FITS file and loads only part of the image<br />
<br />
img = pyfits.getdata(infits)[1000:3000, 1000:3000]<br />
<br />
to ensure a square sample area and to limit the size of the search region. This is an alternative to reading a FITS image and also getting the header.<br />
<br />
With this data the program creates a new image using an arcsinh mapping that captures the full dynamic range effectively. It locates lower and upper bounds that should include only stars. The parameters would probably have to be refined to optimize the extraction of stars from the background.<br />
<br />
limg = np.arcsinh(img)<br />
limg = limg / limg.max()<br />
low = np.percentile(limg, 0.25)<br />
high = np.percentile(limg, 99.5)<br />
opt_img = skie.rescale_intensity(limg, in_range=(low,high))<br />
<br />
With the useful range determined, we create a new image that is scaled between the lower and upper limits that will be used for displaying the star map. We search the arcsinh-stretched original image for local maxima and catalog those brighter than a threshold that is adjusted based on the image.<br />
<br />
lm = morph.local_maxima(limg)<br />
x1, y1 = np.where(lm.T == True)<br />
v = limg[(y1,x1)]<br />
lim = 0.7<br />
x2, y2 = x1[v > lim], y1[v > lim]<br />
<br />
The list x2,y2 has the stars that were found. The rest is image display code to draw circles around the stars and create an image that shows where they are.<br />
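As a sketch of what the local-maximum step computes, here is a minimal pure-numpy version that marks pixels strictly brighter than their four axis neighbors (a simplified stand-in for the scikit-image routine, not its exact algorithm):<br />
<br />
```python
import numpy as np

def local_max_mask(a):
    """Boolean mask of pixels strictly greater than their four axis
    neighbors -- a simplified local-maximum test."""
    m = np.ones(a.shape, dtype=bool)
    m[1:, :] &= a[1:, :] > a[:-1, :]    # brighter than the pixel above
    m[:-1, :] &= a[:-1, :] > a[1:, :]   # ... the pixel below
    m[:, 1:] &= a[:, 1:] > a[:, :-1]    # ... the pixel to the left
    m[:, :-1] &= a[:, :-1] > a[:, 1:]   # ... the pixel to the right
    return m

# One bright pixel in a dark frame is the only local maximum
peak = np.zeros((5, 5))
peak[2, 3] = 1.0
y, x = np.where(local_max_mask(peak))
print(x[0], y[0])    # 3 2
```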
<br />
<br />
[[File:m34_100s_log1000_rgb_20121104.png | center | 400px]]<br />
<br />
When the program processes a 100 second image of the open cluster M34, shown above, it identifies the stars in the right panel below.<br />
<br />
[[File:test_m34_stars.png | center | 600px]]<br />
<br />
== Images with NumPy and SciPy ==<br />
<br />
Scientific Python tools can read jpg and png images directly into numpy arrays, without using PIL, so we have direct access to the data for uses other than visualization.<br />
<br />
import numpy as np<br />
import matplotlib.pyplot as plt<br />
from skimage.io import imread<br />
from skimage.io import imsave<br />
# from skimage.io import imshow is an alternative for display<br />
<br />
image_data = imread('test.jpg').astype(np.float32)<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
<br />
scaled_image_data = image_data / 255.<br />
<br />
# Save the modified image if you want to<br />
# imsave('test_out.png', scaled_image_data)<br />
<br />
plt.imshow(scaled_image_data)<br />
plt.show()<br />
<br />
exit()<br />
<br />
For our 512x512 color test image, this returns<br />
<br />
Size: 786432<br />
Shape: (512, 512, 3)<br />
<br />
because the image is 512x512 pixels and has 3 planes -- red, green, and blue. When SciPy reads a jpg or png image it will separate the colors for you. The "image" is a data cube. In the imread line we control the data type within the Python environment. Of course the initial data is typically 8 bits for color images from a cell phone camera, 16 bits for scientific images from a CCD, and perhaps 32 bits for processed images that require the additional dynamic range.<br />
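The division by 255 generalizes to the other bit depths. A small sketch (the helper name is our own) that scales any unsigned integer image to [0, 1] floats using its dtype:<br />
<br />
```python
import numpy as np

def to_unit_float(img):
    """Scale an integer image to floats in [0, 1] using its dtype's range."""
    return img.astype(np.float32) / np.iinfo(img.dtype).max

img8 = np.array([[0, 128, 255]], dtype=np.uint8)    # cell phone camera depth
img16 = np.array([[0, 65535]], dtype=np.uint16)     # CCD depth

print(to_unit_float(img8).max())    # 1.0
print(to_unit_float(img16).max())   # 1.0
```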
<br />
We can display images with matplotlib.pyplot using [http://matplotlib.org/api/pyplot_api.html#matplotlib.pyplot.imshow imshow()] --<br />
<br />
imshow(X, cmap=None, norm=None, aspect=None, interpolation=None,<br />
alpha=None, vmin=None, vmax=None, origin=None, extent=None, **kwargs)<br />
<br />
where "X" is the image array. If X is 3-dimensional, imshow will display a color image.<br />
Matplotlib has a [http://matplotlib.org/users/image_tutorial.html tutorial] on how to manage images. Here, we linearly scale the image data because for floating point imshow requires values between 0. and 1., and we know beforehand that the image is 8-bits with maximum values of 255. <br />
<br />
Here's what a single test image displayed from this program looks like in Python.<br />
<br />
<br />
[[File:Single_image.png | center | 600px]]<br />
<br />
<br />
We can slice the image data to see each color plane by creating new arrays indexed from the original one. <br />
<br />
import numpy as np<br />
from skimage.io import imread, imsave<br />
import pylab as plt<br />
<br />
image_data = imread('test.jpg').astype(np.float32)<br />
scaled_image_data = image_data / 255.<br />
<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
<br />
image_slice_red = scaled_image_data[:,:,0]<br />
image_slice_green = scaled_image_data[:,:,1]<br />
image_slice_blue = scaled_image_data[:,:,2]<br />
<br />
print('Size: ', image_slice_red.size)<br />
print('Shape: ', image_slice_red.shape)<br />
<br />
plt.subplot(221)<br />
plt.imshow(image_slice_red, cmap=plt.cm.Reds_r)<br />
<br />
plt.subplot(222)<br />
plt.imshow(image_slice_green, cmap=plt.cm.Greens_r)<br />
<br />
plt.subplot(223)<br />
plt.imshow(image_slice_blue, cmap=plt.cm.Blues_r) <br />
<br />
plt.subplot(224)<br />
plt.imshow(scaled_image_data) <br />
<br />
plt.show()<br />
<br />
<br />
For a colorful image you will see the differences between each slice --<br />
<br />
[[File:single_image_slice.png | center | 600px ]]<br />
<br />
<br />
This approach offers a template for displaying multidimensional computed or experimental data as an image created with Python. Consider this short program that creates and displays an image with Gaussian noise:<br />
<br />
# Import the packages you need<br />
import numpy as np<br />
import matplotlib.pyplot as plt<br />
from skimage.io import imsave<br />
<br />
# Create the image data<br />
image_data = np.zeros(512*512, dtype=np.float32).reshape(512,512)<br />
random_data = np.random.randn(512,512)<br />
image_data = image_data + 100.*random_data<br />
<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
scaled_image_data = image_data / 255.<br />
<br />
# Save and display the image <br />
imsave('noise.png', scaled_image_data)<br />
plt.imshow(scaled_image_data, cmap='gray')<br />
plt.show()<br />
<br />
exit()<br />
<br />
<br />
Instead of random noise, you could create any functional data you wanted to display. For images with color, the NumPy array would have red, green, and blue planes.<br />
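For example, in place of the noise you could fill the array with a 2D Gaussian, a crude synthetic star (the size and sigma here are illustrative choices):<br />
<br />
```python
import numpy as np

# A synthetic "star": a 2D Gaussian profile instead of random noise
size, sigma = 64, 5.0
cx = cy = size // 2
y, x = np.mgrid[0:size, 0:size]
star = np.exp(-((x - cx)**2 + (y - cy)**2) / (2.0 * sigma**2))

print(star.shape)       # (64, 64)
print(star[cy, cx])     # 1.0
```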
<br />
<br />
== Astronomical FITS Files == <br />
<br />
Astronomical image data are potentially complex and rich, for which quantitative structures have been developed to standardize lossless storage of the data along with the metadata that describe its origin and previous processing. While photographic images are often only 8 bits deep, mixed with red, green and blue in a single image, and compressed to reduce file size, astronomical images are 16 or 32 bits deep in a single color. Until recently, the file sizes needed for astronomy were unrivaled by commodity cameras, but in today's market of megapixel imaging on cell phones, the camera in your pocket also produces very large, rich images. The difference in handling them is that for science we want to preserve the data without loss, quantitatively calibrate and measure the flux from the source, and map that back to a specific angle in space, while for art or even for some academic uses, it is the beauty, color, and highlights shown in the image display that are important. Commodity images are usually saved in compressed formats such as JPG, or uncompressed TIFF or proprietary binary formats. For astronomy and other quantitative imaging work, the [https://fits.gsfc.nasa.gov/fits_primer.html Flexible Image Transport System (or FITS)] format is almost universal. It includes the image data, and a header describing the data. FITS files may also be tables of data, or a cube of images in sequence. The standards developed for creating these files are slowly evolving as the needs of big data in astronomy have grown. Programmers and scientists at NASA, the Space Telescope Science Institute, and the academic community at large are contributing to libraries that enable reading, processing, and saving FITS files in Python, as well as in C, and Fortran. <br />
<br />
Once a FITS file has been read, the header is accessible as a Python dictionary describing the data contents, and the image data are in a NumPy array. With Python using NumPy and SciPy you can read, extract information, modify, display, create and save image data. <br />
<br />
<br />
=== Reading and Writing a FITS File in Python ===<br />
<br />
There are many image display tools for astronomy, and perhaps the most widely used is [http://hea-www.harvard.edu/RD/ds9/site/Home.html ds9], which is available for Linux, MacOS, and Windows, as well as in source code. It is always useful to have a version on your computer when you are working with FITS images. A more versatile Java platform for astronomical image viewing that also does processing is now widely used for precision astronomical photometry where interactive analysis is needed. AstroImageJ is free and simple to install on most computers, and because it is also a powerful processor, for many purposes it is an all-in-one tool. If you are working with FITS images, this is highly recommended: <br />
<br />
[http://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ] website and program source<br />
<br />
[http://iopscience.iop.org/article/10.3847/1538-3881/153/2/77 Astronomical Journal] article describing AstroImageJ<br />
<br />
<br />
However, sometimes we want to perform specialized work on image data, and to view it while processing it in Python. This routine demonstrates how to read a FITS file, inspect its header, and show the image on the computer display. It is dependent on the "PyFITS" library developed at Space Telescope Science Institute and incorporated into a larger package by the [http://www.astropy.org/ AstroPy Project]. AstroPy's library is part of the Enthought and Canopy distributions of Python. If you are using a system version of python, you may need to install it with pip<br />
<br />
pip install astropy<br />
<br />
should do it. The package has several dependencies: it requires Python 3.5 or higher at this time (April 2018), as well as numpy and scipy. It is also frequently updated, and while stable it may push the "cutting edge" of distributions from conservative operating system distributors like OpenSuse. That's a good reason, if you need this capability, to have a version of Python built with current sources, or to use a complete distribution such as Anaconda or Enthought.<br />
<br />
A program to work with FITS files would begin by importing the packages that it needs<br />
<br />
<br />
import os<br />
import sys<br />
import argparse<br />
import numpy as np<br />
import astropy.io.fits as pyfits<br />
import matplotlib <br />
import matplotlib.pyplot<br />
<br />
Importing the FITS modules in this way makes the code backward compatible with the earlier versions of PyFITS. Also, since sometimes submodules are not loaded with the larger package, we explicitly ask for the io.fits components, and for pyplot.<br />
<br />
# Define a function for making a linear gray scale<br />
def lingray(x, a=None, b=None):<br />
    """<br />
    Auxiliary function that specifies the linear gray scale.<br />
    a and b are the cutoffs : if not specified, min and max are used<br />
    """<br />
    if a is None:<br />
        a = np.min(x)<br />
    if b is None:<br />
        b = np.max(x)<br />
    return 255.0 * (x-float(a))/(b-a)<br />
<br />
# Define a function for making a logarithmic gray scale<br />
def loggray(x, a=None, b=None):<br />
    """<br />
    Auxiliary function that specifies the logarithmic gray scale.<br />
    a and b are the cutoffs : if not specified, min and max are used<br />
    """<br />
    if a is None:<br />
        a = np.min(x)<br />
    if b is None:<br />
        b = np.max(x)<br />
    linval = 10.0 + 990.0 * (x-float(a))/(b-a)<br />
    return (np.log10(linval)-1.0)*0.5 * 255.0<br />
<br />
These functions may be used to rescale the data for display. Scaling can be done with a colormap, or by modifying the data before displaying. Here, we modify the data and save it in a temporary buffer for display. In a more elaborate program with a full user interface, this scaling could be done interactively.<br />
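As a quick check of the linear scale, the function behaves like this (repeated here so the sketch runs on its own):<br />
<br />
```python
import numpy as np

def lingray(x, a=None, b=None):
    # Repeated from the program above so this check is self-contained
    if a is None:
        a = np.min(x)
    if b is None:
        b = np.max(x)
    return 255.0 * (x - float(a)) / (b - a)

data = np.array([10.0, 55.0, 100.0])
print(lingray(data))   # min maps to 0, midpoint to 127.5, max to 255
```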
<br />
# Provide information to the argparse routine if we need it<br />
parser= argparse.ArgumentParser(description = 'Display a fits image')<br />
<br />
The argparse function offers more flexibility than the system routines, though here we only include this for future additions. A simpler method is to parse the command line itself using the system utilities:<br />
<br />
# Test for command line arguments<br />
if len(sys.argv) == 1:<br />
    sys.exit("Usage: display_fits infile.fits")<br />
elif len(sys.argv) == 2:<br />
    infits = sys.argv[1]<br />
else:<br />
    sys.exit("Usage: display_fits infile.fits")<br />
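For comparison, the same check done with argparse might look like this (a sketch; the list passed to parse_args stands in for a real command line):<br />

```python
import argparse

parser = argparse.ArgumentParser(description='Display a fits image')
parser.add_argument('infile', help='input FITS image')

# normally parse_args() reads sys.argv; here we supply a sample argument list
args = parser.parse_args(['example.fits'])
print(args.infile)   # example.fits
```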
<br />
# Open the fits file and create an hdulist<br />
inhdulist = pyfits.open(infits) <br />
<br />
# Assign the input header in case it is needed later<br />
inhdr = inhdulist[0].header<br />
<br />
# Assign image data to a numpy array<br />
image_data = inhdulist[0].data<br />
<br />
The header and data are now available. We'll look at header information later. For now, all we need are the values in the numpy data array. It will be indexed from [0,0] at the upper left of the data space, which would be the upper left of the displayed image.<br />
<br />
# Print information about the image<br />
print('Size: ', image_data.size)<br />
print('Shape: ', image_data.shape)<br />
<br />
For this example we use linear scaling.<br />
<br />
# Show the image<br />
new_image_data = lingray(image_data)<br />
new_image_min = 0.<br />
new_image_max = np.max(new_image_data)<br />
matplotlib.pyplot.imshow(new_image_data, vmin=new_image_min, vmax=new_image_max, cmap='gray')<br />
matplotlib.pyplot.show() <br />
<br />
# Close the input image file and exit<br />
inhdulist.close()<br />
sys.exit()<br />
<br />
It would be a straightforward exercise to add an interactive scale control and to read out the value of each pixel in this program. With that, it has the basic functionality of ds9 or the AstroImageJ viewer.<br />
<br />
<br />
=== Correcting and Combining Images ===<br />
<br />
There are processing operations that must be done on all "raw" astronomical images taken with [http://prancer.physics.louisville.edu/astrowiki/index.php/Use_a_CCD_Camera charge coupled device cameras]. Since the voltage that is digitized is proportional to the number of photons that arrived at each pixel during the exposure, the data for that pixel should be proportional to the photon count, that is, to the irradiance (photons per unit area per unit time) times the area of the pixel times the exposure time. An absolute conversion to the flux from the sources that are imaged requires correcting for<br />
<br />
*Signal at each pixel with no light present -- the "dark" image<br />
*Signal at each pixel for the same irradiance/pixel -- the "flat" field<br />
*Non-linear responses<br />
*Absolute calibration to an energy or photon flux based on spectral response<br />
<br />
We can do all of these in NumPy using its built-in array math. By taking an image with no light and subtracting it from an image, we correct for the dark response (as well as for electronic "bias"). Or, if needed, we can scale a dark image taken at a different exposure time from the image we are measuring, and then subtract that. We can divide an image by a reference flat to correct for pixel-to-pixel variations in response and for vignetting in an optical system (the non-linear fall-off of transmission across the field of view). We can scale images non-linearly to correct for non-linear amplifier responses and saturation or charge loss from each pixel.<br />
<br />
For dark subtraction we take the difference of two images:<br />
<br />
dark_corrected_image = raw_image - reference_dark_image<br />
<br />
and for flat field correction we divide<br />
<br />
final_image = dark_corrected_image / reference_flat_image<br />
<br />
In NumPy there is no array indexing needed, and the operations are one-liners. Similarly, a non-linear second order correction or a scaling to physical units may be done on the entire array with<br />
<br />
corrected_image = a * (final_image) + b * (final_image**2) <br />
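These one-line corrections are easy to check on small arrays. The values below are illustrative, not real calibration data:<br />

```python
import numpy as np

raw_image            = np.array([[110.0, 120.0], [130.0, 140.0]])
reference_dark_image = np.array([[ 10.0,  10.0], [ 10.0,  10.0]])
reference_flat_image = np.array([[  1.0,   2.0], [  1.0,   2.0]])

# subtract the dark, then divide by the flat, element by element
dark_corrected_image = raw_image - reference_dark_image
final_image = dark_corrected_image / reference_flat_image

print(final_image)   # [[100. 55.] [120. 65.]]
```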
<br />
The reference dark and flat images must be obtained beforehand. For example, a reference dark image may be a median average of many images taken with the same exposure time as the science image, but with the shutter closed. To perform the median operation on the arrays rather than sequentially on the elements, we stack all of the original individual dark images to make a 3-d stack of 2-d arrays. Using numpy arrays we would have<br />
<br />
dark_stack = np.array([dark_1, dark_2, dark_3])<br />
<br />
where dark_1, dark_2, and dark_3 are the original dark images. We need at least 3, or any larger odd number, in the dark stack. If the images are m rows of n columns, and if we have k images in the stack, the stack will have a shape (k,m,n): k images, each of m rows and n columns. A median on the first axis of this stack returns the median value for each pixel in the stack --<br />
<br />
dark_median = np.median(dark_stack, axis=0)<br />
<br />
and has a shape that is (m,n) with the elements that we wanted.<br />
<br />
Median operations on an image stack remove random noise more effectively than averaging because one source of noise in CCD images is cosmic ray events that produce an occasional large signal at a pixel. If we mean-averaged a stack with an outlier at one pixel, the result would be biased by that one singular event, but by median-averaging we discard the event without adversely affecting the new reference frame.<br />
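A small synthetic example shows the outlier rejection at work. Three 2x2 "dark" frames are stacked, one with a simulated cosmic ray hit:<br />

```python
import numpy as np

dark_1 = np.array([[1.0, 2.0], [3.0, 4.0]])
dark_2 = np.array([[1.0, 2.0], [3.0, 4.0]])
dark_3 = np.array([[500.0, 2.0], [3.0, 4.0]])   # cosmic ray hit at [0, 0]

dark_stack = np.array([dark_1, dark_2, dark_3])
print(dark_stack.shape)   # (3, 2, 2)

dark_median = np.median(dark_stack, axis=0)
print(dark_median[0, 0])  # 1.0 -- the outlier is discarded
```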
<br />
A median of a stack of flat frames, all normalized so that they should be identical, will remove stars from the reference image as long as each contributing flat image in the stack is taken of a different star field. Typically, these "sky flats" are images taken at twilight, processed to remove the dark signal, normalized to unity, and then median averaged to remove stars and reduce random noise. <br />
<br />
Fortunately, normalizing an image is very simple because<br />
<br />
image_mean = np.mean(image_data)<br />
<br />
returns the mean value of all elements in the array. Divide an image by its mean to create a normalized image with unity mean<br />
<br />
normalized_image = image_data / image_mean<br />
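A quick check with an illustrative array confirms that the result has unity mean:<br />

```python
import numpy as np

image_data = np.array([[1.0, 2.0], [3.0, 2.0]])
image_mean = np.mean(image_data)            # 2.0
normalized_image = image_data / image_mean

print(np.mean(normalized_image))            # 1.0
```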
<br />
<br />
As long as the images have the same size you can sum them by simple addition<br />
<br />
sum_image = image1 + image2 + image3 ...<br />
<br />
We would do this to create a final image that is effectively one long exposure, the sum of all the contributing image exposure times. Because of guiding errors, cosmic rays, and weather, one very long exposure is often not possible, but tens or hundreds of shorter exposures can be "co-added" after selecting the best ones and aligning them so that each pixel in every contributing image corresponds to the same place on the sky.<br />
<br />
<br />
=== Masked Image Operations ===<br />
<br />
A [http://docs.scipy.org/doc/numpy/reference/maskedarray.generic.html Numpy array mask] is a boolean array that determines whether or not an operation is to be performed. If you have an image in an array, the mask allows you to work on only part of the image, ignoring the other part. This is useful for finding the mean of a selected region, or for computing a function that fits part of an image but ignores another part. For example, consider an array<br />
<br />
x = np.arange(4*4).reshape(4,4)<br />
<br />
which is a simple 4x4 image with values from 0 to 15:<br />
<br />
x<br />
array([[ 0, 1, 2, 3],<br />
[ 4, 5, 6, 7],<br />
[ 8, 9, 10, 11],<br />
[12, 13, 14, 15]])<br />
<br />
Make a boolean mask from a copy of this image<br />
<br />
xmask = np.ma.make_mask(x, copy=True, shrink=True, dtype=bool)<br />
<br />
and it will have all True values except for the first entry, 0, which is interpreted as False:<br />
<br />
xmask<br />
array([[False, True, True, True],<br />
[ True, True, True, True],<br />
[ True, True, True, True],<br />
[ True, True, True, True]], dtype=bool)<br />
<br />
You can set all values (or individual ones) to the state you need<br />
<br />
xmask[:,:] = True<br />
<br />
will make them all True, or <br />
<br />
xmask[0,:] = False<br />
xmask[3,:] = False<br />
xmask[:,0] = False<br />
xmask[:,3] = False<br />
<br />
will set the values around the perimeter False and leave the others True<br />
<br />
xmask<br />
array([[False, False, False, False],<br />
[False, True, True, False],<br />
[False, True, True, False],<br />
[False, False, False, False]], dtype=bool)<br />
<br />
<br />
We apply the mask to the original data<br />
<br />
mx = np.ma.masked_array(x, mask=xmask)<br />
print(mx)<br />
<br />
[[0 1 2 3]<br />
[4 -- -- 7]<br />
[8 -- -- 11]<br />
[12 13 14 15]]<br />
<br />
and see that the values that are masked by "True" are not included in the new masked array. If we want the sum of the elements in the masked array, then <br />
<br />
np.sum(mx)<br />
90<br />
<br />
while<br />
<br />
np.sum(x)<br />
120<br />
<br />
There are many ways to create a mask, and to operate on masked arrays, described in the [http://docs.scipy.org/doc/numpy/reference/maskedarray.generic.html Numpy documentation].<br />
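One of those ways builds the mask directly from a condition with np.ma.masked_where; a sketch selecting only pixels above a threshold:<br />

```python
import numpy as np

x = np.arange(16).reshape(4, 4)

# mask (exclude) every value below 10; statistics then use only 10..15
mx = np.ma.masked_where(x < 10, x)

print(mx.sum())    # 10+11+12+13+14+15 = 75
print(mx.mean())   # 12.5
```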
<br />
<br />
=== FITS Headers ===<br />
<br />
FITS files contain a "header" that tells us what is in the file, followed by data in a format that the header defines. As we have seen, when a FITS file is read with PyFITS, the data and the header are separate entities. As an example, here is the complete header from an image taken with one of our telescopes.<br />
<br />
<br />
<br />
<br />
We will open an image ''v1701_00146_i.fits'' in interactive Python and look at the header.<br />
<br />
import numpy as np<br />
import astropy.io.fits as pyfits<br />
infits='v1701_00146_i.fits'<br />
inhdulist = pyfits.open(infits)<br />
inhdr = inhdulist[0].header<br />
inhdr<br />
<br />
<br />
SIMPLE = T / file does conform to FITS standard<br />
BITPIX = 16 / number of bits per data pixel<br />
NAXIS = 2 / number of data axes<br />
NAXIS1 = 4096 / length of data axis 1<br />
NAXIS2 = 4096 / length of data axis 2<br />
EXTEND = T / FITS dataset may contain extensions<br />
COMMENT FITS (Flexible Image Transport System) format is defined in 'Astronomy<br />
COMMENT and Astrophysics', volume 376, page 359; bibcode: 2001A&A...376..359H<br />
BZERO = 32768 / offset data range to that of unsigned short<br />
BSCALE = 1 / default scaling factor<br />
EXPTIME = 100. / exposure time (seconds)<br />
DATE-OBS= '2013-01-19T04:18:09.140' / date of observation (UT)<br />
IMAGETYP= 'Light Frame' / image type<br />
TARGET = 'V1701 ' / target<br />
INSTRUME= 'griz ' / instrument<br />
CCD-TEMP= -9.921 / temperature (C)<br />
FILTER = ' (3) i (700-825)' / filter<br />
TELESCOP= 'CDK20N ' / telescope<br />
DATE = '2013-01-19T04:20:08' / file creation date (YYYY-MM-DDThh:mm:ss UT)<br />
<br />
The first entries tell us it is a simple image file, 4096x4096 pixels (16 megapixels) written with 16 integer data bits per pixel. The other entries provide information about the image data. Therefore in dealing with FITS data we may need to change the first entries if the file is modified, and append new entries that annotate what has been done, or add to the archival notes.<br />
<br />
The header in pyfits behaves like a Python dictionary. If you want to know the exposure time, ask<br />
<br />
inhdr['EXPTIME']<br />
<br />
and Python responds<br />
<br />
100.<br />
<br />
or <br />
<br />
inhdr['DATE-OBS']<br />
<br />
gets<br />
<br />
'2013-01-19T04:18:09.140'<br />
<br />
This means that you can sort through the contents of headers in a Python program to find the exposures you need, identify the filters used, and see what processing has been done. Most of the keywords shown above are standard, and those that are not can be easily added to specialized Python code.<br />
<br />
When a new FITS image is written with pyfits it contains only the bare necessities in the header -- the data type, some reference values for zero and scaling if needed, and the size of the array. However, you can copy items from the header of an image into images you create, and annotate the headers of your work to maintain a record of what has been done.<br />
<br />
An output FITS image file is created in steps from a NumPy data array outimage with<br />
<br />
outhdu = pyfits.PrimaryHDU(outimage)<br />
<br />
which encapsulates an image in an "HDU" object. The line<br />
<br />
outhdulist = pyfits.HDUList([outhdu])<br />
<br />
creates a list that contains the primary HDU which will have a default header<br />
<br />
outhdr = outhdulist[0].header<br />
<br />
Now you can append to this header or modify it<br />
<br />
history = 'This is what I did to the file.'<br />
outhdr.append(('HISTORY',history))<br />
more_history = 'I did this today.'<br />
outhdr.append(('HISTORY',more_history))<br />
<br />
since the header object behaves much like a Python dictionary, with append() adding new cards at the end.<br />
<br />
<br />
For astronomical images, headers may also include information that maps the image to the sky, a World Coordinate System or WCS header within the FITS header. These keywords apply only if the spatial mapping of the image is unchanged, and in processing that shifts or distorts images the WCS header should not be copied. Setting the WCS header and interpreting it to obtain celestial coordinates requires -- what else -- PyWCS, also developed at the Space Telescope Science Institute and now included in AstroPy. <br />
<br />
For example, to access the world coordinate system in a fits file, we would import the module<br />
<br />
from astropy.wcs import WCS<br />
<br />
which provides the same namespace as the original PyWCS and the library functions for converting between celestial coordinates and pixel coordinates in the image. There are many examples of this in a package of utilities we have developed here<br />
<br />
[http://www.astro.louisville.edu/software/alsvid/index.html Alsvid] Algorithms for Visualization and Processing of Image Data<br />
<br />
<br />
== Other Processing ==<br />
<br />
We have seen that there are many useful basic operations for image processing available simply through NumPy and PyFITS. SciPy adds several others in the [http://docs.scipy.org/doc/scipy/reference/ndimage.html ndimage] package. The functions include image convolution, various averaging or filtering algorithms, Fourier processing, image interpolation, and image rotation.<br />
<br />
<br />
Since images are stored as arrays, there are some simple one-line ways to modify them. Flipping an image top to bottom or left to right is done with<br />
<br />
import numpy as np<br />
np.flipud(image_data)<br />
np.fliplr(image_data)<br />
<br />
While we also have to add lines to read the file, update the header, and write it out again, the program to perform these operations is remarkably short. A template program is given in the examples below.<br />
<br />
Rotating it by 90 degrees is also easy<br />
<br />
np.rot90(image_data)<br />
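On a tiny array these operations are easy to verify; note that np.rot90 rotates counterclockwise by default:<br />

```python
import numpy as np

image_data = np.array([[1, 2],
                       [3, 4]])

print(np.flipud(image_data))   # rows reversed:    [[3 4] [1 2]]
print(np.fliplr(image_data))   # columns reversed: [[2 1] [4 3]]
print(np.rot90(image_data))    # counterclockwise: [[2 4] [1 3]]
```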
<br />
Rotated and flipped images may be saved either as a FITS image (using PyFITS) or as png or jpg images using <br />
<br />
from scipy.misc import imsave<br />
imsave(filename,rotated_image_data)<br />
<br />
Image rotation by other angles is somewhat more complex since it requires transforming an image in a way that does not preserve its shape, or even the number of elements. <br />
<br />
There is a simple one-line command that does all this for you<br />
<br />
from scipy.misc import imrotate<br />
angle = 22.5<br />
new_image = imrotate(image_data, angle, interp='bilinear')<br />
<br />
The angle is in degrees. The ''interp'' string determines how the interpolation will be done, with self-explanatory options<br />
<br />
*'nearest' <br />
*'bilinear' <br />
*'cubic' <br />
*'bicubic'<br />
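Note that scipy.misc.imrotate was removed from SciPy in version 1.2; in current releases scipy.ndimage.rotate does the same job. A sketch, assuming SciPy is installed:<br />

```python
import numpy as np
from scipy import ndimage

image_data = np.zeros((10, 20))
rotated = ndimage.rotate(image_data, 22.5)   # angle in degrees

# with the default reshape=True the output array grows to hold the rotated frame
print(image_data.shape, rotated.shape)
```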
<br />
Finally, for spectroscopy the useful data in an image may be a sum over a region. In that case we could index through the array and explicitly create the sum if needed, but in the ideal case of a spectrum we may want only the sum along a column for each element of a row. For that, the function is<br />
<br />
spectrum = np.sum(image, axis=0)<br />
<br />
which returns a numpy array that is the sum along the specified axis (here axis 0, summing down each column). If there are regions in the image that should not be included in the sum, then the image could be masked before computing the sum. The result is a 1-d array in which each element is the signal at a different wavelength. <br />
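A synthetic example with the dispersion along the rows shows the collapse; each column of the illustrative image is one wavelength bin:<br />

```python
import numpy as np

# 3 rows x 4 columns: summing down each column gives a 4-element spectrum
image = np.array([[1, 2, 3, 4],
                  [1, 2, 3, 4],
                  [1, 2, 3, 4]])

spectrum = np.sum(image, axis=0)
print(spectrum)         # [ 3  6  9 12]
print(spectrum.shape)   # (4,)
```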
<br />
Sometimes the spectral lines are not along a column but are still straight (if curved, then we need other ways to mask the array), in which case scipy.misc.imrotate can be used before the sum to align the spectrum with a row or column.<br />
<br />
<br />
<br />
<br />
== AstroImageJ and Alsvid ==<br />
<br />
We have developed a collection of Python routines to do many of the routine astronomical image processing tasks such as dark subtraction, flat fielding, co-addition, and FITS header management through PyFITS and PyWCS. The current version of the "Alsvid" package is available for download:<br />
<br />
<br />
[http://www.astro.louisville.edu/software/alsvid/index.html Alsvid]<br />
<br />
<br />
<br />
== Examples ==<br />
<br />
For examples of Python illustrating image processing, see the [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_examples examples] section.<br />
<br />
<br />
<br />
<br />
Alsvid is intended as a command line supplement to the powerful Java program [http://www.astro.louisville.edu/software/astroimagej/index.html AstroImageJ] which provides real-time interactivity with astronomical image processing and precision photometry. AstroImageJ is built on the original [https://imagej.nih.gov/ij/ ImageJ], an image processing program developed at the National Institutes of Health and now maintained as a public-domain open-source resource. As such, this core component of AIJ offers many specialized tools for image analysis in the biological sciences which are equally useful in physics and astronomy. AstroImageJ alongside versatile Python desktop processing is a powerful combination for astronomical image analysis.<br />
<br />
<br />
== Assignments ==<br />
<br />
For the assigned homework to use these ideas, see the [http://prancer.physics.louisville.edu/astrowiki/index.php/Python_assignments assignments] section.</div>WikiSysop