
User Feedback


Semesterly Short Surveys

The Gemini Observatory maintains a direct dialog with its users by sending out routine Short Surveys (2–3 questions) at every critical phase of Gemini’s user programs (see the Table). The effort has several key objectives: 1) monitor the usefulness and usability of our software tools and documentation; 2) determine how well the observations went; and 3) assess how satisfied the Principal Investigator (PI) is with the data. Another objective is to identify actionable items that can improve the whole observing process at Gemini. In brief, the Short Surveys give us a direct way to listen to what matters most to our user community.

Phase      Phase I         Phase II          End of Semester   Phase III
Semester   A       B       A      B          A       B         A       B
Month      October April   March  September  August  February  July    January

As the name Short Surveys indicates, the surveys are designed to be short; they should take only a few minutes to complete. Still, for users who want lengthier communication with Gemini staff about the topics covered, the surveys always include one question offering a text box with no length limit. Gemini staff read all of these comments, and anonymity is maintained.

Here we present the main feedback received and the follow-up actions it triggered:

Phase I

The questions

  1. Please rate your satisfaction with the proposal preparation process (PIT, ITC, website, documentation, etc.). 1=I hated it; 5=I like it very much
  2. Is there anything you wished would be different?

The most frequent actionable comments

  1. The greatest frustration concerns the “Targets” section in the PIT and its interaction with the “Observations” and “Band 3” sections. In particular,
    1. Some users do not realize they can enter a target list by importing a file, and many who do know about the import feature still cannot make it work properly. Some would also like the target definition to behave differently.
    2. Users are often confused about how to copy and paste, organize, and configure targets, and the corresponding resources and weather components, under the “Observations” section.
  2. Many proposers are dissatisfied with the way the total requested time is calculated.
  3. One recurrent request is a notification email, or some other form of confirmation, when a proposal is submitted.

Follow-up actions

The way targets and time requests will be entered in the future proposal tool (see more on the operations development page) will be much more intuitive and straightforward. In the meantime, we have clarified how these functions work in the current PIT (see the PIT description for target selection and time requests) and in the video tutorial.

Automated notifications after proposal submission cannot be added to the current PIT, but that capability is planned for the future proposal tool.

Phase II

The questions

  1. Please rate your satisfaction with the Phase II preparation process (OT, website, documentation, support from NGO or Gemini etc.). 1=I hated it; 5=I like it very much
  2. Is there anything you wished would be different?

The most frequent actionable comments

  1. Even though the OT works and performs quite well, its logic is very hard to figure out. Consequently, when PIs need to change anything (e.g., sequence steps, instrument set-up), they can rarely do so successfully without help from support staff.
  2. The OT lacks documentation. The videos help a little, but they do not cover enough topics to fill in the details missing from the Gemini website. Users’ suggestions for improving the documentation include reorganizing the Gemini webpages or creating new, complete PDF documents.

Follow-up actions

A new tool to replace the OT is in the making (see more on the operations development page). Its functions have been developed based on interviews with staff and users, mock-up scenarios, and usability testing. The goal is to make Phase II preparation a much more intuitive procedure.

The website has recently changed to allow for the type of content reorganization that users requested. We hope to complete a first phase of this work by semester 2020B for the OT pages and the instrument pages.

End of Semester

The questions

  1. Do you feel your program was treated fairly?
  2. How did the data quality meet your expectations (delivered vs. predicted S/N, calibrations, cosmetics, AO performance, etc.)?

The most frequent actionable comments

Many PIs express disappointment with the completion rate of their program, as most of those programs were 50% complete, or less, at the time of the survey. The programs led by these unfortunate PIs were typically:

  1. Band 2 programs
  2. GPI programs
  3. Block scheduled programs (GSAOI, GRACES)
  4. Band 1 programs that did not make much progress during the semester (most of which were completed after the survey was answered, thanks to their rollover status or Band 1 persistence!)
  5. Programs with missed timing windows

Follow-up actions

The progress of a given program is affected by many factors. One web page on this site presents statistics that are useful for understanding what to expect as the PI of a Gemini program. Those include, among other things:

  1. Weather loss and delivered science nights
  2. Completion rates for queue programs
  3. Completion expectations for queue programs and the effect of program length

For more information about what to expect, please read the corresponding section on the Start Here page.

Phase III

The questions

  1. Would you apply for Gemini time again?
  2. Check any of the boxes that correspond to a challenge you have or may soon face before you can publish your data (check all that apply):
    1. Issues to access the data
    2. Unsatisfactory data quality
    3. Unsatisfactory AO correction
    4. Too few data
    5. Issues during data reduction
    6. Insufficient support
    7. Not worth publishing (i.e., non-detection)
    8. Decreased priority of the project
    9. Lack of resources
    10. None! It is all right!
  3. Do you use Gemini Data Reduction tools?

The results

Comments in the Phase III survey mostly repeat what was covered in the End of Semester survey. On the other hand, the answers to question 2 help define the best strategy for improving the publication rate of Gemini data.

Compiling all the answers between semesters 2016B and 2018B (total response rate of 35%) gives the following:

The percentage is based on the number of responses within a given Band. Participants could select more than one option, so the sum within a Band can exceed 100%. The labels are:

GOA = Issues to access the data
DQ = Unsatisfactory data quality
AO = Unsatisfactory AO correction
nb obs = Too few data
DR = Issues during data reduction
DR (Gem) = DR, but excluding those who used their own DR tools
support = Insufficient support
not worth = Not worth publishing (i.e., non-detection)
priority = Decreased priority of the project
resources = Lack of resources
None! = None! It is all right!
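As a sketch of how such percentages can be computed, the snippet below normalizes multi-select answers by the number of respondents in each Band (all data and names here are illustrative, not the actual survey results):

```python
from collections import Counter

# Hypothetical multi-select survey responses, grouped by queue Band.
# Each respondent may tick several challenge labels, which is why the
# per-Band percentages can sum to more than 100%.
responses = {
    "Band 1": [{"DR", "support"}, {"None!"}, {"DR"}],
    "Band 2": [{"DQ", "DR"}, {"DR"}],
}

def percentages_by_band(responses):
    """Percentage of respondents in each Band that selected each label."""
    result = {}
    for band, answers in responses.items():
        counts = Counter(label for answer in answers for label in answer)
        n = len(answers)  # normalize by respondents in the Band, not by ticks
        result[band] = {label: 100 * c / n for label, c in counts.items()}
    return result

print(percentages_by_band(responses))
```

With the toy data above, "DR" is selected by 2 of 3 Band 1 respondents (~67%) and by both Band 2 respondents (100%), and the Band 2 percentages sum to more than 100% because of multi-selection.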

The most significant improvement that could be made concerns data reduction (~25% of responses). Roughly 50% of those who answered DR did not select any other problem, so fixing DR alone could potentially increase the publication rate by ~10-15% (~25% × ~50% ≈ 12%).
Users report that imaging data reduced with DRAGONS is much easier and more straightforward than with the previous Gemini IRAF package. In the near future, DRAGONS will also cover long-slit spectroscopy. The end goal is for DRAGONS to handle the reduction of all data from Gemini facility instruments.

Long Form Survey

Unfortunately, the long form survey is not ready yet. Thank you for considering sharing your feedback with us. We hope to have it ready soon.