SciPy 2018 Chair and Reviewer Guidelines
The Program Committee’s job is to select the talks that maximize the benefit to the entire conference audience, within the space and time constraints given.
SciPy 2018 will implement a double-open review: authors and reviewers are known to each other. Area chairs will make acceptance decisions, informed by reviewer comments and conference schedule limits. The intention is to offer transparency in the process, building trust in the community; there is also research attesting to higher quality in signed reviews compared to anonymous reviews [1], [2], [3]. At the same time, research shows that peer review can suffer from implicit bias, just as hiring processes do, that disadvantages women, minorities, and people from less prestigious institutions. For this reason, we want to educate our reviewers about bias in the review process and equip them with tools to interrupt it.

Bias in the review process
Human beings are consistently, routinely, and profoundly biased. Not only are we profoundly biased, but we almost never know when we are being biased. (Interested? See [4], [5].)
First, by realizing and accepting that we all have bias, we can learn to watch for it in ourselves and help others who work with us to do the same. Building this awareness is analogous to stepping on the clutch in a manual-transmission automobile: the motor doesn't stop running (bias doesn't stop), but the car is no longer moving forward. When we are on the lookout for our biases, they are less likely to blindly dictate our decisions. [6]

Be on the lookout! As you review, ask yourself:
- Are there things about the proposal that particularly influence my impression? Are they relevant to the talk?
- What assessments have I already made about the speaker? Are these grounded in solid information, or are they simply my interpretations?
- Does this person's work remind me in any way of myself?
- Is my own agenda influencing my assessment of this proposal?
- Are there past experiences of mine that are influencing factors?

Things to think about as you prepare to review:
- Bias in pattern-recognition responses: does one person benefit because they do things "the way we do it around here," rather than because their way is the most innovative, productive, and effective?
- Pay attention to your projections about the work being evaluated.
- Look for patterns of assessment among different groups. For example, are women in general rated differently than men?
- Look out for semantic gender priming: exposure to words more strongly associated with male (e.g., aggressive, competitive) or female (e.g., supportive, nurturing) stereotypes affects subsequent evaluation of male or female targets. [7]
Tips for the lead chairs:
- Check established metrics to balance "gut" reactions. Be sure that what gets measured is indicative of performance.
- Review the conversation about unconscious bias before the review process begins.
- Use decision aids prior to review sessions to identify and navigate biases related to work styles, interpersonal traits, personal relationships, assumptions about feelings, lifestyle, teamwork, or personal goals, and to recognize who has influenced your interpretation.

Tips for the reviewer:
- Interrupt your bias: after each section, remind yourself of the rubric and metrics you've established, and make sure you are reviewing the submission accordingly.
- Be polite. Instead of telling, suggest new approaches by asking questions. Don't be rude.
- Always say at least one good thing.
- Be grateful. Point out what you learned and say thanks.
- Stay fresh while reading. Long review sessions can cause fatigue and leave some spots uncovered.
Other interesting links on gender bias:
- Threats to objectivity in peer review: the case of gender
- A Linguistic Comparison of Letters of Recommendation for Male and Female Chemistry and Biochemistry Job Applicants
- Science faculty's subtle gender biases favor male students
- How stereotypes impair women's careers in science

You should analyze the proposal in the aspects described below (thanks to Doug Hellmann for these tips):

The Abstract: Occasionally a title comes along that is so compelling that I have to remind myself to keep reading before voting +1 and moving on to the next talk. It isn't enough for a proposal to cover an interesting topic; it has to indicate that the talk will be interesting, too. While I am reading, I look for several factors. Is the abstract clear? The speaker should describe the topic they plan to talk about in terms I can understand, even if I don't know anything about that subject area. A clearly written abstract, without a lot of domain-specific jargon, tells me the speaker will be able to communicate with the audience.
How relevant, immediately useful, and novel is the topic? I look first at whether the topic is relevant to the conference attendees. Attendees will have a range of experience levels, interests, and backgrounds. Although we want a broad set of topics, we do need to be careful to avoid talks that are too narrowly focused on a niche. I often recommend that new projects which show a lot of promise convert their talk proposal into a poster proposal. As a counterpoint to considering whether a topic is too niche, I also try to take into account whether the audience will take away something immediately useful.

Prepared by the SciPy 2017 Diversity Committee and Program Chairs. Thanks to Philip B. Stark for useful links. This document is a compilation from different sources listed below:
- https://doughellmann.com/blog/2011/10/18/how-i-review-a-pycon-talk-proposal/
- https://nearsoft.com/blog/europython-2016-a-review/
- Everyday Bias: Identifying and Navigating Unconscious Judgments in Our Daily Lives. Howard J. Ross.
- https://hbr.org/2015/04/3-ways-to-make-less-biased-decisions
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4238945/

[1] http://bmjopen.bmj.com/content/5/9/e008707
[2] http://bjp.rcpsych.org/content/176/1/47/full-text.pdf+html
[3] https://www.ncbi.nlm.nih.gov/pubmed/28580134/
[4] https://www.amazon.com/Everyday-Bias-Identifying-Navigating-Unconscious/dp/1442230835
[5] https://implicit.harvard.edu/implicit/takeatest.html
[6] https://hbr.org/2015/04/3-ways-to-make-less-biased-decisions
[7] Threats to objectivity in peer review: the case of gender