This page provides details about the ESA 2018 Track B Experiment. ESA (European Symposium on Algorithms) is one of the premier conferences on algorithms. It has two tracks: Track A (Design and Analysis) and Track B (Engineering and Applications). The basic setup of the experiment is as follows: there will be two separate PCs for Track B, which will independently review all the submissions to Track B and independently decide on a set of papers to be accepted. After the PCs have done their work, the results will be compared both quantitatively (e.g., overlap in the sets of accepted papers) and qualitatively (e.g., typical reasons for differences in the decisions of the two PCs). The results of this comparison will be published. Depending on the outcome, the set of accepted papers for Track B will either be the union of the sets of accepted papers from the two independent PCs, or there will be a final round (outside of the experiment) discussing the submissions on which the two PCs reached different conclusions.
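The quantitative part of the comparison could, for instance, be computed as in the following sketch. This is illustrative only: the paper IDs are made up, and the actual metrics used for the published comparison may differ.

```python
# Illustrative sketch of the quantitative comparison between the two PCs.
# Paper IDs are hypothetical; the experiment may use additional metrics.
def overlap_stats(accepted_a, accepted_b):
    """Return the overlap size and Jaccard similarity of two accepted-paper sets."""
    a, b = set(accepted_a), set(accepted_b)
    common = a & b
    union = a | b
    # Jaccard similarity: |A ∩ B| / |A ∪ B| (1.0 for two empty sets).
    jaccard = len(common) / len(union) if union else 1.0
    return len(common), jaccard

# Example: the two PCs agree on two of their three accepted papers each.
common, jaccard = overlap_stats({"p1", "p2", "p3"}, {"p2", "p3", "p4"})
print(common, jaccard)  # 2 papers in common, Jaccard similarity 0.5
```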
= Selection of the two PCs =
Both PCs have 12 members. Both have the same PC Chair. The complete list of members can be found here: http://algo2018.hiit.fi/esa/#pcb . The PCs have been set up so as to have an identical distribution with respect to topic, age group, gender, and continent, in the following sense. The topics are only a rough categorization of what the respective PC members are working on (many work on more than one topic, and topics are not that clear-cut anyway).
|| Gender: || 8 x male, 4 x female ||
|| Age Group: || 2 x junior (PhD <= 5 years ago), 4 x relatively junior (PhD between 5 and 10 years ago), 6 x senior ||
|| Continent: || 8 x Europe, 4 x Americas (we tried Asia, but weren't successful, sorry for that) ||
|| Topic: || 1 x parallel algorithms (junior), 2 x string algorithms (one less senior, one more senior), 2 x computational geometry (one junior, one senior), 2 x operations research (one junior, one senior), 5 x algorithms in general (three junior, two senior) ||
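The "identical distribution" constraint above can be checked mechanically. The sketch below assumes each member is encoded as a tuple of attributes; the member records shown are made up for illustration (the real rosters are at http://algo2018.hiit.fi/esa/#pcb).

```python
from collections import Counter

def distributions_match(pc_a, pc_b):
    """True if both PCs have identical per-attribute counts (same marginals)."""
    # zip(*pc) turns a list of member tuples into one column per attribute;
    # two PCs match if every attribute column has the same multiset of values.
    return all(Counter(col_a) == Counter(col_b)
               for col_a, col_b in zip(zip(*pc_a), zip(*pc_b)))

# Hypothetical member records: (gender, age_group, continent, topic).
pc_a = [("m", "senior", "EU", "strings"), ("f", "junior", "AM", "geometry")]
pc_b = [("f", "junior", "AM", "strings"), ("m", "senior", "EU", "geometry")]
print(distributions_match(pc_a, pc_b))  # True: same marginals per attribute
```

Note that only the per-attribute marginals are compared, matching the constraint as stated; the joint distribution (e.g., which topics the junior members cover) may still differ.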
= Reviewing "Algorithm" =
The reviewing algorithm is essentially the same as in previous years. Because of the experiment and because it's a good idea anyway, we try to specify it beforehand. However, this is not a 100% complete and precise specification of the process. The goal is to be as specific as possible without making the description overly complicated or impractical. We will fill in the gaps and fix problems in a reasonable way as we go along, taking care that we treat both PCs equally. As far as the experiment is concerned, these conditions are not perfect, but they are reasonable given the complexity of the process and the agents involved.
The total time for the reviewing process (from the submission deadline to the author notification) is 8 weeks. The reviewing process proceeds in the following phases:
1. The deadline for submissions is April 22 AoE (strict)
2. Bidding and paper assignment: 1 week (~ April 23 - April 29)
3. Reviewing: 4 weeks (~ April 30 - May 27)
4. Discussion and recalibration of reviews: 2 weeks (~ May 28 - June 10)
5. Buffer for things going wrong or taking longer than expected: 1 week
6. The notification deadline is June 18 (maybe earlier)
The requirements for the reviews are described here: http://ad-wiki.informatik.uni-freiburg.de/research/ESA2018Experiment/Reviews
The discussion phase is described here: http://ad-wiki.informatik.uni-freiburg.de/research/ESA2018Experiment/Discussion