Quality Issues for Research in Circuits and Systems
Have you ever read or reviewed a paper in which essential data were missing, so that you could not fully understand or reproduce the results? Were you ever frustrated that the authors of a paper did not test and evaluate their design on more than one case, or reported only the cases where it worked well?
As the EXCOM and BOG of the CAS Society, we feel that we have a responsibility to establish good practices and guidelines that guarantee the maximal quality and value of our publications. I would therefore like to discuss with you some initiatives on this issue and ask for your cooperation.
Let us first discuss initiatives that can improve the quality and value of publications. Modern web technology provides a channel through which authors can convey more information to readers and reviewers than the paper itself can contain. This can typically include additional data, signals, measurements, test cases, tables, computer code, and experimental configurations. Such material is often considered too routine or too detailed for the main text or an appendix, and is frequently omitted because of page limitations. However, this information is often essential for reproducing the findings of the paper. For papers describing and studying new theory, algorithms, and computer programs, such additional material allows for complete reproducibility of the findings. This practice is usually called "reproducible research." In fact, a number of research groups [1, 2] at Stanford and EPFL Lausanne are already implementing it systematically in their work. Our sister society, the IEEE Signal Processing Society, has also taken a number of actions and initiatives to promote the idea and to study its implementation, including a special session at ICASSP 2007 [3].

For experimental setups and hardware implementations of designs, however, this additional information is not sufficient for complete verification. Here a special session with hardware implementations, like the one at ISCAS 2007, is very valuable. For such physical and experimental setups one cannot speak of reproducibility, but rather of a weaker form: reusability. There are also other mechanisms for increasing the value of publications for readers, namely common targets for the design of algorithms, hardware, or systems. One can distinguish here benchmark design problems (or challenges) and competitions. Competitions differ from benchmarks in that they have a common submission deadline, which makes it possible to keep some information or results unknown to the participants until that deadline.
When the submissions are then revealed at the workshop or conference, they can be compared with one another, and also with the optimal solution if one exists. Of course, one should also agree in advance on how the quality of the submitted designs will be measured. Both benchmark problems and competitions should be based on a commonly accepted target that the scientific community considers meaningful and valuable. Typically, such benchmarks or competitions can be set up by our Technical Committees, and they are certainly encouraged for ISCAS 2008. CEDA has already organized programming challenges [4].
Now let us discuss the actions we can take on these mechanisms. First of all, it is certainly not our intention to make reproducible research mandatory, or to force all research papers to address benchmark problems or competitions. There are several reasons for this. First, we need a common agreement on the value of these targets and mechanisms, and such confidence takes time to grow in our scientific community. Second, creativity in setting up new design problems should be stimulated, and design problems can only be established once a problem is widely accepted. Third, some industrial researchers may face intellectual property constraints that prevent them from publishing the additional material required for reproducible research. It is certainly not our intention to discourage industrial researchers from publishing in the CAS Society journals; quite the contrary.
As concrete actions for our Society members I propose:
- reflect on and discuss these instruments in your research environment, and let us know your ideas, thoughts, and concerns;
- make proposals for special sessions at ISCAS on the ideas of reproducible research, benchmark designs, and competitions;
- make proposals for live demo sessions at ISCAS.
I would appreciate receiving your comments by October 1, 2007, so that we can discuss them at our November BOG meeting and already have interesting activities in place for the participants of ISCAS 2008.
References:
[3] J. Kovacevic, "How to Encourage and Publish Reproducible Research," in Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 4, pp. 1273-1276, 2007.
Joos Vandewalle, VP Technical Activities, IEEE CAS Society (Email: Joos.Vandewalle@esat.kuleuven.be)