The Honourable Bruce Cohen and his staff face a vast array of submitted data, plus many other potential sources of related public data that are referenced but not submitted. From all of this they must try to determine, in an unbiased manner, the relevant patterns that can guide the Commission's deliberations and evidence-based recommendations.
I strongly suspect that some key relationships between data variables have been overlooked. One way this can arise is through the simplicity of the models DFO uses to manage future salmon runs. These models make many simplifying assumptions in order to remain tractable, which in turn yields inaccurate and less-than-useful predictions of future salmon runs.
The Cohen Commission must direct DFO and others to improve their prediction performance and to adopt new statistical tools such as "Maximal Information-based Nonparametric Exploration" (MINE), which was collaboratively developed and tested at MIT, Harvard and Oxford. MINE can serve as a hypothesis generator, exploring new ideas and connections that no one thought to look for before. It searches the data for clear structure and attempts to find all of it, making it useful for identifying and characterizing relationships between variables.
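To illustrate the kind of analysis involved: MINE's core statistic (the maximal information coefficient, MIC) builds on mutual information, which scores how much knowing one variable tells you about another, regardless of whether the relationship is linear. The toy Python sketch below is not the MINE algorithm itself (MIC additionally normalizes mutual information over many grid resolutions); it is only a minimal, self-contained illustration of the underlying idea, using made-up data in place of real salmon-run variables.

```python
import math
import random

def mutual_information(xs, ys, bins=5):
    """Estimate mutual information (in bits) between two samples
    by discretizing each variable into equal-width bins."""
    n = len(xs)
    def bin_index(v, lo, hi):
        # Map a value into one of `bins` equal-width bins on [lo, hi].
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    lox, hix = min(xs), max(xs)
    loy, hiy = min(ys), max(ys)
    joint = {}              # counts for each (x-bin, y-bin) cell
    px = [0] * bins         # marginal counts for x
    py = [0] * bins         # marginal counts for y
    for x, y in zip(xs, ys):
        i, j = bin_index(x, lox, hix), bin_index(y, loy, hiy)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] += 1
        py[j] += 1
    mi = 0.0
    for (i, j), c in joint.items():
        pxy = c / n
        mi += pxy * math.log2(pxy / ((px[i] / n) * (py[j] / n)))
    return mi

random.seed(1)
xs = [random.uniform(-1, 1) for _ in range(2000)]
related = [x * x + random.gauss(0, 0.05) for x in xs]  # nonlinear link
unrelated = [random.gauss(0, 1) for _ in xs]           # no link at all
print(mutual_information(xs, related))    # clearly above zero
print(mutual_information(xs, unrelated))  # near zero
```

A correlation coefficient would score the parabolic relationship above near zero, because it is nonlinear; a mutual-information-based statistic flags it anyway. That is precisely the property that makes tools like MINE useful as hypothesis generators for exploring large, heterogeneous data sets.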
Yours hoping for a maximally informed Cohen Commission,
Jim Ronback, P.Eng. (Retired)
1530 Kirkwood Road
Delta BC V4L 1G1
Tool detects patterns hidden in vast data sets, December 15, 2011
Detecting Novel Associations in Large Data Sets, David N. Reshef et al., Science, 16 December 2011, Vol. 334, No. 6062, pp. 1518-1524.
Overview of Fraser River Sockeye Salmon Harvest Management, Nov 9, 2010