Software:Collective Knowledge

From HandWiki
Short description: Open-source framework for researchers
Collective Knowledge (CK)
Developer(s): Grigori Fursin and the cTuning foundation
Initial release: 2015
Stable release: 2.6.3 / November 30, 2022 (discontinued in favor of the new Collective Mind framework[1])
Written in: Python
Operating system: Linux, Mac OS X, Microsoft Windows, Android
Type: Knowledge management, FAIR data, MLOps, Data management, Artifact Evaluation, Package management system, Scientific workflow system, DevOps, Continuous integration, Reproducibility
License: Apache License 2.0 (for CK version 2.0); BSD 3-Clause License (for CK version 1.0)
Website: github.com/ctuning/ck, cknow.io

The Collective Knowledge (CK) project is an open-source framework and repository to enable collaborative, reproducible and sustainable research and development of complex computational systems.[2] CK is a small, portable, customizable and decentralized infrastructure that helps researchers and practitioners organize their projects as a database of reusable components and portable workflows with common APIs.[2]

Notable usages

Portable package manager for portable workflows

CK includes an integrated cross-platform package manager that uses Python scripts, a JSON API and JSON meta-descriptions to automatically rebuild, on a user's machine, the software environment required to run a given research workflow.[19]
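The idea of driving environment setup from JSON meta-descriptions can be sketched as follows. This is an illustrative example only, not CK's actual code: the package names, the `deps`/`env` fields and the helper functions are hypothetical, chosen to show how declarative metadata can yield a dependency-ordered install plan and a merged environment.

```python
import json

# Hypothetical JSON meta-descriptions, one per package, in the spirit of
# CK's per-package metadata (field names here are invented for illustration).
META = {
    "lib-openblas":  {"deps": [], "env": {"CK_OPENBLAS": "/usr/local/openblas"}},
    "tool-compiler": {"deps": [], "env": {"CK_CC": "gcc"}},
    "bench-matmul":  {"deps": ["lib-openblas", "tool-compiler"], "env": {}},
}

def resolve(pkg, seen=None):
    """Return packages in install order: every dependency precedes its user."""
    if seen is None:
        seen = []
    for dep in META[pkg]["deps"]:
        resolve(dep, seen)
    if pkg not in seen:
        seen.append(pkg)
    return seen

def build_env(pkg):
    """Merge the 'env' sections of a package and all of its dependencies."""
    env = {}
    for p in resolve(pkg):
        env.update(META[p]["env"])
    return env

order = resolve("bench-matmul")
print(order)  # dependencies first, then the workflow package itself
print(json.dumps(build_env("bench-matmul"), indent=2))
```

Because the metadata is plain JSON, the same description can be shared via GitHub and re-resolved on any machine, which is the property the CK package manager exploits.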

Reproducibility of experiments

CK enables reproducibility of experimental results through community involvement, similar to collaborative validation in Wikipedia and in physics. Whenever a new workflow with all of its components is shared via GitHub, anyone can try it on a different machine, in a different environment, and with slightly different choices (compilers, libraries, data sets). Whenever unexpected or wrong behavior is encountered, the community explains it, fixes the components and shares them back, as described in.[4]
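The community-validation loop above can be sketched in a few lines. This is a minimal illustration under assumed conventions, not CK's actual result format: the result dictionaries, field names and tolerance are hypothetical, showing only how a shared reference result can be compared against a local rerun to flag unexpected behavior for investigation.

```python
# Hypothetical result records: one shared by the original authors,
# one produced by a community member rerunning the workflow elsewhere.
reference = {"workflow": "image-classification", "accuracy": 0.761,
             "platform": "linux-x86_64", "compiler": "gcc-9"}
local_run = {"workflow": "image-classification", "accuracy": 0.742,
             "platform": "linux-aarch64", "compiler": "clang-12"}

def check_reproduction(ref, new, tolerance=0.01):
    """Return a list of metrics that differ beyond the given tolerance."""
    issues = []
    if abs(ref["accuracy"] - new["accuracy"]) > tolerance:
        issues.append(
            f"accuracy differs: {ref['accuracy']} vs {new['accuracy']} "
            f"({ref['compiler']}/{ref['platform']} vs "
            f"{new['compiler']}/{new['platform']})"
        )
    return issues

for issue in check_reproduction(reference, local_run):
    print("unexpected behavior:", issue)
```

A flagged mismatch, together with the recorded platform and compiler, gives the community a concrete starting point for explaining the divergence and fixing the shared components.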

References

  1. CK package at PyPI, https://pypi.org/project/ck 
  2. Fursin, Grigori (October 2020). "Collective Knowledge: organizing research projects as a database of reusable components and portable workflows with common APIs". Philosophical Transactions of the Royal Society. 
  3. reusable CK components and actions to automate common research tasks
  4. Grigori Fursin, Anton Lokhmotov, Dmitry Savenko, Eben Upton. A Collective Knowledge workflow for collaborative research into multi-objective autotuning and machine learning techniques, arXiv:1801.08024, January 2018 (arXiv link, interactive report with reproducible experiments)
  5. Online repository with reproduced results, https://cKnowledge.io 
  6. Index of reproduced papers, https://cKnowledge.io/reproduced-papers 
  7. HiPEAC info (page 17), January 2016, https://www.hipeac.net/assets/public/publications/newsletter/hipeacinfo45.pdf 
  8. Ed Plowman; Grigori Fursin, ARM TechCon'16 presentation "Know Your Workloads: Design more efficient systems!", https://github.com/ctuning/ck/wiki/Demo-ARM-TechCon'16 
  9. Reproducibility of Results in the ACM Digital Library, https://dl.acm.org/reproducibility.cfm 
  10. ACM TechTalk about reproducing 150 research papers and testing them in the real world, https://learning.acm.org/techtalks/reproducibility 
  11. Artifact Evaluation for systems and machine learning conferences, http://cTuning.org/ae 
  12. EU TETRACOM project to combine CK and CLSmith, http://es.iet.unipi.it/tetracom/content/uploads/Posters/TTP35.pdf, retrieved 2016-09-15 
  13. Artifact Evaluation Reproduction for "Software Prefetching for Indirect Memory Accesses", CGO 2017, using CK, 16 October 2022, https://github.com/SamAinsworth/reproduce-cgo2017-paper 
  14. GitHub development website for CK-powered Caffe, 11 October 2022, http://github.com/dividiti/ck-caffe 
  15. Open-source Android application to let the community participate in collaborative benchmarking and optimization of various DNN libraries and models, http://cknowledge.org/android-apps.html 
  16. Reproducing Quantum results from Nature – how hard could it be?, https://www.linkedin.com/pulse/reproducing-quantum-results-from-nature-how-hard-could-lickorish 
  17. MLPerf crowd-benchmarking, https://cknowledge.io/c/solution/demo-obj-detection-coco-tf-cpu-benchmark-linux-portable-workflows 
  18. MLPerf inference benchmark automation guide, 17 October 2022, https://github.com/ctuning/ck/blob/master/docs/mlperf-automation/README.md 
  19. List of shared CK packages, https://cKnowledge.io/packages 

External links

  • Development site: [1]
  • Documentation: [2]
  • Public repository with crowdsourced experiments: [3]
  • International Workshop on Adaptive Self-tuning Computing Systems (ADAPT) uses CK to enable public reviewing of publications and artifacts via Reddit: [4]