2020 overview by QED Software Team
The past year was unusual for many reasons. What were the most innovative achievements in AI, machine learning, and Big Data in 2020? What will 2021 bring? These questions are answered by QED Software specialists: Piotr Biczyk, Daniel Kałuża, Alina Powała and Maciej Świechowski.

Achievements of 2020 in machine learning and artificial intelligence

MACIEJ ŚWIECHOWSKI: Although a lot happened in 2020, there is no clear winner in my personal ranking of the year's most significant breakthroughs. Of course, the data-analysis efforts around SARS-CoV-2 cannot be overlooked, but outside the pandemic I would like to highlight two topics. The first is advances in image processing and computer vision: “Deep Residual Learning for Image Recognition” was the most-cited computer science paper of 2020. The results obtained by NVIDIA in the field of GANs (Generative Adversarial Networks) also deserve attention:…
QED Software on IEEE BigData 2020
IEEE BigData coming soon

This year, IEEE BigData 2020 is taking place online from Atlanta (USA). Virtual, but organized with the same commitment, the conference will be held in mid-December. The IEEE International Conference on Big Data 2020 provides a leading forum for disseminating the latest research in Big Data. IEEE Big Data brings together leading researchers and developers from academia, research institutes, and industry from all over the world to facilitate innovation, knowledge transfer, and technical progress in addressing the five V’s of Big Data: Velocity, Volume, Variety, Value, and Veracity. The purpose of the conference is to identify the deep technical and scientific nature of big data problems and to share future directions for developing next-generation solutions for data-driven decision making. The conference will attract high-quality theoretical and applied research findings in big data science and foundations, big data infrastructure, big data management, big data search…
The current fascination with artificial intelligence is not new. In the 60 years that have elapsed since the creation of the first machine to recreate the human thinking process, there have been periods of growing and declining enthusiasm for this technology. The two “AI winters” that have occurred since then proved an important lesson for the worlds of science and business alike. Let me start with a short disclaimer: I am aware that the term artificial intelligence is a huge oversimplification. For the purposes of this article, I use it as a convenient umbrella term covering all the techniques that reproduce the way people reason and make decisions.

The challenges of 60 years ago and the first AI winter

The term artificial intelligence was first used in 1956. Three years later, attempts were made to create a machine that would at least partially…