Teaching and learning computer programming is challenging for many first-year undergraduate computer science students. During introductory programming courses, novice programmers need to learn some basic algorithms, develop algorithmic thinking, improve their logical thinking and problem-solving skills, and learn data types, data structures, and the syntax of the chosen programming language. In the literature, we can find various methods of teaching programming that can motivate students and reduce their cognitive load while learning computer programming, e.g., using robotic kits, microcontrollers, microworld environments, virtual worlds, serious games, interactive animations, and visualizations. In this paper, we focus mainly on algorithm visualizations, especially on the different models of data structures that can be effectively used in educational visualizations. First, we show how a vector (one-dimensional array), a matrix (two-dimensional array), a singly linked list, and a graph can be represented by various models. Next, we demonstrate some examples of interactive educational algorithm animations for teaching and learning elementary algorithms and some sorting algorithms, e.g., swapping two variables, summing the elements of an array, mirroring an array, searching for the minimum or maximum of an array, searching for the index of the minimum or maximum of an array, and sorting the elements of an array using simple exchange sort, bubblesort, insertion sort, minsort, maxsort, quicksort, or mergesort. Finally, in the last part of the paper, we summarize our experiences in teaching algorithmization and computer programming using algorithm animations and visualizations and draw some conclusions.
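The elementary array algorithms listed above can each be expressed in a few lines of code. As an illustrative sketch (not taken from the animations themselves), the following Python functions show two of them: searching for the index of the minimum of an array, and bubblesort, whose core step is swapping two variables.

```python
def index_of_min(a):
    """Return the index of the smallest element (single linear scan)."""
    best = 0
    for i in range(1, len(a)):
        if a[i] < a[best]:
            best = i
    return best

def bubble_sort(a):
    """Sort a list in place by repeatedly swapping adjacent
    out-of-order pairs; after pass i, the last i elements are final."""
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # swap two variables
    return a
```

The same swap idiom underlies the other exchange-based sorts mentioned above; the animations visualize exactly these comparisons and swaps step by step.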
Acquiring algorithmic thinking is a long process with several steps. At the most basic level of algorithmic thinking, students recognize algorithms and the various problems that can be solved with algorithms. At the second level, students can execute given algorithms. At the third level, students can analyze algorithms, recognizing which steps are executed in sequences, conditions, or loops. At the fourth level, students can create their own algorithms. The last three levels of algorithmic thinking are: implementing algorithms in a programming language, modifying and improving algorithms, and creating complex algorithms. In preliminary research related to algorithmic thinking, we investigated how first-year undergraduate computer science students at J. Selye University solve problems associated with the second, third, and fourth levels of algorithmic thinking. We chose these levels because they do not require knowledge of any programming language. The tasks students had to solve included, for example: what will be the route of a robot when it executes the given instructions, and how many times we need to cross a river to carry everyone to the other river-bank. Solving these types of tasks requires only good algorithmic thinking. The results showed that students reached an 81.4% average score on tasks related to executing given algorithms, a 72.3% average score on tasks where they needed to analyze algorithms, and a 66.2% average score on tasks where they needed to create algorithms. The latter tasks were mostly various river-crossing problems. Even though students reached a 66.2% average score on these tasks, if we had accepted only solutions with optimal algorithms (a minimal number of river crossings), they would have reached only a 21.3% average score, which is very low.
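For river-crossing puzzles, an optimal algorithm (one with the minimal number of crossings) can be found mechanically by breadth-first search over the puzzle's states. As a hedged illustration, the sketch below solves the classic wolf-goat-cabbage puzzle, which stands in here for the river-crossing tasks used in our research; the state encoding and helper names are our own.

```python
from collections import deque

def min_crossings():
    """Breadth-first search over states of the wolf-goat-cabbage puzzle;
    returns the minimal number of river crossings."""
    items = {"wolf", "goat", "cabbage"}
    unsafe = [{"wolf", "goat"}, {"goat", "cabbage"}]  # pairs that must not be left alone

    def safe(bank):
        return not any(pair <= bank for pair in unsafe)

    start = (frozenset(items), "left")   # (items on the left bank, farmer's side)
    goal = (frozenset(), "right")
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        (left, farmer), moves = queue.popleft()
        if (left, farmer) == goal:
            return moves
        here = left if farmer == "left" else items - left
        for cargo in list(here) + [None]:          # carry one item, or cross alone
            new_left = set(left)
            if cargo is not None:
                (new_left.discard if farmer == "left" else new_left.add)(cargo)
            other = "right" if farmer == "left" else "left"
            # the bank the farmer leaves behind must stay safe
            unattended = new_left if other == "right" else items - new_left
            if not safe(unattended):
                continue
            state = (frozenset(new_left), other)
            if state not in seen:
                seen.add(state)
                queue.append((state, moves + 1))
    return None
```

Because BFS explores states level by level, the first time the goal state is reached is guaranteed to use the fewest crossings; for this puzzle the optimum is 7.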
To help students find optimal algorithms for river-crossing puzzles, we developed several interactive web-based animations. In the last part of this paper, we describe these animations and summarize how they were created and how they can be used in education. Finally, we conclude and briefly mention our plans for future research.
The memory card game is a game that almost everyone played in childhood. The game consists of n pairs of playing cards, where the two cards of each pair are identical. At the beginning of the game, the deck of cards is shuffled and laid face down. In every move of the game, the player flips over two cards. If the cards match, the pair is removed from the game; otherwise, the cards are flipped back over. The game ends when all pairs of cards have been found. The game can be played by one, two, or more players. First, this paper shows an optimal algorithm for solving the single-player memory card game. In the algorithm, we defined four steps in which the player needs to remember previously shown pairs of cards, which cards have already been shown, and the locations of the revealed cards. We marked the memories related to these steps as M1, M2, M3, and M4. Next, we ran simulations in which we varied the M1, M2, M3, and M4 memories from no user memory (where the player does not remember cards or pairs of cards at all) to a perfect user memory (where the player remembers every previously shown card or pair of cards). With every memory setting, we simulated 1000 gameplays. We recorded how many cards or pairs of cards the player would need to remember and how many moves were required to finish the game. Finally, we evaluated the recorded data, illustrated the results in graphs, and drew some conclusions.
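As a simplified sketch of such a simulation (the M1–M4 memory model itself is not reproduced here), the following Python function plays one single-player game under a perfect-memory strategy and counts the moves; repeating it 1000 times with different seeds mirrors the simulation setup described above.

```python
import random

def play_memory(n_pairs, rng):
    """One single-player memory game with perfect memory: every revealed
    card is remembered, and a remembered pair is always taken first.
    Returns the number of moves (two flips each) needed to clear the board."""
    deck = list(range(n_pairs)) * 2
    rng.shuffle(deck)
    remaining = set(range(len(deck)))   # face-down positions still in play
    known = {}                          # position -> remembered card value
    moves = 0
    while remaining:
        moves += 1
        # take a remembered pair if one exists
        seen = {}
        a = b = None
        for pos in sorted(known):
            if known[pos] in seen:
                a, b = seen[known[pos]], pos
                break
            seen[known[pos]] = pos
        if a is None:
            # flip an unknown card, then its partner if remembered,
            # otherwise another unknown card
            a = next(p for p in sorted(remaining) if p not in known)
            known[a] = deck[a]
            b = next((p for p in known if p != a and known[p] == deck[a]), None)
            if b is None:
                b = next(p for p in sorted(remaining) if p not in known)
                known[b] = deck[b]
        if deck[a] == deck[b]:
            remaining -= {a, b}
            del known[a], known[b]
    return moves
```

Averaging `play_memory` over many shuffles gives the kind of per-setting move counts reported in the paper; weaker memory settings would forget entries of `known` between moves.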
The support vector machine plays an important role in the evaluation of English teaching effectiveness, but its evaluation results suffer from inaccuracy. The traditional English teaching mode cannot address the accuracy and efficiency of evaluating the effectiveness of students' English teaching and cannot meet the requirements of English teaching effect evaluation. Therefore, this paper proposes a neural network algorithm to innovate and optimize the support vector machine analysis. First, the relevant theories are used to construct a multi-index English teaching effect evaluation system with teachers and students as the main subjects, and the indicators are divided according to the data requirements of the evaluation indices to reduce interfering factors in the support vector machine. Then, the neural network algorithm is used to find the optimal kernel function parameters and regularization parameters of the support vector machine, forming a support vector machine scheme, and a comprehensive analysis of the support vector machine results is carried out. MATLAB simulation shows that, under certain evaluation criteria, the evaluation accuracy of the English teaching effect using the neural network algorithm and the support vector machine is optimal, with a short evaluation time.
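The neural-network search for the SVM hyperparameters is not reproduced here; as a hypothetical, simplified stand-in, the sketch below runs a plain grid search over the regularization parameter C and the RBF kernel parameter gamma against a mock validation-accuracy function. In practice this score would come from cross-validating the SVM on the evaluation-index data; the peak location below is invented for illustration.

```python
import math

def validation_accuracy(C, gamma):
    """Hypothetical stand-in for cross-validated SVM accuracy.
    Illustrative only: peaks near C = 10, gamma = 0.1."""
    return math.exp(-((math.log10(C) - 1) ** 2 + (math.log10(gamma) + 1) ** 2))

def search_svm_params():
    """Grid search over regularization C and RBF kernel width gamma,
    keeping the parameter pair with the best validation score."""
    best = None
    for C in (0.1, 1, 10, 100):
        for gamma in (0.001, 0.01, 0.1, 1):
            score = validation_accuracy(C, gamma)
            if best is None or score > best[0]:
                best = (score, C, gamma)
    return best[1], best[2]
```

The neural-network approach in the paper replaces this exhaustive grid with a learned search, but the objective, maximizing evaluation accuracy over (kernel parameter, regularization parameter) pairs, is the same.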
Pneumonia is an acute pulmonary infection that can be caused by bacteria, viruses, or fungi. It infects the lungs, causing inflammation of the air sacs and pleural effusion, a condition in which the lung is filled with fluid. Diagnosing pneumonia is demanding, as it requires a review of chest X-rays (CXR) by specialists, laboratory tests, vital signs, and clinical history. CXR is an important pneumonia diagnostic method for evaluating the airways, pulmonary parenchyma, vessels, and chest walls, among others. It can also be used to show changes in the lungs caused by pneumonia. This study aims to employ transfer learning and an ensemble approach to help detect viral pneumonia in chest radiographs. The transfer learning models used were the Inception network, ResNet-50, and InceptionResNetV2. With the help of our research, we were able to show how well the ensemble technique, which uses InceptionResNetV2 together with the Non-Local Means denoising algorithm, works. By utilizing these techniques, we have significantly increased the accuracy of pneumonia classification, opening the door for better diagnostic abilities and patient care. For objective labeling, we obtained a selection of patient chest X-ray images. In this work, the models were assessed using state-of-the-art metrics such as accuracy, sensitivity, and specificity. From the statistical analysis and scikit-learn Python analysis, the accuracy of the ResNet-50 model was 84%, the accuracy of the Inception model was 91%, and the accuracy of the InceptionResNetV2 model was 96%.
Sentiment analysis and opinion mining is a branch of computer science that has seen considerable growth over the last decade. It deals with determining the emotions, opinions, and feelings of a person on a particular topic. Social media has become an outlet for people to voice their thoughts and opinions publicly about various topics of discussion, making it a great domain in which to apply sentiment analysis and opinion mining. Sentiment analysis and opinion mining employ Natural Language Processing (NLP) to reliably determine the mood of a person's opinion about a specific topic, or about a product in the case of an e-commerce domain. It is a process involving the automatic extraction of features from a person's expressed opinions about a service, operating on a series of different expressions for a given topic based on predefined features stored in a database of facts. In an e-commerce system, analyzing customers' opinions about products is vital for business growth and customer satisfaction. This research attempts to implement a model for sentiment analysis and opinion mining on Twitter feeds. In this paper, we address the issues of combining sentiment classification and domain-constraint analysis techniques for extracting public opinions from social media. The dataset employed in the paper was obtained from Twitter through the tweepy API. The TextBlob library was used to analyze the tweets and determine their sentiments. The results show that most tweets had positive subjectivity and polarity on the subject matter.
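TextBlob reports sentiment as a polarity score in [-1, 1] together with a subjectivity score. As a rough, self-contained stand-in for that computation (the tiny word list below is invented for illustration and is far smaller than TextBlob's lexicon), a lexicon-based scorer averages the polarity of the known words in a tweet:

```python
# Toy polarity lexicon (illustrative values, not TextBlob's).
POLARITY = {"great": 0.8, "good": 0.7, "love": 0.5,
            "bad": -0.7, "terrible": -1.0, "hate": -0.8}

def polarity(text):
    """Average the polarity of known words; 0.0 means neutral or unknown."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    scores = [POLARITY[w] for w in words if w in POLARITY]
    return sum(scores) / len(scores) if scores else 0.0

def label(text):
    """Map the polarity score to a sentiment class."""
    p = polarity(text)
    return "positive" if p > 0 else "negative" if p < 0 else "neutral"
```

Classifying each tweet this way and counting the labels gives the kind of positive/negative/neutral distribution reported in the results.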
Currently, the use of internet-connected applications for storage by different organizations has rapidly increased with the vast need to store data. Cybercrimes are also increasing and have affected large organizations, and even entire countries holding highly sensitive information, such as the United States of America, the United Kingdom, and Nigeria. Organizations generate a lot of information with the help of digitalization, and this highly classified information is now stored in databases accessed via computer networks, allowing for attacks by cybercriminals and state-sponsored agents. As a result, these organizations and countries spend more resources analyzing cybercrimes than preventing and detecting them. Network forensics plays an important role in investigating cybercrimes, because most cybercrimes are committed via computer networks. This paper proposes a new approach to analyzing digital evidence in Nigeria using a proactive method of forensics with the help of a deep learning algorithm, Convolutional Neural Networks (CNNs), to proactively classify malicious packets from genuine packets and log them as they occur.
Due to the rapid growth of science and technology, the IoT (Internet of Things) has become an emerging technology for connecting heterogeneous devices related to our daily needs, and it can affect our lives tremendously. It allows devices to be connected to each other and controlled or monitored through handheld devices. The IoT network is a heterogeneous network that links many small, hardware-constrained devices, where conventional security architectures and techniques cannot be used. Protecting the IoT network therefore involves a diverse range of specialized techniques and architectures. This paper focuses on the security requirements, the current state of the art, and future directions in the field of IoT.
We designed an Android-based mobile application to deal with Ischemic Heart Disease (IHD, heart attack) by coordinating clinical information taken from patients suffering from IHD. Clinical information from 787 patients was investigated and associated with risk factors such as hypertension, diabetes, dyslipidemia (abnormal cholesterol), smoking, family history, obesity, stress, and existing clinical symptoms that may suggest underlying unidentified IHD. The information was mined with data mining technology, and a score was produced. Patients are classified into low, medium, and high risk for IHD. On comparing and classifying the patients whose information was acquired for producing the score, we found a significant association with having a cardiac event when the low and high, and the medium and high, classes were compared (p=0.0001 and 0.0001, respectively). Our aim is to provide a simple approach to identifying IHD risk and to alert the population to get themselves assessed by a cardiologist to avoid sudden death. Currently available tools have several limitations that leave them underutilized by the population. Our research product may reduce these limitations and promote timely risk assessment.
Malware is one of the most dangerous security threats in today's world of fast-growing technology. It is now possible to remotely lock down a system's files for ransom even when the system is located overseas. This threat accelerated when the world was introduced to cryptocurrencies (e.g., Bitcoin), which allowed attackers to hide their tracks more efficiently. From a simple idea for testing the efficiency of a computer system to the most critical and sophisticated cyber-attacks, malware has evolved over the years and reappeared from time to time. Even with today's smartest technologies, where we are trying to apply machine learning and deep learning to every field of our lives, attackers are already developing more sophisticated malware using the same machine learning and deep learning techniques. This raises questions about the security of the cyber-world and how we can protect it. In this work, we present an analysis of a recent and highly critical Windows malware called “LockerGoga”. Both static and dynamic analyses are performed on the malware to understand its behavior and characteristics.
Library automation is not just a book inventory that handles holding, issuing, and receiving books using technological tools and services. Our applied research found that most library administrative functionalities, such as ‘Acquisition and Accessioning’, ‘auto Indexing & Classification’, ‘auto Cataloging (books and non-book materials)’, and inventory with real-time OPAC facilities, along with many more library science concepts, are still missing from university and college libraries across the country. In this modern era, the concept of the eLibrary is popular because the availability and accessibility of digitized content shared through IT/ICT infrastructure is huge. During our research, we found that library automation was largely ignored in favor of merely discussing and establishing eLibraries. As we all know, an eLibrary is not a substitute for a physical library; in fact, an eLibrary is part of a physical library that shares authenticated digitized content through IT/ICT infrastructure. After a decade of applied research in library science, we recorded many findings based on our survey and discussions with senior researchers and librarians. Our serious and consistent effort succeeded in designing comprehensive, effective, and efficient operational strategies to build a world-class library automation and paperless library management system for university and college libraries. This paper emphasizes the comprehensive real-time architecture, the operational modules, and their effectiveness in achieving user satisfaction (a flow of functionalities matching the exact needs of the library management system). It also deals with how emerging technological tools and services are effectively integrated when designing new strategies in library science, including various automation processes and security concepts (using barcodes/RFID).
Eventually, our dream of using emerging technological tools and services to build a world-class paperless Library Information & Management System (LIMS) came true. This software product, popularly named “eLib” by AarGees Business Solution, Hubli, India, is presently deployed at more than 300 satisfied client locations in India. Our research is still ongoing, with further development aimed at building a “Global Knowledge Sharing Centre”.
This study investigated the utilization of information and communication technology (ICT) for the improvement of personnel economics in the administration of public secondary schools in Rivers State. The study had two objectives, with corresponding research questions and hypotheses. A descriptive survey design was employed, and the population consisted of 11,258 secondary school teachers (4,127 males and 7,131 females) from 258 public senior secondary schools in Rivers State. A sample of 383 teachers (163 males and 220 females) was drawn from 15 public senior secondary schools using the Taro Yamane formula and a two-stage sampling technique of stratified and simple random sampling. Data were collected using a self-structured questionnaire titled "Utilization of Information and Communication Technology and Personnel Economics in Secondary School Administration." The questionnaire underwent face and content validation by three experts and demonstrated good reliability, with a Cronbach's alpha coefficient of 0.82. Research questions were answered using means and standard deviations, while inferential statistics utilized the z-test. The findings indicated a significant difference between male and female teachers in their perceptions of ICT utilization for teacher supervision and evaluation, highlighting the potential of ICT to improve personnel economics in the administration of public secondary schools in Rivers State. Based on these findings, it can be concluded that implementing various ICT-based strategies, such as developmental supervision, contextual supervision, clinical supervision, and collaborative forms of developmental supervision, could enhance the effectiveness of teaching staff and overall school productivity in public secondary schools in Rivers State.
This article explores the intersection of Artificial Intelligence (AI) and personalized healthcare, focusing on how AI-driven technologies are revolutionizing the delivery of individualized medical care. Through an in-depth examination of recent advancements, practical implementations, and ethical considerations, this paper elucidates the transformative role of AI in tailoring treatments, improving diagnostic precision, and enhancing overall patient outcomes.
This article explores the intersection of Artificial Intelligence (AI) and predictive analytics in healthcare, focusing on the transformative role of AI-based predictive analytics in enabling proactive and personalized patient care. Through an in-depth analysis of recent advancements, practical implementations, and ethical considerations, the paper elucidates how these technologies contribute to early intervention, improved patient outcomes, and enhanced healthcare efficiency.
The “Green AI Revolution” distils a paradigm-shifting methodology for creating machine learning solutions for the design and enhancement of ecologically sustainable communication networks. To address sustainability concerns in communication infrastructures, this study presents a comprehensive architecture that emphasises the integration of machine learning (ML) and artificial intelligence (AI) techniques. With the fitting moniker “Green AI”, the proposed model aims to improve overall resource efficiency in communication networks while minimising energy usage and carbon footprints. The goal of Green AI is to transform conventional communication systems by utilising sophisticated algorithms, dynamic optimisation, and intelligent decision-making techniques. The main goals are higher energy efficiency, less environmental impact, and better network performance. The present study examines the fundamental elements of the Green AI architecture, encompassing intelligent routing, dynamic power management, and adaptive distribution of resources. Furthermore, case studies and simulations demonstrate the real advantages of incorporating machine learning into communication networks, highlighting the technology’s potential to make a substantial contribution to a more environmentally friendly and sustainable future. The Green AI Revolution is a paradigm shift in the way we think about and use communication technology, encouraging innovation that aligns environmental stewardship with technical progress.
Department of Mathematics, National University of Skills (NUS), Tehran, Iran.
Police Academy, Egypt