
Thursday, February 20, 2025

Discover the Secrets of How the C Programming Language Compiler Works, and Learn How Your Code Becomes Executable

A compiler is a software tool that translates source code written in a programming language into executable code that can run on a computer. The process involves several stages, including lexical analysis, syntax analysis, and semantic analysis of the source code, after which the compiler generates executable code that is ready to run. While compilers can be described in simple terms, understanding how they work and how they translate code is essential for software engineers and programmers. That is why we separated this topic from the previous post (see here) and gave it special attention.

The first good question you might ask is: in what language is the C compiler itself written? Looking back at history, older computers were mostly programmed in assembly language, and higher-level programming languages began to develop as the benefits of reusing software across different processors grew. The first higher-level programming language, Plankalkül, was proposed as early as 1943, and several experimental compilers were developed thereafter. The Fortran team, led by John Backus at IBM, introduced the first complete compiler in 1957. Since then, compilers have become increasingly complex as computer architectures have evolved.

Today, it is common practice to implement compilers in the same language that is being compiled. One might therefore assume that the C compiler is written in C, just as the .NET languages C# and Visual Basic have an open-source Microsoft compiler called Roslyn, which is itself written in C#. However, to create the first C compiler, its creator Dennis Ritchie used the earlier programming language B, which was developed by Ken Thompson.

Compiling files written in the C programming language is not uncommon even in the most modern corporations

Dennis Ritchie later extended the B programming language and created C, so the original C compiler was written in B. Today we mostly use GCC version 14 or newer (this text was written in 2025), and there is no chance of finding a single B command in it, because GCC does not support the B language at all, considering it obsolete. We do know that GCC itself is written in a combination of C and C++, possibly with some parts written in other programming languages as well.

When you write correct C code in any text editor and save it as a text file, you can call the C compiler to translate it into machine code so that your program can run. The compiler processes what is known as a Translation Unit, which consists of the source file together with all the header files referenced through #include directives. If your code is correct, the compiler produces an Object File, recognizable by the .o or .obj suffix; such object files are called modules. The standard library of the C programming language ships as object files already translated into machine language, which allows fast access to the standard functions we call in our programs.

It is important to note that when we say a file is translated into machine language in C, it is first translated into assembly language in a temporary file, which is then translated into machine code, after which the temporary file is deleted. During compilation, such a file is recognizable by the .s suffix. The compiler translates each source file, together with all the header files it includes, into a separate object file, i.e., a module. It then calls the Linker, which combines all the object files and all the library functions used into an Executable File. Do not confuse this process with .NET technology; in .NET and the C# programming language, things work differently.
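To make these stages tangible, here is a minimal sketch assuming the GCC toolchain mentioned above. The program is trivial on purpose; the comments show the commands that stop the pipeline at each stage described in this post and let you inspect the intermediate files.

    /* hello.c - a minimal program for observing the compilation stages.
       Assuming the GCC toolchain, each stage can be run separately:

         gcc -E hello.c -o hello.i    preprocessing: expand #include and macros
         gcc -S hello.i -o hello.s    compilation: translate to assembly (.s)
         gcc -c hello.s -o hello.o    assembling: produce the object file (module)
         gcc hello.o -o hello         linking: combine modules and library code */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello from a freshly linked executable!\n");
        return 0;
    }

Running plain gcc hello.c -o hello performs all four steps at once and removes the intermediate files, which is why you normally never see the temporary .s file.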

How the C Programming Language 'Understands' Your Code: A Journey Through the Compilation Stages

Sunday, September 15, 2024

Machine Learning, or How Computers Learn on Their Own

If you're looking to get started with machine learning, this post is a great place to begin. Most people interested in this field give up early because of complex explanations, disorientation among the many artificial intelligence structures, numerous unfamiliar technical terms, programming code, and everything else that makes them lose their will and wonder what machine learning even is. This text therefore aims to give newcomers the simplest, friendliest possible introduction to machine learning. We'll break complex concepts down into simpler terms so you can grasp the core ideas without getting overwhelmed by technical jargon. To start, let's discuss the primary goal of machine learning.

The goal of machine learning is to enable computers to learn from data and make decisions or predictions based on that knowledge, without the need for explicit programming of every step. The idea is for machine learning algorithms to recognize patterns, structures, and relationships within data so that they can:

  • Predict future outcomes: Algorithms can use existing data to predict what will happen in the future, such as changes in prices, user behavior, or risk in financial sectors (see the sketch just after this list).
  • Classify data: The goal may be for the algorithm to classify data into different categories, such as recognizing objects in images (like facial or object recognition), identifying diseases based on medical scans, or sorting emails into spam and legitimate categories.
  • Automate processes: Machine learning enables the automation of processes and real-time decision making, such as movie recommendations, speech recognition, self-driving car control, and personalized advertising.
  • Improve accuracy and efficiency: Through learning from data, systems can improve their accuracy in recognizing and classifying objects, reducing the need for human intervention.
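To make the first goal concrete, here is a minimal sketch in Python that fits a straight line to a few past prices with NumPy (a library covered later on this blog) and extrapolates one step ahead. The numbers are invented purely for illustration.

    import numpy as np

    # Hypothetical monthly prices (invented data for illustration only).
    months = np.array([1, 2, 3, 4, 5, 6])
    prices = np.array([100.0, 102.5, 105.1, 107.4, 110.2, 112.8])

    # Fit a first-degree polynomial (a straight line) to the history.
    slope, intercept = np.polyfit(months, prices, 1)

    # Predict the price for month 7 by extrapolating the fitted line.
    predicted = slope * 7 + intercept
    print(f"Predicted price for month 7: {predicted:.2f}")

Real systems use far more data and far more sophisticated models, but the principle is the same: learn a pattern from known data, then apply it to the unknown.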

Children are taught machine learning in a playful way

Some people think that artificial intelligence is the same as machine learning, but in essence, AI is a broader field that encompasses various techniques, while machine learning is a specific subset of AI that focuses on enabling machines to learn from data. So, let's delve deeper into what machine learning is. Machine learning is a fascinating field that enables computers to learn from data and make decisions without direct human intervention. Although it may seem complex at first glance, the basic concept is actually quite simple. Machine learning is essential for driving artificial intelligence and technological progress. Thanks to it, we have personalized recommendations on Netflix, efficient search engines, autonomous cars, and much more.

Instead of predefining all the steps to solve a problem, machine learning enables computers to independently recognize patterns in large datasets and draw conclusions based on those patterns. The basic idea is to train algorithms on data, and after training, use them for decision making, image recognition, event prediction, or data classification. For example, a machine learning algorithm can learn to recognize faces based on thousands of sample images, or predict future values based on historical data. There are different types of machine learning, including:

  • Supervised learning: The algorithm learns from labeled data, where inputs and outputs are known, in order to predict outputs for new, unknown inputs (a minimal sketch follows below).
  • Unsupervised learning: The algorithm tries to find hidden structures in data without predefined labels or answers.
  • Reinforcement learning: The algorithm learns through interaction with the environment, trying to maximize a specific goal or reward.

Machine learning is applied in many industries, including medicine, finance, autonomous vehicles, speech and image recognition, content recommenders, and more.
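As a minimal sketch of the first type, supervised learning, here is an example assuming the scikit-learn library is installed. A tiny classifier learns from labeled examples and then predicts the label of an input it has never seen; the data is invented for illustration.

    from sklearn.neighbors import KNeighborsClassifier

    # Labeled training data (invented): each sample is [height_cm, weight_kg],
    # and the label says whether it describes a cat (0) or a dog (1).
    X_train = [[25, 4], [23, 3], [30, 5], [50, 20], [55, 25], [60, 30]]
    y_train = [0, 0, 0, 1, 1, 1]

    # Supervised learning: train on known input/output pairs.
    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(X_train, y_train)

    # Predict the label for a new, unknown input.
    print(model.predict([[28, 4]]))  # expected output: [0], i.e., a cat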

Machine Learning in Practice: Different Types of Machine Learning Algorithms

Wednesday, September 11, 2024

NumPy For Machine Learning, An Introduction to the Python Library for Manipulating Arrays and Matrices

NumPy is a Python library providing support for efficient operations on multi-dimensional arrays and matrices. It's a cornerstone tool for scientific computing in Python and is widely used in fields like data analysis, machine learning, signal processing, visualization, and many others. Its popularity has surged alongside the rapid advancements in AI. The library originated as an extension to Python first developed by software engineer Jim Hugunin, who later worked at Microsoft and Google. However, the NumPy we know today is largely the work of Travis Oliphant, often considered the primary creator of NumPy, a founder of Anaconda, and one of the original authors of the SciPy package.

This open-source library became significant primarily because it addressed the slowness of interpreted Python. NumPy provides multi-dimensional arrays, together with functions and operators that operate on them efficiently, so code written with NumPy contains far fewer explicit inner loops. Any algorithm that can be expressed as operations on arrays and matrices can therefore run nearly as fast as equivalent C code.
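As a small illustration of the "fewer inner loops" point, here is the same elementwise computation written twice: once as an explicit Python loop and once as a single vectorized NumPy expression whose loop runs in compiled code inside the library.

    import numpy as np

    # One hundred thousand input values.
    x = np.arange(100_000, dtype=np.float64)

    # Pure Python: an explicit inner loop over every element.
    y_loop = [v * 2.0 + 1.0 for v in x]

    # NumPy: one expression; the loop runs in compiled C code.
    y_vec = x * 2.0 + 1.0

    # Both produce the same values; the vectorized form is much faster.
    print(np.allclose(y_loop, y_vec))  # True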

NumPy's usage and functionality are often compared to the MATLAB environment, as both interpret code and allow users to quickly write computations, with most operations performed on arrays and matrices rather than scalar values. Unlike MATLAB, which originated in the 1970s, NumPy is integrated into Python, a modern, complete, general-purpose programming language. Both, however, rely on BLAS and LAPACK for efficient linear algebra computations.

The moment a programmer discovers the true power of the NumPy library

A central element in NumPy is the ndarray, an n-dimensional array: a multi-dimensional, homogeneous array whose elements all share the same data type. NumPy provides efficient operations on these arrays, including mathematical, logical, statistical, and linear algebra operations. Additionally, NumPy has a large number of built-in functions for working with arrays and can easily read and write data in various formats. While this might sound complex in theory, using the NumPy library is straightforward in practice, despite it being crucial for numerical computing in Python. Python was not initially designed for numerical computing, but it attracted the attention of the scientific and engineering community.
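Here is a minimal sketch of the ndarray in action, touching each of the operation families mentioned above.

    import numpy as np

    # A 2x3 ndarray: all elements share one data type (here float64).
    a = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0]])

    print(a.shape)         # (2, 3): two rows, three columns
    print(a * 10)          # mathematical: elementwise multiplication
    print(a > 2.5)         # logical: a boolean mask of the same shape
    print(a.mean(axis=0))  # statistical: the mean of each column
    print(a @ a.T)         # linear algebra: 2x2 product with the transpose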

As a result, a special interest group called Matrix-SIG was founded in 1995 with the goal of defining a set of computational packages for numerical computing. Thanks to NumPy, Python has become a powerful language for numerical computing, data analysis, machine learning, and other areas of scientific research. As for NumPy's limitations: it is designed for homogeneous data, arrays must have their size defined up front, array operations often allocate additional memory, it lacks out-of-the-box parallelization, and it has limited support for non-numerical operations. Despite these limitations, NumPy still provides exceptional value and efficiency in numerical computing, and many of them can be overcome by using other libraries or tailoring the code to specific needs.
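Two of those limitations are easy to see in a short sketch: element types are silently unified, and "growing" an array actually allocates a new one.

    import numpy as np

    # Homogeneous data: mixing types silently promotes everything to one dtype.
    mixed = np.array([1, 2.5, 3])
    print(mixed.dtype)  # float64: the integers were converted to floats

    # Fixed size: "appending" does not grow the array, it builds a new copy.
    a = np.array([1, 2, 3])
    b = np.append(a, 4)   # allocates a brand-new array
    print(a is b, b)      # False [1 2 3 4]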

NumPy in Action: A Practical Guide for Beginners

Monday, September 09, 2024

AI for Beginners, A Simple Introduction to the World of Artificial Intelligence

In recent years, the world has witnessed a surge of interest in and development of AI - Artificial Intelligence. This groundbreaking technology has rapidly infiltrated various sectors, from healthcare and finance to transportation and entertainment. Despite AI's growing popularity, many people still wonder what it actually is and how to get involved in a field that surrounds us more and more and leaves us breathless with its possibilities. Broadly speaking, AI - Artificial Intelligence is a wide field of computer science that deals with creating intelligent agents: systems that can reason, learn, and act autonomously. AI is used to solve a wide range of problems, from medical diagnosis to self-driving cars, and its advancements have been nothing short of remarkable.

From sophisticated language models capable of generating human-quality text to self-driving cars navigating complex urban environments, AI has demonstrated its immense potential. Its applications are vast and ever-expanding, promising to revolutionize industries and improve our daily lives in countless ways. Artificial intelligence is a branch of computer science that develops computer systems able to perform tasks requiring intelligence, similar to or even surpassing the capabilities of the human mind. It focuses on developing the algorithms, techniques, and models that enable computers to think, learn, understand, make decisions, and communicate in a human-like manner. AI is having an ever greater impact on different industries and sectors, transforming how work is done, how decisions are made, and how we interact with technology.

Simply put, artificial intelligence allows computers to automate and simulate human intelligence, relying on the principles and ways in which the human brain processes information so that computers can think and act like humans, or even better. Using artificial intelligence, computers can perform tasks that normally require human intelligence. But how does artificial intelligence work? It essentially imitates human thought processes while simulating human faculties such as listening, speaking, understanding language, memory, reasoning, vision, and movement. In essence, artificial intelligence attempts to mimic the cognitive abilities of the human mind using algorithms and models that allow computers to make decisions, learn from experience, and perform complex tasks in a manner similar to human thinking and action.

AI - Artificial Intelligence is changing everything

There are many ways to categorize artificial intelligence, but some distinctions have become more widely accepted than others. Let's start with the most basic and common division:

NAI - Narrow Artificial Intelligence (Narrow AI): This refers to AI systems designed to perform a specific task with a high degree of efficiency and accuracy. These systems are narrowly focused and lack the ability for general understanding or performing other types of tasks. For example, a narrow AI system could be developed to recognize objects in images, recommend products, perform automatic translation, or drive a self-driving car.

GAI - General Artificial Intelligence (General AI): This refers to a hypothetical AI system that would possess human-level intelligence and be able to understand, learn, and carry out a wide range of tasks at or above the level of human intelligence. A general AI would be able to apply knowledge and skills acquired in one domain to solve problems in other domains. The idea of general AI is that a machine would have the ability to think and act like a human, adapting to new situations, learning from experience, and making creative decisions.

Superintelligence: This refers to a level of AI that surpasses the intellectual capacity of the most intelligent humans in all domains. A superintelligence would be vastly more intelligent than humans and would have the potential to surpass human capabilities in all aspects. This is a futuristic concept and has not yet been achieved, but it is often discussed in the context of the long-term development of artificial intelligence.

Beyond this basic classification, artificial intelligence encompasses various branches and areas of research. Let's take a look at some of the important branches of artificial intelligence.

Key Branches of Artificial Intelligence: Growth and Development