
The Ethics of AI

Recently I have had to read a lot about the ethical and moral consequences of AI. Let me quickly sketch my personal view on the subject.

First, let us talk about science fiction and AI robots that act like humans. In my view, it is rather simple:

Machines are not human beings,
even when they are equipped with AI or look like humans.
Robots are not part of our society.

In fact, it is probably better for us if the robots do not look like humans at all. That helps us keep the distinction. Machines, with AI or not, should be marked as machines.

The reason I am so sure and definitive about that is not a religious one. We humans are genetically programmed to be part of a human community. Our very self and the purpose of our life depend on other humans. Being lonely, an outcast, or even just being pushed around makes us sick. Children, friends, and admiration make us happy. We are proud to take responsibility for others. It is a misconception to think that people are basically selfish.

Now you can argue that the social system we live in could be enhanced and improved by AI robots that are on our level or even above it, both in moral conduct and in intellectual capabilities. These robots could live among us just like our friends. Wouldn't that possibly be a better society? Aren't humans inherently unreliable, while machines are not?

But any AI that acts in predictable and reliable ways is not an AI at all. It would act mechanically and could easily be identified as a machine. The very characteristic of intelligence is that it occasionally explores new paths. This makes it unreliable and unpredictable by definition. Soon, most AI will even act in super-human ways that we are incapable of understanding due to our limited capacities.

Thus, my only conclusion is that we need to distinguish ourselves from the AI we have. We need to treat AI as machines that we use. Robots are not a product of nature that can live among us uncontrolled, with rights of their own. We need to protect mankind, our society, and ourselves.

After all these science-fiction nightmares, let us talk about the current way we use AI to improve technical systems. Even that limited form of AI already poses questions of ethics and morals.

Some see a lot of problems arising from AI or fuzzy logic built into cars, airplanes, or surveillance systems. It is indeed true that these systems carry the potential for a collision between a human decision and an AI decision. But that is already true without AI technology! We have all experienced a system that seems to act on its own, a computer or even a mechanical device.

Think of an airplane with an AI as co-pilot. As a first example, assume the AI commands a go-around in an unstable approach. The chances are quite high that the AI is right and the pilot has made an error. I also tend to think that the AI makes far fewer mistakes over a complete flight than a human co-pilot would. I would definitely prefer an AI to a rookie in the right seat. The same applies to nearly all applications of AI in technology. Usually, the AI is better. It can even be used to train the pilot in a simulator, far better and more versatile than a human instructor.

It must be clear, however, that the responsibility for the proper working of the AI lies with the developer of the plane, car, or technical system that uses it. The AI is not a human that we can hold responsible or even punish for mistakes. The developer has to test the system thoroughly to make sure it works as intended. But if you think this is too difficult, note that it is much more difficult to test a human, and the verdict is much less reliable.

It must also be clear at all times that the AI that makes these decisions is a machine. When we let cars drive on their own, the car does not become a being on our level, however intellectually superior it may be. If it does not work, it will be repaired or scrapped.

The problem with AI and robots should not be that they will be superior to us in many ways. The problem is that we need to treat them as our enslaved machines, and not as part of our society.

By the way, I am not afraid of super-human AI. Indeed, humanity deserves a lesson in humility. But let us use AI to our benefit!

Python or Java – What is better as first language?

Whenever I have the pleasure of teaching the beginner course in programming, I start with the same question: Python or Java? Since colleagues have good success with Python, and since Python is a useful language, I am tempted anew every time. However, I end up going with Java every time. Let me explain the reasons.

This year, I even started a discussion on a Python forum, asking to be pushed towards Python and told why I should prefer it. The discussion was an interesting read, but nothing in it convinced me.

So I did my own research on Python sites designed for beginners to see how they handle the teaching. As expected, they dive into Python and use its data collections almost from the start, or at least shortly after using the Python command line for interactive computations. Some even move quickly to real-world usage with Python packages for graphics, artificial intelligence, or numerical mathematics.

If Python is that useful, and if there are so many libraries available and also quite nice IDEs (my favorite is Spyder), why do I shy away from using it as a first language? Here are some reasons.

  • I am teaching mathematicians. Mathematicians are meant to program, test, and improve basic algorithms. They may also use high-level libraries for research, but they should at least be able to start at the roots of the code, and often that is the only way to go. Python is not designed for this use. First, it is a lot slower than Java (which is on the level of C code, by the way). The difference can be a factor of 10 to 50. Second, it hides the basic data types (such as byte, short, int, long, float, double) and their problems from the user and encourages the use of its arbitrary-precision arithmetic. I feel that mathematicians should understand the bits and bytes, and get full, speedy control of their code.
  • Python extends its language to compactify the handling of data sets, at the cost of almost cryptic syntax. E.g., Python can automatically, in one expression, create a list of integers satisfying a certain condition, or select from another list under a condition. I am convinced that this kind of behind-the-scenes operation is not suited to teaching beginners how things work and how to write good, clean code. The situation is a bit like Matlab, where everything is vectorized and loops are costly. You learn how to use the high-level constructs of your language, but not the basics. If you only have a hammer, everything looks like a nail.
  • Python is just as multi-platform and open-source as Java. Both need an interpreter and cannot be compiled to native code, which would ease the deployment of bigger programs on user systems. Java is even a bit better in this respect. Neither language can easily use the native GUI library or other more hardware-bound libraries; Python is a bit better here. But this is nothing to worry about for a first language anyway.
  • I also need to answer the question: Is it easier to go from Java to Python, or the other way around? For me, the answer is clearly that Java provides the more basic knowledge of programming. Python is on a higher level; it is more a scripting language than a basic programming language. One may argue that this is good for beginners, since they get a powerful system right at the start. After all, you learn to drive a car without knowing how the motor works. But I have seen enough programs in Python to know that it spoils the programmer and steers the mind away from fresh, more efficient solutions.
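To make the point about compact, behind-the-scenes constructs concrete, here is a generic sketch (my own example, not taken from any particular tutorial) of a Python list comprehension next to the explicit loop a Java beginner would write:

```python
# one expression builds the list of even squares below 100
squares = [n * n for n in range(10) if n % 2 == 0]
print(squares)  # [0, 4, 16, 36, 64]

# the same result with an explicit loop, closer to what a Java beginner writes
explicit = []
for n in range(10):
    if n % 2 == 0:
        explicit.append(n * n)
print(explicit == squares)  # True
```

The one-liner is shorter, but it hides the iteration and the conditional selection that a beginner should first learn to write out.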

That said, I would very much welcome a second course, building on the basics, that uses Python and its advanced libraries. Many students are taught R and Matlab in the syllabus. I think Python would be a better choice than Matlab for numerical programming and data visualization. And, in contrast to Matlab, it is open and free.

The Riemann Sphere

Since I am currently teaching complex function theory, I am interested in visualizations of Möbius transformations as well as other analytic functions. The Riemann sphere together with the stereographic projection is a good tool for this.

Above you see the basic idea. The points on the sphere are projected along a line through the north pole onto the x-y-plane, which represents the complex plane. The north pole corresponds to the point at infinity. The formulas for the projection and its inverse are easy to compute with a bit of geometry. I have done so on this page using EMT and Maxima.
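For reference, here is a small Python sketch of the standard formulas (a textbook derivation, not the EMT/Maxima computation mentioned above), using the unit sphere with the north pole at (0,0,1) and the plane z=0:

```python
def project(X, Y, Z):
    # stereographic projection from the north pole (0,0,1) onto the plane z=0
    return X / (1 - Z), Y / (1 - Z)

def inverse(x, y):
    # map the point x+iy of the complex plane back onto the unit sphere
    r2 = x * x + y * y
    return 2 * x / (1 + r2), 2 * y / (1 + r2), (r2 - 1) / (r2 + 1)

# the equator maps to the unit circle, the south pole to the origin
print(project(*inverse(1.0, 0.0)))  # (1.0, 0.0)
print(inverse(0.0, 0.0))            # (0.0, 0.0, -1.0)
```

The round trip confirms the two maps are inverse to each other away from the north pole.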

It is quite interesting to see what a Möbius transformation does to the Riemann sphere.

In this image, the north and south poles are mapped to the two black points. The other lines are the images of the circles of latitude and longitude. The image can be created by projecting the usual circles (with the poles in place) to the complex plane, applying the Möbius transformation, and projecting back to the sphere.

In fact, the projection and the Möbius transformation map circles or lines to circles or lines, and they preserve angles. This is why the stereographic projection is often used for maps that should show the correct angles between paths. But note that great circles that do not pass through the pole are not mapped to lines, but to circles. So the shortest path on the surface of the Earth between two points is a circular arc on the projected map. Locally, it is only approximated by a line segment.

A Simple Combinatorial Approximation Problem

This is certainly a well-known problem. But I pose it here because there is a geometric solution, and geometric solutions are always nice.

Assume you have two sets of n points

\(x_1,\ldots,x_n, \quad y_1,\ldots,y_n\)

The problem is to find a permutation p such that

\(\sum_{k=1}^n (x_k-y_{p(k)})^2\)

is minimized. Of course, we can assume that the x's are sorted. And it is easy to guess that the minimum is achieved if the permutation sorts the y's. But how do we prove that?

Here is a simple idea. We claim that swapping two of the y's into the right order makes the sum of squares smaller, i.e.,

\(x_1 < x_2, \, y_1<y_2 \Longleftrightarrow (x_1-y_1)^2+(x_2-y_2)^2 < (x_1-y_2)^2+(x_2-y_1)^2\)

If you try that algebraically, the right-hand side is equivalent to

\(-x_1y_1-x_2y_2 < -x_1y_2-x_2y_1\)

Putting everything on one side and factorizing, you end up with

\((x_2-x_1)(y_2-y_1) > 0\)

and the equivalence is proved.
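The claim can also be checked numerically by brute force. The following sketch (an illustration, not a proof) compares the sorted pairing against all permutations of some random data:

```python
import itertools
import random

def cost(xs, ys):
    # sum of squared differences for a given pairing
    return sum((a - b) ** 2 for a, b in zip(xs, ys))

random.seed(1)
xs = sorted(random.random() for _ in range(6))
ys = [random.random() for _ in range(6)]

best = min(cost(xs, p) for p in itertools.permutations(ys))
print(best, cost(xs, sorted(ys)))  # the sorted pairing attains the minimum
```

Of course, checking all n! permutations is only feasible for tiny n; the point of the theorem is that sorting suffices.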

But there is a geometric solution. We plot the points in the plane. Two points lie on the same side of the line of symmetry between the x- and y-axes exactly if they are in the same order, and the connection to the mirrored point is always longer.

We can extend this result to other norms. E.g., the sorting of the y's will also minimize

\(\sum_{k=1}^n |x_k-y_{p(k)}|, \, \max_k |x_k-y_{p(k)}|\)

As long as the underlying norm is symmetric with respect to the variables, i.e.,

\(\|(d_1,\ldots,d_n)\| = \|(d_{p(1)},\ldots,d_{p(n)})\| \quad \text{for all permutations } p,\)

the proof can be based on the same geometric argument.

The speed of C, Java, Python versus EMT

I recently tested Python to see if it is good as a first programming language. The answer is a limited "yes". Python certainly has a lot to offer. It is a mainstream language, and thus there is a lot of support on the net, and there are tons of libraries for all sorts of applications. A modern distribution like Anaconda even installs all these tools effortlessly.

However, I keep asking myself if it isn’t more important to learn the basics before flying high. Or to put it another way: Should we really start to teach the usage of something like a machine learning toolbox before the student has understood data types and simple loops? This question is important because most high-level libraries take away the tedious work. If you know the right commands you can start problem-solving without knowing much of the underlying programming language.

I cannot decide for all teachers. It depends on the type of students and the type of class. But I can give you an idea of how much a high-level language like Python holds you back when you want to implement some algorithm yourself. You have to rely on a well-implemented solution being available, or you will be lost.

So here is some test code for the following simple problem. We want to count how many pairs (i,j) of numbers exist such that i^2+j^2 is prime. For the count, we restrict i and j to the range from 1 to 1000.


#include <stdio.h>
#include <time.h>

int isprime (int n)
{
	if (n == 1 || n == 2) return 1;
	if (n % 2 == 0) return 0;
	int i = 3;
	while (i * i <= n)
	{
		if (n % i == 0) return 0;
		i = i + 2;
	}
	return 1;
}

int count (int n)
{
	int c = 0;
	for (int i = 1; i <= n; i++)
		for (int j = 1; j <= n; j++)
			if (isprime(i * i + j * j)) c = c + 1;
	return c;
}

int main ()
{
	clock_t t = clock();
	printf("%d\n", count(1000));
	printf("%g\n", (clock() - t) / (double)CLOCKS_PER_SEC);
	return 0;
}

The result was 98023 pairs, computed in 0.132 seconds. That is okay for one million prime-number checks.

Below is a direct translation into Java. Surprisingly, it is even faster than C, taking 0.127 seconds. The reason is not clear; perhaps I did not switch on every optimization of the C compiler.

public class Test {

	static boolean isprime (int n) {
		if (n == 1 || n == 2) return true;
		if (n % 2 == 0) return false;
		int i = 3;
		while (i * i <= n) {
			if (n % i == 0) return false;
			i = i + 2;
		}
		return true;
	}

	static int count (int n) {
		int c = 0;
		for (int i = 1; i <= n; i++)
			for (int j = 1; j <= n; j++)
				if (isprime(i * i + j * j)) c = c + 1;
		return c;
	}

	static double clock () {
		return System.currentTimeMillis();
	}

	public static void main (String args[]) {
		double t = clock();
		System.out.println(count(1000));
		System.out.println((clock() - t) / 1000.0);
	}
}


Below is the same code in Python. It is more than 30 times slower. This was unexpected even for me. I have to admit that I have no idea whether some compilation trick could speed it up. But in any case, this is the behavior that an innocent student will see. Python should be seen as an interactive interpreter language, not as a basic language for implementing fast algorithms.

import time

def isprime (n):
    if n == 2 or n == 1:
        return True
    if n % 2 == 0:
        return False
    i = 3
    while i * i <= n:
        if n % i == 0:
            return False
        i = i + 2
    return True

def count (n):
    c = 0
    for k in range(1, n + 1):
        for l in range(1, n + 1):
            if isprime(k * k + l * l):
                c = c + 1
    return c

t = time.time()
print(count(1000))
print(time.time() - t)

## Result was
## 98023
## 4.791218519210815
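Out of curiosity, here is a vectorized NumPy version (my own sketch, and a different algorithm: it sieves all primes up to 2n² first instead of testing each sum). It avoids the interpreted inner loops, which is the usual way to get speed out of Python, but it also illustrates the point above: you escape the slowness only by switching to high-level constructs.

```python
import numpy as np

def count_pairs(n):
    # sieve of Eratosthenes up to the largest possible value i^2 + j^2 = 2n^2
    limit = 2 * n * n
    sieve = np.ones(limit + 1, dtype=bool)
    sieve[:2] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = False
    # all sums i^2 + j^2 as an n x n matrix, then count the prime entries
    i = np.arange(1, n + 1)
    s = i[:, None] ** 2 + i[None, :] ** 2
    print(int(sieve[s].sum()))  # should agree with the count above

count_pairs(1000)
```

Note that this trades memory (a sieve of two million booleans and a million-entry matrix) for speed, which is typical of the vectorized style.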

I have a version in Euler Math Toolbox, too. It uses matrix tricks and is thus very short, relying on built-in loops in C. But it still cannot compete with the versions above: it is about 3 times slower than Python and 100 times slower than C. The reason is that the isprime() function is not implemented in C but in the EMT language.

>n=1000; k=1:n; l=k'; tic; totalsum(isprime(k^2+l^2)), toc;
 Used 13.352 seconds

Below is a version where I use TinyC to check for primes. The function isprimec() can only handle real numbers, not vectors, so I vectorize it with another function isprimecv(). This costs a bit of performance.

>function tinyc isprimec (x) ...
$  int n = (int)x;
$  int res = 1; 
$  if (n > 2)
$  {
$     if (n%2 == 0) res=0;
$     int i=3;
$     while (i*i<=n)
$     {
$         if (n%i==0) { res=0; break; }
$         i = i+2;
$     }
$  }
$  new_real(res);
$  endfunction
>function map isprimecv (x) := isprimec(x);
>n=1000; k=1:n; l=k'; tic; totalsum(isprimecv(k^2+l^2)), toc;
   Used 0.849 seconds

More Remarks on Corona

It is, of course, interesting how the total mortality behaves in times of Corona. Unfortunately, I searched in vain for numbers from Germany. What I found was a PDF file from Spain; I took the first statistic in the image above from it.

In Germany, about 1.1% of the population dies each year, which is a daily rate of roughly one in 31000. So I would expect around 1450 deaths each day in Spain. The rate depends, of course, strongly on the age structure of the population. Maybe I underestimate the pyramidal shape of the age distribution in Spain, so that the daily total mortality is indeed a bit lower than in Germany. The image suggests around 1200.
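As a rough sanity check of these figures (the population of Spain, about 47 million, is my assumption; the annual rate is the one from the text):

```python
pop_spain = 47_000_000        # assumed population of Spain
annual_rate = 0.011           # about 1.1% of the population dies per year
daily_deaths = pop_spain * annual_rate / 365
print(round(daily_deaths))    # roughly 1400, in the ballpark of the estimate above
```

The exact figure depends on the population number and on rounding, but the order of magnitude is what matters here.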

A statistical plot of that kind should always have a y-axis that starts at zero, so this plot is a bit misleading. But in any case, it suggests a 50% increase in total mortality per day over the last few days. That is due to Corona. As I said, the problem will average out over the year if appropriate measures are taken, and we might observe only a slight increase in total mortality.

Nevertheless, the main problem remains: we cannot handle the huge number of severely sick patients that would occur if we just let everything run as usual.

More Remarks on Corona

We learned a bit more now.

  • I am glad to hear that the social restrictions have led to a basic reproduction number of approximately one in Germany, i.e., one new infection per infectious person. That is good news, but it must get even lower: the number of cases must decrease to a number we can handle. Currently, my estimate is that every 100th German is infected, and probably the same holds in the US. That sounds manageable with care in social interactions.
  • And, of course, we still need to restrict travel, especially to countries that cannot take the measures we can because they are too poor. Those countries need immediate help. And, for God's sake, please lift the sanctions that the West has imposed to punish Eastern countries for not being cooperative.
  • Opening the kindergartens and schools is crucial for society. But it also poses the biggest threat. Every nurse can tell you that one case of chickenpox means 30 cases of chickenpox where kids play together in larger numbers. And young parents will go through an unknown peak of strep infections while their kids are at that age. So we need antibody tests as soon as possible, plus a new look at hygiene.
  • I have always argued for more digital learning. Some don't agree, because they either think I want to replace teachers, or they underestimate the benefits of solitary study: trying, failing, and thinking. It is obvious that learning is also a social activity and that teachers are needed for guidance. But some things must be done alone, and that can very well be stimulated by digital media.

More Remarks on Corona

What have we learned so far? The numbers are not clear, nor are the consequences. Moreover, there is a lot of distorted information, some of which can even be called fake. Let me tell you what I took from all that.

  • The most important problem the new virus causes is the excessive number of cases of pneumonia and ARDS caused by fluid in the lungs. This seems to be a lot worse than with ordinary flu or older coronaviruses, probably because we are not yet immune. The virus also causes other damage to the body, but lung problems are the most common. While the chances of needing oxygen or ventilation are not yet known precisely and will depend on your age, the situation in the hospitals of the affected regions speaks for itself. We need to concentrate on medical care and break the chains of infection to slow down the spread of the disease!
  • Mortality seems to be high only in people of old age, especially those weakened by other diseases. We might observe that overall mortality does not rise much at all; the numbers of deaths by COVID-19 published so far all over the world support this view. The conclusion is to protect the elderly as much as we can, to spare them a painful and sudden death requiring intensive medical care. We do that, again, by reducing social contact and keeping the number of infected people at any time as low as possible. By the way, hygiene and care should be observed in contact with elderly people at all times.
  • The current economy is based on the production of consumer articles and on marketing. Many countries have reduced, or have been forced to reduce, state activities that do not lead to direct profit, such as health care and education. This economy is not fit for any crisis that requires social solidarity. Since profit and interest are the driving forces and the main means of wealth distribution, each recession throws us into a deep crisis. Politicians see that now and try to stabilize the situation. Essentially, this is a takeover and regulation of industry by financial means, which can be summed up as printing money. Even the idea of state control of key industries has been revived. We should learn from this crisis that there must be a balance between the private sector of the economy and the social needs covered by the state at all times. If that is out of balance, we are not flexible enough to fight a crisis like this one.

Stay healthy, and stay at home for a while!

Remarks on the Corona Pandemic

Here are some observations I have made in these difficult times. Some are math-related, some are not.

  • Humans seem to be unable to imagine big numbers or grasp the concept of exponential growth. In fact, I cannot really do it myself. What I have to do is compute figures and set them in relation to the total. To help us understand the problem, here is one example: letting the virus spread in a small town of 10000 can easily mean that 1/4 of them are infected at the same time, i.e., approximately 2500. About 1/20 of those need intensive medical treatment, and at least 1/100 need ventilation. So we end up with 25 ventilation cases, which is impossible for any countryside hospital to handle. Those small numbers can be understood, and they should frighten you to the point that you understand how necessary it is to break the chains of infection. Of course, the same computation for the complete US or German population is even more frightening.
  • Most of us do not understand how bad it is that the world population will grow to eleven billion in the near future. We should have stopped that thirty years ago through education, sharing of wealth, and women's rights. Moreover, we have destroyed local supply chains through globalism and our excessive capitalistic system. This is coming back to haunt us now. In African countries, much of the population has to live in slums in close contact and depends on the global food chain, and also on the medical support of "first world" countries. This is a recipe for disaster. Even we are affected by the problem, because our supply of masks and other medical aids has broken down since China is no longer delivering. My only hope is that we come to rethink our world after this crisis.
  • We are all bad at grasping probabilities. I am now over 60 years old. That means that my non-Corona chance of dying in the next year exceeds the Corona-related chance, even if I catch the virus. I am more likely to die of a heart attack, get brain or lung cancer, get involved in a car accident, or catch ordinary pneumonia during that year. We cannot avoid death, and it is never a good idea to think too much about the chance of dying. Remember the words of Epicurus: "Death is none of our business. When it is here, we are not. When we are here, it is not."
  • The press and TV are now running at full speed to shovel numbers onto the public and present the drama of single cases. Unfortunately, this agenda, together with a lack of social contact, may cause a lot of harm to some individuals. If only we paid the same degree of attention to other causes of unnecessary death! I would prefer a more scientific attitude in the media, and more restraint in dramatizing individual fates, which only leads to a distorted view of reality.

Mathematics of an Epidemic Disease

Since we are currently fighting the Corona epidemic, I thought it might be a good idea to simulate the spreading of such a disease in Euler Math Toolbox. The numbers here are fictitious and can only deliver qualitative results.

The mathematical model I took is quite simple.

  • The disease has n1=7 days of incubation. During this time the infected person can infect others.
  • The disease has n2=7 days of severe sickness. I have assumed that there are no further infections during that time.
  • We start with the non-immune population plus 10 infected cases.
  • The chance of infection depends on the probability of contacting an infected person during the incubation time and on the number of contacts each day that could lead to an infection. I assume that a person has nc=0.2 to nc=0.3 such contacts each day.
  • In the end, the population is either in the non-immune or the immune group.

So the total population is mapped into a vector x of groups as

  • x[1] : non-immune
  • x[2] to x[1+n1] : incubation
  • x[2+n1] to x[1+n1+n2] : sick
  • x[2+n1+n2] : immune

The formula for the number of new infections in x[2] (first day of incubation) is

\(x[2] = x[1] \left(1-(1-\frac{S}{P})^{n_c}\right)\)

where S is the number of incubating people and P is the total population. So 1-S/P is the chance that a contacted person is not infected, and thus the factor computes the chance of infection. Note that this factor is very small at the start, since S/P is very small. Yet exponential growth does its job over time.

Here is a typical plot of the epidemic with nc=0.3. This is equivalent to contacting 9 other people per month in a way that can spread the disease. We see that almost 90% of the population has to go through the disease in this case.

Development with nc=0.3

Here is another plot with nc=0.25, equivalent to contacting 7.5 other people in a month.

Development with nc=0.25

That reduces the number of cases a lot and lowers the peak share of sick persons from 37% to 22%. For the medical system, this is an enormous relief.

The code of Euler Math Toolbox to do these plots is listed below. You can cut and paste it into EMT to try yourself.

>n1=7; // 7 days of incubation time
>n2=7; // 7 days of sickness
>pm=0.01; // mortality rate (not used below)
>pop=50000000; // population
>nc=0.3; // number of infectious contacts
>function oneday (x,n1,n2,pm,pop,nc) ...
$xn=x;
$xn[2]=floor(x[1]*(1-(1-sum(x[2:1+n1])/pop)^nc)); // new infections
$xn[1]=x[1]-xn[2]; // remaining uninfected
$xn[3:1+n1+n2]=x[2:n1+n2]; // progress one day
$xn[2+n1+n2]=x[2+n1+n2]+x[1+n1+n2]; // last day of sickness -> immune
$return xn;
$endfunction
>x=zeros(2+n1+n2); x[1]=pop; x[2]=10; // start with 10 cases
>function sim (x,n1,n2,pm,pop,nc) ...
$X=x;
$repeat
$   xn=oneday(x,n1,n2,pm,pop,nc);
$   X=X_xn;
$   until sum(xn[2:1+n1+n2])==0;
$   x=xn;
$end;
$return X;
$endfunction
>X=sim(x,n1,n2,pm,pop,nc);
>Xs=X[,1]|sum(X[,2:1+n1])|sum(X[,2+n1:1+n1+n2])|X[,2+n1+n2]; // aggregate the groups
>plot2d(Xs'/pop,color=[green,orange,red,blue]); ...
>labelbox(["immune-","infect+","sick","immune+"], ...
>  colors=[green,orange,red,blue],x=0.4,y=0.2,w=0.35):
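For readers without EMT, the same model can be sketched in plain Python (a rough port under the assumptions above; the mortality parameter is ignored, as in the EMT listing):

```python
import math

def one_day(x, n1, n2, pop, nc):
    # x[0]: non-immune, x[1..n1]: incubation, x[n1+1..n1+n2]: sick, x[-1]: immune
    S = sum(x[1:1 + n1])                                # infectious (incubating) people
    new = math.floor(x[0] * (1 - (1 - S / pop) ** nc))  # new infections
    xn = [0.0] * len(x)
    xn[0] = x[0] - new                                  # remaining non-immune
    xn[1] = new                                         # first day of incubation
    xn[2:1 + n1 + n2] = x[1:n1 + n2]                    # everyone else moves one day ahead
    xn[-1] = x[-1] + x[n1 + n2]                         # last day of sickness -> immune
    return xn

def sim(n1=7, n2=7, pop=50_000_000, nc=0.3, start=10):
    x = [0.0] * (2 + n1 + n2)
    x[0], x[1] = pop - start, start
    days = [x]
    while sum(x[1:1 + n1 + n2]) > 0:                    # run until no one is infected
        x = one_day(x, n1, n2, pop, nc)
        days.append(x)
    return days

days = sim()
print(len(days), days[-1][-1] / 50_000_000)             # duration and final immune share
```

Plotting the four aggregated groups of `days` over time with any plotting library reproduces the qualitative picture shown above.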

We can also understand the problem a bit better if we study the number of new infections in x[2], the first day of incubation. It starts with our chosen number of initial infections. Those 10 induce 3 new infections, and the resulting 13 produce another 3. Then we have 16, which infect 4 more, and so on, exponentially.

New Infections

If the chance S/P of an infected contact is small, we have

\(1-\left(1-\frac{S}{P}\right)^{n_c} \sim \frac{n_c S}{P}\)

and therefore, since x[1] is close to P at the start,

\(x[2] \sim n_c S,\)

exactly as observed.
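This first-order approximation is easy to check numerically (a quick sanity check with an arbitrarily chosen small value of S/P):

```python
nc = 0.3
s = 1e-4                      # S/P, the small chance that a contact is infected
exact = 1 - (1 - s) ** nc     # exact factor from the formula above
approx = nc * s               # first-order approximation
print(exact, approx)          # the two agree to roughly four significant digits
```

The error is of second order in S/P, so the approximation is excellent at the start of the epidemic and degrades only once a substantial share of the population is infected.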