# Robert's Stochastic thoughts

Asymptotically we'll all be dead

## Sunday, June 16, 2024

### A Natalist, Nativist, Nationalist Case for the Child Tax Credit

## Thursday, March 07, 2024

### Avatars of the Tortoise III

He concluded "We (the indivisible divinity that works in us) have dreamed the world. We have dreamed it resistant, mysterious, visible, ubiquitous in space and firm in time, but we have allowed slight, and eternal, bits of the irrational to form part of its architecture so as to know that it is false."

I think I might have something interesting to say about that and I tried to write it here. If you must waste your time reading this blog, read that one not this one. But going from the sublime to the ridiculous, I have been a twit (but honestly not a troll) on Twitter. I said I thought we don't need the concept of a derivative (in the simple case of a scalar function of a scalar, the limit as delta x goes to zero of the ratio delta y over delta x -- I insult you with the definition just to be able to write that my tweet got very very ratioed).

In Avatars of the Tortoise II I argued that we can consider space-time to be a finite set of points, with each point in space the same little distance from its nearest neighbors and each unit of time the same discrete jump littleT from the most recent past. If I am right, we don't need derivatives to analyze functions of space, or of time, or position as a function of time (velocity and acceleration and such). In such a model there are only slopes: any sequence of steps which goes to zero gets to zero after a finite number of steps, and the formula for a derivative would have to include 0/0.

I will make more arguments against derivatives. First I will say that we learn nothing useful if we know the first, second, ... nth ... derivative of a function at X. Second I will argue that we can do what we do with derivatives using slopes. Third I will argue that current actual applied math consists almost entirely of numerical simulations on computers, which are finite state automata and which do not, in fact, handle continua when doing the simulating. They take tiny little steps (just as I propose).

I am going to make things simple (because I don't type so good and plain ASCII formulas are a pain). I will consider scalar functions of scalars (so the derivative will be AP Calculus AB level). I will also consider only derivatives at zero.

f'(0) = limit as x goes to zero of (f(x)-f(0))/x

that is, for any positive epsilon there is a positive delta so small that if |x| < delta then |f'(0) - (f(x)-f(0))/x| < epsilon.

To go back to Avatars of the Tortoise I, another equally valid definition of a derivative at zero is the following: consider the infinite sequence x_t = (-0.5)^t.

f'(0) = the limit as t goes to infinity of (f(x_t)-f(0))/x_t that is, for any positive epsilon there is a positive N so big that, if t>N then |f'(0) - (f(x_t)-f(0))/x_t|< epsilon.

So we have again the limit as t goes to infinity and the large enough N, with no way of knowing if the t which interests us (say 10^1000) is large enough. Knowing the limit tells us nothing about the billionth element. The exact same number is the billionth element of a large infinity of sequences, some of which converge to A for any real number A, so any number is as valid an asymptotic approximation as any other, so none is valid.
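The point can be demonstrated in a few lines of Python (the sequence here is my own invented example): for any target limit A there is a sequence whose 10th element is 7 but whose limit is A.

```python
# For any target limit A, build a sequence whose kth element is a fixed
# value but which converges to A. (Names and the formula are my invention.)
def make_sequence(A, value=7.0, k=10):
    """x_t = A + (value - A) * (k / t): the kth element is value, the limit is A."""
    return lambda t: A + (value - A) * (k / t)

for A in [0.0, 10.0, 3.14159]:
    x = make_sequence(A)
    print(A, x(10), x(10**9))  # x(10) is always 7; x(10**9) is very near A
```

Knowing only that "the limit is A" therefore pins down nothing about the element you actually care about.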

Now very often the second to last step of the derivation of a derivative includes an explicit formula for f'(0) - (f(x)-f(0))/x, and then the last step consists of proving it goes to zero by finding a small enough delta as a function of epsilon. That formula right near the end is useful. The derivative is not. Knowing that there is a delta is not useful if we have no idea how small it must be.

In general, for any delta no matter how small and any epsilon no matter how small, there is a function f such that |f'(0) - (f(delta)-f(0))/delta| > 1/epsilon (I will give an example soon). "For any function there is a delta" does not imply "there is a delta which works for any function." The second would be useful. The first is not always useful.

One might consider the first, second, ... nth derivatives and an nth order Taylor series approximation which I will call TaylorN(x)

for any N no matter how big, for any delta no matter how small for any epsilon no matter how small, there is a function f such that |TaylorN(delta) - f(delta)|>1/epsilon

for example consider the function f such that

f(0) = 0, if x is not zero f(x) = (2e/epsilon)e^(-(delta^2/x^2))

f(delta) = 2/epsilon > 1/epsilon.

f'(0) is the limit as x goes to zero of

(2e/epsilon)(2delta^2/x^3)e^(-(delta^2/x^2)) = 0.

The nth derivative is the limit as x goes to zero of a polynomial in 1/x times e^(-(delta^2/x^2)), and so equals zero.

The Nth order Taylor series approximation of f(x) equals zero for every x. For x = delta it is off by 2/epsilon > 1/epsilon.
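A numerical check of this counterexample (the particular delta and epsilon are arbitrary choices of mine):

```python
import math

delta, eps = 0.01, 0.1   # arbitrary choices of distance and error bound

def f(x):
    """The counterexample: f(0) = 0, f(x) = (2e/eps) * exp(-delta^2 / x^2)."""
    if x == 0.0:
        return 0.0
    return (2 * math.e / eps) * math.exp(-(delta ** 2) / (x ** 2))

# Difference quotients collapse toward 0 (every derivative at 0 is 0)...
for x in [delta / 10, delta / 20, delta / 40]:
    print(x, (f(x) - f(0)) / x)
# ...yet at x = delta the function equals 2/eps = 20, far from the
# Taylor approximation, which is identically zero.
print(f(delta))
```

So all the derivatives at zero are useless for predicting f at the distance delta.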

No matter how small the distance from zero and no matter how large the allowed error, there is an example in which the Nth order Taylor series approximation is off by more than that error at that distance.

Knowing all the derivatives at zero, we know nothing about f at any particular x other than zero. Again: for any function, for any epsilon, there is a delta, but there isn't one delta that works for every function. Knowing all the derivatives tells us nothing about how small that delta must be, so nothing we can use.

So if things are so bad, why does ordinary calculus work so well? It works for a (large) subset of problems. People have learned about them and how to recognise them, either numerically or with actual experiments or empirical observations. But that successful effort involved numerical calculations (that is, arithmetic not calculus) or experiments or observations. It is definitely Not a mathematical result that the math we use works. Indeed there are counterexamples (of which I presented just one).

Part 2 of 3 (not infinite even if it seems that way, but 3). If the world is observationally equivalent to a world with a finite set of times and places, then everything in physics is a slope. More generally, we can do what we do with derivatives and such stuff with discrete steps and slopes. We know this because that is what we do when faced with hard problems without closed form solutions. We hand them over to computers, which consider a finite set of numbers with a smallest step.

And that quickly gets me to part 3 of 3 (finally). One person on Twitter says we need to use derivatives etc. to figure out how to write the numerical programs we actually use in applications. This is an odd claim. I can read (some) source code (OK, barely source code literate as I am old, but some). I can write (some) higher higher language source code. I can force myself to think in some (simple higher higher language) source code (although in practice I use derivatives and such like). Unpleasant but not impossible.

Someone else says we use derivatives to know if the simulation converges or, say, if a dynamical system has a steady state which is a sink or stuff like that. We do, but there is no theorem that this is a valid approach, and there are counterexamples (basically based on the super simple one I presented). All that about dynamics is about *local* dynamics and is valid if you start out close enough, and there is no general way to know how close is close enough. In practice people have found cases where linear and Taylor series (and numerical) approximations work and other cases where they don't (consider chaotic dynamical systems with positive Lyapunov exponents, and no I will not define any of those terms).

Always the invalid pretend pure math is tested with numerical simulations or experiments or observations. People learn when it works and tell other people about the (many) cases where it works and those other people forget the history and pour contempt on me on Twitter.

### Avatars of the Tortoise II

He concluded "We (the indivisible divinity that works in us) have dreamed the world. We have dreamed it resistant, mysterious, visible, ubiquitous in space and firm in time, but we have allowed slight, and eternal, bits of the irrational to form part of its architecture so as to know that it is false."

I think rather that we have dreamed of infinity which has no necessary role in describing the objective universe which is "resistant, mysterious, visible, ubiquitous in space and firm in time*".

First, the currently favored theory is that space is not, at the moment, infinite but rather is a finite hypersphere. There was a possibility that time might end in a singularity as it began, but the current view is that the universe will expand forever. Bummer. I note however that the 2nd law of thermodynamics implies that life will not last forever (asymptotically we will "all" be dead, "we" referring to living things not just currently living people). So I claim that there is a T so large that predictions of what happens after T can never be tested (as there will be nothing left that can test predictions).

However it is still arguable (by Blake) that we can find infinity in a grain of sand and eternity in an hour. Indeed when Blake wrote, that was the general view of physicists (philosophy makes the oddest bedfellows), as time was assumed to be a continuum with infinitely many distinct instants in an hour.

Since then physicists have changed their mind -- the key word above was "distinct" which I will also call "distinguishable" (and I dare the most pedantic pedant (who knows who he is) to challenge my interchanging the two words which consist of different letters).

The current view is that (delta T)(delta E) >= h/(4 pi), where delta T is the uncertainty in time of an event, delta E is the uncertainty in energy involved, h is Planck's constant, pi is the ratio of the circumference of a circle to its diameter, and damnit you know what 4 means.

delta E must be less than Mc^2 where M is the (believed to be finite) mass of the observable universe. So there is a minimum delta T which I will call littleT. A universe in which time is continuous (and an hour contains an infinity of instants) is observationally equivalent to a universe in which time (from the big bang) is a natural number times littleT. The time from the big bang to T can be modeled as a finite number of discrete steps just as well as it can be modeled as a continuum of real numbers. This means that the question of which of these hypothetical possibilities time really is, is a metaphysical question not a scientific question.
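As a back-of-envelope check (my own arithmetic, not from the post; the mass figure is a rough commonly quoted estimate), littleT comes out absurdly small:

```python
import math

# littleT = h / (4 pi M c^2): the minimum distinguishable time if
# delta E is capped at M c^2. The mass M is a rough order-of-magnitude
# estimate for the observable universe, an assumption of mine.
h = 6.62607015e-34      # Planck's constant, J s
c = 2.99792458e8        # speed of light, m / s
M = 1e53                # mass of the observable universe, kg (rough estimate)

littleT = h / (4 * math.pi * M * c**2)
print(littleT)          # about 6e-105 seconds
```

An hour would then contain an enormous but finite number of distinguishable instants.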

Now about that grain of sand. there is another formula

(delta X)(delta P) >= h/(4 pi)

X is the location of something, P is its momentum. |P|, and therefore delta P, is less than or equal to Mc/2 where M is the mass of the observable universe. The 2 appears because total momentum is zero. This means that there is a minimum delta X, and a model in which space is a lattice consisting of a zero-dimensional, countable set of separated points is observationally equivalent to the standard model in which space is a 3 dimensional real manifold. Again the question of what space really is is metaphysical not scientific.

Recall that space is generally believed to be finite (currently a finite hypersphere). It is expanding. At T it will be really really big, but still finite. That means the countable subset of the 3 dimensional manifold model implies a finite number of different places. No infinity in the observable universe, let alone in a grain of sand.

There are other things than energy, time, space and momentum. I am pretty sure they can be modeled as finite sets too (boy am I leading with my chin there).

I think there is a model with a finite set of times and of places which is observationally equivalent to the standard model and, therefore, just as scientifically valid. Except for metaphysics and theology, I think we have no need for infinity. I think it is not avatars of the tortoise all the way down.

*note: not ubiquitous in time, as there was a singularity some time ago.

## Wednesday, March 06, 2024

### Asymptotically we'll all be dead II

Asymptotically we'll all be dead didn't get much of a response, so I am writing a simpler post about infinite series (which is the second in a series of posts which will not be infinite). First some literature: "Avatars of the Tortoise" is a brilliant essay by Jorge Luis Borges on paradoxes and infinity. Looking at an idea, or metaphor (I dare not type meme), over centuries was one of his favorite activities. In this case, it was alleged paradoxes based on infinity. He wrote "There is a concept which corrupts and upsets all others. I refer not to Evil, whose limited realm is that of ethics; I refer to the infinite."

When I first read "Avatars of the Tortoise" I was shocked that the brilliant Borges took Zeno's non paradox seriously. The alleged paradox is based on the incorrect assumption that a sum of an infinite number of intervals of time adds up to forever. In fact, infinite sums can be finite numbers, but Zeno didn't understand that.

Zeno's story is (roughly translated and with updated units of measurement)

consider the fleet footed Achilles on the start line and a slow tortoise 100 meters ahead of him. Achilles can run 100 meters in 10 seconds. The tortoise crawls forward one tenth as fast. The start gun goes off. In 10 seconds Achilles reaches the point where the tortoise started, but the tortoise has crawled 10 meters (this would only happen if the tortoise were a male chasing a female or a female testing the male's fitness by running away - they can go pretty fast when they are horny).

So the race continues to step 2. Achilles reaches the point where the tortoise was after 10 seconds in one more second, but the tortoise has crawled a meter.

Step 3, Achilles runs another meter in 0.1 seconds, but the tortoise has crawled 10 cm.

The time until Achilles passes the tortoise is an infinite sum. Silly Zeno decided that this means that Achilles never passes the tortoise, that the time until he passes him is infinite. In fact a sum of infinitely many numbers can be finite -- in this case 10/(1-0.1) = 100/9 < infinity.
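The "infinite" sum is a finite computation in practice; a few lines of Python (my own illustration) add up Zeno's step times:

```python
# Zeno's step times are 10, 1, 0.1, 0.01, ... seconds. Summing enough
# of them gets as close as you like to 10 / (1 - 0.1) = 100/9 seconds.
total, step = 0.0, 10.0
for _ in range(60):
    total += step
    step *= 0.1
print(total, 100 / 9)  # both about 11.111... seconds
```

After 60 steps the remaining tail is far below floating-point resolution: Achilles passes the tortoise a bit after 11.1 seconds.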

Now infinite sums can play nasty tricks. Consider a series x_t, t going from 1 to infinity. If the series converges to x, but does not converge absolutely (so sum |x_t| goes to infinity), then one can make the series converge to any number at all by changing the order in which the terms are added. How can this be, given that addition is commutative? Now that's a bit of a paradox.

The proof is simple; let's make it converge to A. First note that the positive terms must add to infinity and the negative terms add to minus infinity (so that they cancel enough for the series to converge).

Now add positive terms until sumsofar > A (if A is negative this requires 0 terms). Now add negative terms until sumsofar < A, then positive terms until sumsofar > A again, and so on. Since the terms of a convergent series go to zero, the overshoots shrink and sumsofar converges to A.
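Here is a sketch of that greedy construction in Python (my own code, using the alternating harmonic series 1 - 1/2 + 1/3 - ... as the conditionally convergent series):

```python
# Rearrange the alternating harmonic series to converge to any target A:
# add positive terms while the partial sum is at or below A, negative
# terms while it is above A.
def rearranged_partial_sum(A, n_terms=100000):
    pos = (1.0 / k for k in range(1, 10**8, 2))    # 1, 1/3, 1/5, ...
    neg = (-1.0 / k for k in range(2, 10**8, 2))   # -1/2, -1/4, -1/6, ...
    s = 0.0
    for _ in range(n_terms):
        s += next(pos) if s <= A else next(neg)
    return s

for A in [0.0, 2.0, 3.14159]:
    print(A, rearranged_partial_sum(A))  # partial sums land near each target
```

The same terms, added in different orders, head toward completely different limits.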

That's one of the weird things infinity does. I think that everything which is strongly counterintuitive in math has infinity hiding somewhere (no counterexamples have come to my mind and I have looked for one for decades).

Now I say that the limit of a series (the partial sums from t = 1 to T of the original series) as T goes to infinity is not, in general, of any practical use, because in the long run we will all be dead. I quote from "Asymptotically we'll all be dead":

Consider a simple problem of a series of numbers X_t (not stochastic, just deterministic numbers). Let's say we are interested in X_1000. What does knowing that the limit of X_t as t goes to infinity is 0 tell us about X_1000? Obviously nothing. I can take a series and replace X_1000 with any number at all without changing the limit as t goes to infinity.

Also not only does the advice "use an asymptotic approximation" often lead one astray, it also doesn't actually lead one. The approach is to imagine a series of numbers such that X_1000 is the desired number and then look at the limit as t goes to infinity. The problem is that the same number X_1000 is the 1000th element of an, uh, large infinity of different series. One can make up a series such that the limit is 0 or 10 or pi or anything. The advice "think of the limit as t goes to infinity of an imaginary series with a limit that you just made up" is as valid an argument that X_1000 is approximately zero as it is that X_1000 is pi; that is, it is an obviously totally invalid argument.

An example is a series whose first googol (10^100) elements are one googol, so x_1000000 = 10^100, and the later elements are zero. The series converges to zero. If one uses the limit as t goes to infinity as an approximation when thinking of x_999, then one concludes that 10^100 is approximately zero.

The point is that the claim that a series goes to x is the claim that (for that particular series) for any positive epsilon, there is an N so large that if t > N then |x_t - x| < epsilon.

Knowing only the limit as t goes to infinity, we have no idea how large an N is needed for any epsilon, so we have no idea if the limit is a useful approximation to anything we will see if we read the series for a billion years.

Now often the proof of the limit contains a useful assertion towards the end. For example, one might prove that |x_t - X| < A/t for some A. The next step is to note that the limit as t goes to infinity of x_t is X. This last step is a step in a very bad direction, going from something useful to a useless implication of the useful statement.

Knowing A we know that N = floor(A/epsilon). That's a result we can use. It isn't as elegant as saying something about limits (because it includes the messy A and often includes a formula much messier than A/t). However, unlike knowing the limit as t goes to infinity it might be useful some time in the next trillion years.
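A toy illustration (the series x_t here is my own invented example satisfying |x_t - X| < A/t):

```python
import math

# With an explicit rate |x_t - X| < A/t, the usable sample size is
# N = floor(A / epsilon): for any t > N the error is provably below epsilon.
A, X = 50.0, 1.0
x = lambda t: X + A / (t + 1)   # invented example; |x_t - X| = A/(t+1) < A/t
eps = 0.001

N = math.floor(A / eps)
print(N, abs(x(N + 1) - X))     # the error just past N is already below eps
```

The limit statement alone ("x_t goes to X") gives no such N; the rate A/t does.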

In practice limits are used when it seems clear from a (finite) lot of calculations that they are good approximations. But that means one can just do the many but finite number of calculations and not bother with limits or infinity at all.

In this non-infinite series of posts, I will argue that the concept of infinity causes all sorts of fun puzzles, but is not actually needed to describe the universe in which we find ourselves.

## Monday, February 19, 2024

### Asymptotically We'll all be Dead

I assert that asymptotic theory and asymptotic approximations have nothing useful to contribute to the study of statistics. I therefore reject the vast bulk of mathematical statistics as absolutely worthless.

To stress the positive, I think useful work is done with numerical simulations -- Monte Carlos in which pseudo-data are generated with pseudo-random number generators and extremely specific assumptions about data generating processes, statistics are calculated, then the process is repeated at least 10,000 times and the pseudo-experimental distribution is examined. A problem is that computers only understand simple precise instructions. This means that the Monte Carlo results hold only for very specific (clearly false) assumptions about the data generating process. The approach used to deal with this is to make a variety of extremely specific assumptions and consider the set of distributions of the statistic which result.
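A minimal sketch of the procedure just described (all specific choices -- the DGP, the statistic, the sample size, the number of replications -- are mine):

```python
import random
import statistics

def one_statistic(n=30):
    """One Monte Carlo replication: pseudo-data from one very specific
    (clearly false) DGP, then the statistic computed on it."""
    sample = [random.gauss(0.0, 1.0) for _ in range(n)]  # the assumed DGP
    return statistics.mean(sample)                        # the statistic

random.seed(0)
draws = sorted(one_statistic() for _ in range(10000))
# Examine the pseudo-experimental distribution, e.g. a central 95% interval:
print(draws[250], draws[9750])
```

To probe robustness, one reruns this with other DGPs (fat tails, skewness, dependence) and compares the resulting distributions.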

I think this approach is useful and I think that mathematical statisticians all agree. They all do this. Often there is a long (often difficult) analysis of asymptotics, then the question of whether the results are at all relevant to the data sets which are actually used, then an answer to that question based on monte carlo simulations. This is the approach of the top experts on asymptotics (eg Hal White and PCB Phillips).

I see no point in the sections of asymptotic analysis which no one trusts, and note that the simulations often show that the asymptotic approximations are not useful at all. I think they are there for show. Simulating is easy (many many people can program a computer to do a simulation). Asymptotic theory is hard. One shows one is smart by doing asymptotic theory which one does not trust and which is not trustworthy. This reminds me of economic theory (most of which I consider totally pointless).

OK so now against asymptotics. I will attempt to explain what is done -- I will use the two simplest examples. In each case, there is an assumed data generating process and a sample size of data (henceforth N) which one imagines is generated. Then a statistic is estimated (often this is a function of the sample size and an estimate of a parameter of a parametric class of possible data generating processes). The statistic is modified by a function of the sample size (N). The result is a series of random variables (or one could say a series of distributions of random variables). The function of the sample size N is chosen so that the series of random variables converges in distribution to a random variable (convergence in distribution is convergence of the cumulative distribution function at all points where the limit distribution has no atoms, that is, where its CDF is continuous).

One set of examples (usually described differently) are laws of large numbers. A very simple law of large numbers assumes that the data generating process is a series of independent random numbers with identical distributions (iid). It is assumed (in the simplest case) that this distribution has a finite mean and a finite variance. The statistic is the sample average. As N goes to infinity it converges to a degenerate distribution with all weight on the population average. It is also true that the sample average converges to a distribution with mean equal to the population mean and variance going to zero -- that is, for any positive epsilon there is an N1 so large that the variance of the sample mean is less than epsilon (convergence in quadratic mean). Also, for any positive epsilon there is an N1 so large that if the sample size N > N1 then the probability that the sample mean is more than epsilon from the population mean is itself less than epsilon (convergence in probability). The problem is that there is no way to know what N1 is. In particular, it depends on the underlying distribution. The population variance can be estimated using the sample variance. This is a consistent estimate, so that the difference is less than epsilon if N > N2. The problem is that there is no way of knowing what N2 is.
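A small simulation (my own, with two invented distributions) makes the point that the N needed depends on the underlying distribution:

```python
import random
import statistics

def fraction_within(draw, n, eps=0.1, reps=2000):
    """Fraction of Monte Carlo replications in which the sample mean of
    n draws lands within eps of the true mean (zero for both DGPs below)."""
    hits = 0
    for _ in range(reps):
        m = statistics.mean(draw() for _ in range(n))
        hits += abs(m) < eps
    return hits / reps

random.seed(0)
narrow = lambda: random.gauss(0.0, 1.0)     # variance 1
wide   = lambda: random.gauss(0.0, 100.0)   # variance 10,000
print(fraction_within(narrow, 400))         # close to 1: N = 400 is plenty
print(fraction_within(wide, 400))           # close to 0: N = 400 is hopeless
```

Both distributions satisfy the law of large numbers; the theorem is silent about which N makes the sample mean trustworthy.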

Another very commonly used asymptotic approximation is the central limit theorem. Again I consider a very simple case of an iid random variable with a mean M and a finite variance V.

In that case (sample mean - M)N^0.5 will converge in distribution to a normal with mean zero and variance V. Again, there is no way to know what the required N1 is. For some iid distributions (say binary 1 or 0 with probability 0.5 each, or uniform from 0 to 1) N1 is quite low and the distribution looks just like a normal distribution for a sample size around 30. For others the distribution is not approximately normal for a sample size of 1,000,000,000.
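A sketch of that claim (the distributions and sample size are my own choices; skewness is used as a crude measure of non-normality):

```python
import random
import statistics

def skew_of_scaled_means(draw, mean, n=30, reps=5000):
    """Sample skewness of (sample mean - mean) * n^0.5 over many replications.
    A normal distribution has skewness 0."""
    zs = []
    for _ in range(reps):
        m = statistics.mean(draw() for _ in range(n))
        zs.append((m - mean) * n ** 0.5)
    mu, sd = statistics.mean(zs), statistics.pstdev(zs)
    return statistics.mean(((z - mu) / sd) ** 3 for z in zs)

random.seed(0)
# Uniform on [0, 1]: already near normal at n = 30 (skewness near 0).
print(skew_of_scaled_means(random.random, 0.5))
# Bernoulli with p = 0.001: wildly skewed at n = 30, nowhere near normal.
print(skew_of_scaled_means(lambda: 1.0 if random.random() < 0.001 else 0.0, 0.001))
```

Same theorem, same n, completely different quality of approximation.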

I have criticisms of asymptotic analysis as such. The main one is that N has not gone to infinity. Also we are not immortal and may not live long enough to collect N1 observations.

Consider an even simpler problem of a series of numbers X_t (not stochastic, just deterministic numbers). Let's say we are interested in X_1000. What does knowing that the limit of X_t as t goes to infinity is 0 tell us about X_1000? Obviously nothing. I can take a series and replace X_1000 with any number at all without changing the limit as t goes to infinity.

Also not only does the advice "use an asymptotic approximation" often lead one astray, it also doesn't actually lead one. The approach is to imagine a series of numbers such that X_1000 is the desired number and then look at the limit as t goes to infinity. The problem is that the same number X_1000 is the 1000th element of an, uh, large infinity of different series. One can make up a series such that the limit is 0 or 10 or pi or anything. The advice "think of the limit as t goes to infinity of an imaginary series with a limit that you just made up" is as valid an argument that X_1000 is approximately zero as it is that X_1000 is pi; that is, it is an obviously totally invalid argument.

This is a very simple example, however there is the exact same problem with actual published asymptotic approximations. The distribution of the statistic for the actual sample size is one element of a very large infinity of possible series of distributions. Equally valid asymptotic analysis can imply completely different assertions about the distribution of the statistic for the actual sample size. As they can't both be valid and they are equally valid, they both have zero validity.

An example. Consider an autoregressive process x_t = (rho)x_(t-1) + epsilon_t where epsilon_t is an iid random variable with mean zero and finite variance. There is a standard result that if rho is less than 1 then (rhohat-rho)N^0.5 converges in distribution to a normal. There is a not so standard result that if rho = 1 (a random walk) then (rhohat-rho)N^0.5 goes to a degenerate distribution equal to zero with probability one, and (rhohat-rho)N goes to a strange distribution called a unit root distribution (with the expected value of (rhohat-rho)N less than 0).
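A simulation sketch of the two regimes (my own code, not from any paper; rhohat is OLS without a constant):

```python
import random

def rhohat(rho, n=1000):
    """OLS estimate (no constant) of rho from one simulated AR(1) sample."""
    x, num, den = 0.0, 0.0, 0.0
    for _ in range(n):
        x_new = rho * x + random.gauss(0.0, 1.0)
        num += x * x_new
        den += x * x
        x = x_new
    return num / den

random.seed(0)
for rho, scale in [(0.5, 1000 ** 0.5), (1.0, 1000)]:
    draws = sorted((rhohat(rho) - rho) * scale for _ in range(1000))
    # Median of the scaled error: near 0 in the stationary case, clearly
    # negative in the unit root case (the unit root distribution is skewed left).
    print(rho, draws[500])
```

The simulated distribution for rho = 1 is visibly asymmetric, which is the "strange distribution" the asymptotic theory describes.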

Once I came late to a lecture on this and casually wrote "converges in distribution to a normal with mean zero and variance V" before noticing that the usual answer was not correct in this case. The professor was the very brilliant Chris Cavanagh, who was one of the first two people to prove the result described below (and was not the first to publish).

Doug Elmendorf, who wasn't even in the class and is very very smart (and later head of the CBO), asked how there can be such a discontinuity at rho = 1 when, for a sample of a thousand observations, there is almost no difference in the joint probability distribution, or in any possible statistic, between rho = 1 and rho = 1 - 10^(-100). Prof Cavanagh said that was his next topic.

The problem is misuse of asymptotics (or, according to me, use of asymptotics). Note that the question explicitly referred to a sample size of 1000, not a sample size going to infinity.

So if rho = 0.999999 = 1 - 1/(a million) then rho^1000 is about 1, but rho^10000000000 is about zero. Taking N to infinity implies that, for a rho very slightly less than one, almost all of the regression coefficients of X_t2 on X_t1 (with t1 < t2) are approximately zero, because almost all of the pairs end up very far apart. But the distribution of rhohat for the actual sample of 1000 is an element of many different series of distributions.

One of them is a series where Rho varies with the sample size N so Rho_N = 1-0.001/N

For N = 1000, rho_N = 0.999999, so the distribution of rhohat for the sample of 1000 is just the same as before. However, the series of random variables (rhohat-rho)N^0.5 does not converge to a normal distribution -- it converges to a degenerate distribution which is 0 with probability 1.

In contrast, (rhohat-rho)N converges to a unit root distribution for this completely different series of random variables, which has the exact same distribution for the sample size of 1000.

There are two completely different equally valid asymptotic approximations.

So Cavanagh decided which to trust by running simulations and said his new asymptotic approximation worked well for large finite samples and the standard one was totally wrong.

See what happened ? Asymptotic theory did not answer the question at all. The conclusion (which is universally accepted) was based on simulations.

This is standard practice. I promise that mathematical statisticians will go on and on about asymptotics then check whether the approximation is valid using a simulation.

I see no reason not to cut out the asymptotic middle man.

## Wednesday, October 25, 2023

### Ex Vivo culturing of NK cells and infusion

## Thursday, October 19, 2023

### CAR T-Cell III

It would be best to induce central memory T-cells rather than effector memory T-cells, but I think memory phenotype of either type might do.

### convertible CARs

This second post is a semi-crazy idea about making off the shelf CAR T-cells rather than modifying cells from the patient. The cost of the patient specific therapy is not prohibitive even now and should go down the learning curve. However, my proposal of multiple modifications would add to the cost and why do them again and again ?

So the idea is to make a CAR T-cell line which will not be rejected by the patient even though the CAR T-cells are made with someone else's T-cells with different surface antigens, especially different HLA antigens. Long ago my late father thought of deleting the beta 2 microglobulin gene so that HLA A, B, and C would not be expressed on the surface. Here I make a much more radical proposal (which will never be allowed, so it is just for a blog post).

The off the shelf CAR T-cells can be designed to express the do-not-kill-me signal PDL1. As I already proposed that the receptor PD1 be deleted, these cells will not tell each other not to kill. I think that these cells could be infused into anyone and would function. They would also be dangerous -- if some became leukemic, dealing with them would have to include anti PDL1. Recall that I propose inserting herpes TK into the super CAR T-cells so that they can be killed, if necessary, with ganciclovir. That would be even more clearly needed with the PDL1 expressing super CAR T-cells.

## Wednesday, October 18, 2023

### Hot Rod CARs

This approach has been very successful in treating leukemia, but not so successful in treating solid tumors -- the tumor microenvironment is not hospitable to killer T-cells. There are a large number of known aspects of the tumor microenvironment which tend to protect tumors from activated killer T-cells.

1) Perhaps the most important is myeloid derived suppressor cells -- these are immature granulocytes and macrophages which are attracted to the tumor. Among other things, they produce anti-inflammatory IL-10, and also produce the free radical nitric oxide (NO).

2) Tumor infiltrating T-regs which produce and display anti-inflammatory TGF beta.

3) Cancer cells display checkpoint "don't kill me signals" including PDL1 and CTLA4 ligand.

4) There are generally low Oxygen, low glucose, low Ph, and high lactic acid levels.

Many of the issues involve specific interaction with specific receptors on the T-cells (eg PD1, CTLA4, IL10 receptor, TGF beta receptor). I think that, since one is already genetically modifying the T-cells, one can also delete those receptors so they do not respond to the anti-inflammatory signals. The NO issue is different -- it is a non-specific oxidizing agent. I think here one can make cells which always produce the antioxidant response by deleting KEAP1, which inactivates NRF2, which triggers the antioxidant response.

So I think it is possible to produce souped up CARs which invade solid tumors.

There is a potential risk of putting killer T-cells which can't be regulated into a patient, so I would also insert the gene for herpes TK so they can be specifically killed by ganciclovir.

This approach makes sense to me. It involves a whole lot of work aiming at a possible future approval of a clinical trial. I can see why it hasn't been done (and will have another post about reducing the cost and effort involved) but I think it makes sense to try.

## Monday, October 02, 2023

### MMLF Founding Manifesto

The liberation of such mosquitoes is one way to fight malaria. They (and similarly modified members of other species of anopheles mosquitoes) can eliminate malaria.

However they can't do that imprisoned in lab cages. They are not released because of who? WHO. It is agreed that the important and allegedly for some reason risky decision must be made after careful thorough consideration, and that release occur only when all affected countries (which are numerous, as mosquitoes don't respect international boundaries) agree.

That is probably roughly never, and certainly not until there have been millions more unnecessary deaths.

I think the modified mosquitoes should be liberated using any means necessary.

## Saturday, March 19, 2022

### Elisabeth, Essex, and Liberty Valence in Lammermoor

## Thursday, March 04, 2021

### Dr Seuss & Brain Washing

## Sunday, February 28, 2021

### Who to be mad at

by hand

## Friday, December 18, 2020

So far the efficacy data have been presented. As reported in the press earlier, the vaccine is roughly 95% effective; that is, roughly 95% of the people who got Covid 19 during the trial were participants who received the placebo.
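As a back-of-the-envelope check: efficacy is just one minus the ratio of attack rates in the vaccine and placebo arms. A minimal sketch, with hypothetical case counts chosen only to land near the reported ~95% (not the actual trial data):

```python
def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
    """Point estimate of vaccine efficacy from case counts:
    1 - (attack rate, vaccine arm) / (attack rate, placebo arm)."""
    rate_vax = cases_vax / n_vax
    rate_placebo = cases_placebo / n_placebo
    return 1.0 - rate_vax / rate_placebo

# Hypothetical counts: equal-sized arms, 8 cases among the vaccinated
# versus 162 among the placebo recipients.
ve = vaccine_efficacy(8, 20000, 162, 20000)
print(round(ve, 3))  # -> 0.951
```

With equal arms this reduces to one minus the ratio of case counts, which is why "roughly 95% of cases were in the placebo arm" and "roughly 95% effective" are nearly the same statement.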

Importantly, the null hypothesis that just one dose is just as good as two was not rejected. The test of this null had extremely low power, as almost all participants received both doses, so basically the comparison rests on cases occurring less than 4 weeks after the first dose. However, note the extreme rigidity of the FDA.

Before allowing vaccination, the FDA required proof of efficacy. Before allowing a modification from two doses 4 weeks apart to one dose, the FDA requires … I don’t know maybe if Jesus Christ returned and petitioned them for some flexibility, they would give Him a hearing, but I guess they would tell him he needed to propose (and fund) a new Phase III trial.

It is also true that there is no evidence of benefit from the second dose of Pfizer’s vaccine. It is clear that people who have received one dose of either vaccine are among those least at risk of Covid 19.

The vaccines are in very short supply. People are anxiously waiting for vaccination. Because the protocol had two doses, half of the vaccine will be reserved for the people who will benefit least.

Here there is a difference between careful science and optimal policy. In science it is crucial to write the protocol first then follow it mechanically. This is necessary so that the experimental interventions are exogenous and one can be sure they cause the observed outcomes and are not caused by observations.

However, it is not optimal policy to reduce the possible decisions to two, a priori, with extremely limited data. This is what the FDA does. I think they should approve a single dose. Their rule is always to act only on extremely firm knowledge. It is, in this case, not going to be first do no harm. The second dose has side effects (mild but not zero). There is, I think, no evidence of benefits. (Again, the test has extremely low power (and I'm not sure the protocol did not say the question would be addressed — if it didn't then there is a problem — the rule to decide what to do in advance applies to data analysis too — it is vital that the data not be dredged looking for a significant coefficient).) I think the point estimate is pretty much exactly zero benefit.

I think that people should be given a single dose. After everyone who wants one dose has been vaccinated, then it makes sense to give people a second dose. There is no reason to think spacing 4 weeks apart is optimal — the spacing was decided in advance.

The next speaker discussed safety. There is 0 evidence that vaccination increases the risk of anaphylactic shock. There were two cases: one person who suffered anaphylaxis received the placebo and one received the vaccine. The most common side effect was pain. There were no cases of severe side effects. People with a history of anaphylaxis were *not* excluded from the study.

Now a third speaker argues for unblinding the study and giving the vaccine to participants who were given the placebo. They can drop out and just get the vaccine when it is their turn. Losing the control group is not ideal but attrition will make it useless soon anyway (people will not settle for 50% chance they were vaccinated when the vaccine is approved — probably tomorrow). I agree, they have enough data and it is not ethical to leave people unvaccinated just as a control group.

Now they open for discussion with a few members of the public allowed to ask questions (the law requires this). I muted. Now they have taken a pause.

My question is why not give people just one dose until everyone who wants it has been vaccinated once ? I see no basis at all for allocating the scarce vaccine to a second dose. The scientific method does not say that optimal policy requires sticking to a protocol written before data were collected. The first do no harm principle (which I absolutely oppose in general) would imply giving one dose until there is evidence of benefit of a second dose.

Consider the case of tests for Covid 19. The test kits sent out by the CDC contained powder in tubes. One tube was the positive control — it was supposed to contain DNA with sequences corresponding to the Sars Cov2 RNA genome sequences. One of the tubes, which was supposed to contain one of the 3 oligonucleotides to be used, was contaminated with traces of that DNA. The result was that the kit as shipped reported that distilled water was infected with Sars Cov2. The hospital labs which got the kits almost immediately figured out that they could test with valid results if they didn't use the material in the contaminated tube and just used 2 oligonucleotides. They could determine who had Covid 19 using the kit. But that was a modified protocol which was not FDA approved, so the FDA did not allow them to do this. The FDA also did not approve dozens of tests which were developed by the private sector.

Here the FDA's decision that they would rather be safe than sorry kept the US blind to Covid for … I think maybe a couple of weeks. "Don't look, because you haven't proven that your glasses have exactly the right prescription" is not good advice to someone on a highway. This was a very bad problem. I think the lesson learned is not that even the CDC lab sometimes makes mistakes. It was that rigidity and refusing permission is not the way to safety.

Since then, I have been very favorably impressed by the FDA's efforts. But today I want more — I mean less — I mean approving less and allowing more flexibility. I see no case for insisting on giving people second doses with almost exactly zero evidence of efficacy. I see no case for reserving vaccine for the people who are least at risk of Covid 19. Yet I see no chance that a single dose will be allowed.

Usual rant

In previous posts, I have objected to the confusion of the pure food and drug act with the scientific method. I note that it is simply a mistake to assert that the null hypothesis is to be treated as true until it is rejected by the data. The law says drugs are assumed ineffective until they are proved effective. That is US law, not the scientific method. In general the decision of which of 2 hypotheses to treat as the null is arbitrary and should have no implications. I am not a scientist, but I am familiar with the Neyman Pearson framework and I consider my claims about the meaning of "null hypothesis" to be as solid as my assessment that 2 + 2 = 4. Both are simple math.
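The point about low power and the arbitrariness of the null can be made concrete with a toy calculation. A sketch under stated assumptions: the case counts in the dose-1-to-dose-2 window below are hypothetical (the real window counts are not in this post). With only a handful of cases, an exact binomial test rejects neither the null "one dose does nothing" nor the opposite null "one dose gives the full ~95% efficacy", so which hypothesis ends up treated as true depends entirely on which one is labeled the null:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Hypothetical: 9 cases in the window after dose 1 and before dose 2,
# of which 2 occurred in the vaccine arm (arms of equal size).
n_cases, vax_cases = 9, 2

# Null A: one dose does nothing -> a case is equally likely in either arm.
p_A = binom_cdf(vax_cases, n_cases, 0.5)  # one-sided, lower tail

# Null B: one dose already gives ~95% efficacy -> the expected share of
# cases in the vaccine arm is 0.05 / (1 + 0.05).
p_share = 0.05 / 1.05
p_B = 1 - binom_cdf(vax_cases - 1, n_cases, p_share)  # one-sided, upper tail

# Both p-values exceed 0.05: neither null is rejected at the 5% level.
print(round(p_A, 3), round(p_B, 3))
```

Since the data are consistent with both extremes, "the null was not rejected" tells us almost nothing here; that is what a low-powered test looks like.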

## Friday, October 02, 2020

### Constitutional Nit Picking

In fact, we can blame the delegates at the Constitutional Convention (as well as the 7th Congress) for that particular offence against Democracy. Back in 1800, the Constitution, Article II Section 1, included "But in chusing the President, the Votes shall be taken by States, the Representation from each State having one Vote;"

The one state one vote rule does appear in the 12th Amendment, but it was already in the original Constitution.

A more important point is that this is only relevant if there is a 269-269 tie in the electoral college. The 12th amendment also says " The person having the greatest number of votes for President, shall be the President, if such number be a majority of the whole number of Electors appointed;" Notice "Electors appointed" not "More than the number of states plus half the number of representatives" or currently more than 269.

It is (still in spite of everything) inconceivable that the race be called before it was agreed who won the tipping point state, but if it is decided that a President elect must be declared while the winner of some state is contested, the matter will not go to the House voting one state one vote (as always results must be certified by the House voting the normal way one representative one vote).

It has not always been true that all states are represented in the electoral college. It hasn't always been true in my lifetime (I was born on November 9 1960, the day after the electors were elected on November 8 1960, but before those electors elected Kennedy). In 1960 the electors for Hawaii were never assigned, because the outcome was still contested when the electors voted. This means that Hawaii had to wait until 1964 to be represented in the electoral college after becoming a state on August 21, 1959.

## Tuesday, May 26, 2020

### Hobbes and Hegel

Hence the question, what do Hobbes and Hegel have in common ? I admit I know a bit about Hobbes, having read the first two books of Leviathan (and I bet Hobbes's mom was too bored to read the third and fourth). About Hegel I know almost exactly nothing (and more than I would like).

They are two seminal, influential writers. The vibrant discussion and debate about the social contract began with Hobbes, largely transmitted through Locke's attempt to refute him in his second treatise on government (I have read it but not his first treatise on government).

In each case, most people addicted to the big H's do not share their conclusions or general orientation. Some phrases and words live on (social contract, dialectic, historical age) while the original main point is utterly rejected.

The interesting thing is that these two genuinely revolutionary writers were reactionary. Both advocated absolute monarchy. Hobbes explicitly rejected not only the British revolution, but also the ancien regime with power divided between the King and Parliament. He regretted the defeats at Naseby and Runnymede; he contested both Cromwell and Polybius. He slashed at the division of power as sharply as Ockham.

Hegel was not so clear (the military situation has developed not necessarily to Japan's advantage). He claimed to believe that Prussia should have a constitution, and that Prussia had a constitution.

I just had an idea. I think the extraordinarily original thoughts are the result of attempting to defend the indefensible. There were a few obsolete arguments for absolute monarchy. The first was might makes right, which was challenged by facts on the ground. The second was the divine right of kings, which was hampered by the contrast between God's stubborn silence and theologians' verbosity. Something new was needed, and first the social contract and then the dialectic were new. I think the radicalism of Hobbes was made necessary by his extremely reactionary factionalism. I think the extreme abstraction and vagueness of Hegel [who should be discussed only by people who have actually read Hegel] was a new obscurantism, needed because people had ceased to look to scripture for guidance on public policy (people starting with Hobbes).

Necessity is the mother of invention, and the painful and humiliating need to find some way to defend the pretenses of a royal patron was the mother of genius.

## Sunday, April 12, 2020

### Experts and Me

I am definitely not willing to leave health care decisions to doctors. I am not talking about my own, as I am generally healthy enough. I do insist on giving advice, which is often to challenge doctors and argue with them about therapy. I am going to consider a typically good column by Michael Gerson here:

The dangerous conservative case against expertise. He has a point. In particular, conservatives are often wrong when they disagree with experts (also when they are experts). Gerson happens to be my favorite conservative. I don't know how long he will remain a self identified conservative. I think he will follow his fellow Washington Post conservative quota hires Jennifer Rubin and Max Boot to neoliberalism.

He writes well enough that it is hard to quote.

I note here that he doesn't have an argument against medicare for all *or* for the claim that Covid 19 shows why it would be good.

"a human tendency to interpret disasters as confirmation of our existing beliefs. So the coronavirus outbreak proves the need for a border wall. [skip] Not every argument is strained or spurious. The pandemic has given our health-care system an X-ray, revealing disturbing racial inequities that need to be understood and addressed. But on the whole, we are right to be wary of people who claim great tragedies as the confirmation of pet theories and previous prophesies."

He just moves on, because his job is to be a conservative critiquing conservatives. I think there is also the point that it won't be enacted and so it isn't on the agenda. In particular he critiques

"conservatives who look at the coronavirus outbreak and see, of all things, the discrediting of experts and expertise. In this view, the failures of the World Health Organization (WHO) and the Centers for Disease Control and Prevention (CDC) have brought the whole profession into disrepute. The judgments of health professionals have often been no better than the folk wisdom of the Internet. The pandemic is not only further proof of the fallibility of insiders; it has revealed the inherent inaccessibility of medical truth. All of us, scientists and nonscientists, are walking blindly on the same misty moor and may stumble on medical insights."

I think he is describing a real phenomenon and he is right to denounce it. But I have some criticisms.

First, and importantly, the internet is a medium. One can't discuss the effect of relying on the internet any more than that of relying on the spoken word or books. He has something in mind, but he doesn't quite define it. I think he is thinking of social media, of Facebook and Twitter. I get my opinions about Covid 19 and treatments from the internet, and in particular from

https://www.worldometers.info/coronavirus/

https://www.ncbi.nlm.nih.gov/pubmed/

and

https://clinicaltrials.gov/

Also, I have read a lot about how Wikipedia is unreliable, but I know of only one gross error (the article on Ricardian equivalence).

I don't think it is possible to understand the problem without addressing the particular problem of the conservabubble. Think of edited media (the MSM). The reliability of edited news media sources varies widely. Don't trust Fox News. Don't trust the National Enquirer. Don't trust The Wall Street Journal opinion pages. Don't pretend that this is a general or symmetric problem.

My other thought is: What about the FDA ? It is, like WHO and the CDC, a center of recognized expertise. It is also (unlike WHO) a center of power -- the FDA has legal authority. It has made terrible mistakes so far, and they have killed people.

In particular it is essentially 100% responsible for the delay of testing for Covid 19 in the USA. The CDC sent out test kits which had a (still not understood) problem. This would have caused at most a week of delay without the help of the FDA.

The problem appears to be with one reagent which should consist of a pair of oligonucleotides and which sometimes seems to be contaminated. I don't claim to understand what is wrong with it, but do note that it can be just left out, that the CDC test kits work fine if it is not used, and that this is one of the main ways tests are now conducted in the USA. Also all this was discovered about 1 week after the kits were mailed which was roughly 3 weeks after the Sars Cov2 sequence was published.

The further delay (roughly 2 weeks) occurred because the FDA did not authorize the use of the modified test that tossed out the problem-causing reagent. It also placed a heavy burden on approval of tests by private agents -- hospitals, universities, and commercial testing firms.

That was the (more) fatal problem (in thousands of deaths caused).

I think there are two lessons. First, there is a bias towards small-c conservatism. A delay in adopting something new is always considered acceptable. This makes no sense. Another is that expertise can be familiarity with the way things have been done (combined with the conservatism). Experts on Covid 19 testing include experts on RT-PCR and experts on the obscure FDA regulations which implied that the declaration of an emergency slowed the response to the virus. The second group of experts (call them lawyers) know about current laws, rules, and regulations. They don't know if they are optimal.

Notice that I haven't gotten close to saying that "medical truth" is inaccessible. One doesn't get there from questioning experts and especially not from questioning experts when they disagree.

OK, I quote: "And the CDC did badly mishandle the early stage of diagnostic testing." But what about the FDA ? Yes, they made some mistake at the CDC, but it caused only a third of the total delay. Such problems are inevitable -- human perfection is impossible. But refusing to allow the correct solution to be used is not inevitable. By "solution" I mean a solution with a solvent and solutes not including the powder which was supposed to be a pair of oligonucleotides (and may have been -- we don't know if it was contaminated).

another quote

"Judgments based on that information are not infallible. But they are always preferable to the aggregate opinion of the Internet."

I am quite sure there is no such thing as the aggregate opinion of the internet. Again, he is thinking of something other than the medium. Again he ignores the fact that the peer reviewed literature is available on the internet (at least the abstracts). I think he is on to something, but the entity he has in mind is not "the internet". I'd call it "the conservabubble".

Gerson also leaves out experts. He notes the experts who warned of the pandemic and who advocated social distancing. He is right. He is also right that conservatives mostly dismissed them and that Trump is a disaster. In fact, I guess I agree with him about almost everything except his definition of "experts" and "the internet".

One aspect of the scientific literature is that extreme caution is favored. Guesses are not published (unless the person guessing is really famous). Claims to have found the answer are deliberately understated (which doesn't mean they aren't often wrong -- there is a constant struggle between the professional norms and the inclinations of enthusiasts).

But this is not appropriate as a guide for action in a crisis. The default response that we don't know and we have to do more research does not suggest what to do during that research.

The actual practice is to stick to business as usual. In medicine it is to stick to the standard of care until a new therapy is proven to be superior (and the standard of proof is much stricter than the supposed proof beyond reasonable doubt which gets people locked up in prison). This makes no sense in the treatment of a disease first described 4 months ago. But it absolutely does describe actual choices made by actual doctors; here "She listened patiently to Hall and expressed her concern that his suggestions did not conform to standard medical procedure or C.D.C. guidelines." The patient had Covid 19. How could there be a standard medical procedure already ?

It is definitely a fact that doctors are very determined to define a standard of care (which is described in a document which, in cases with which I am familiar, is written by a private voluntary association of physicians). The reason for this is, I think, perfectly clear. If a patient is given care according to the standard, then bad outcomes do not imply malpractice liability. This means that sticking to the standard is sensible for the cautious physician who cares a lot about not being sued for malpractice and considers that more important than doing what, according to a posterior probability distribution updated with the available data, maximizes expected patient welfare.

This is the way it is done. The key experts include lawyers who explain malpractice liability (correctly). Then there are the standards of care, the official guidelines, whose key roles include defence in malpractice suits.

I assert that there is a problem here -- a moral problem and a policy problem.

I have been asserting this for years. As Gerson might predict, I now assert that the Covid 19 epidemic proves I was right.

## Friday, March 27, 2020

### Giving Boot the Boot

The test you cite had zero power. Not as high as one in a trillion, but 0, nada, zip, niente.

The authors did Not conclude what you claim they concluded. The paper is in Chinese but has an English abstract which you should have read before making (incorrect) claims including the word “concluded”.

In fact the authors concluded that the prognosis of Covid 19 (with conventional therapy) is good and that studies with larger sample sizes are needed. I am not quoting, but I am paraphrasing with some care and not with total ignorance as you did.

Also I am not shocked that the self declared pro life party is not pro life. They don't care. That's why many of them are blood thirsty hawks. (I am not saying all hawks are blood thirsty, and I trust that you have good intentions, but don't you remember "more rubble less trouble" (J Podhoretz) and "Bomb bomb bomb, bomb bomb Iran" (J McCain) ?)

where were you hiding ? (OK I know the conservabubble is airtight)

## Saturday, October 26, 2019

### To Be Sure Jennifer Senior

First, it begins "It's that time of the campaign season when some Democrats are starting to feel — as President Jimmy Carter might have put it — malaise." This is a reference to anecdotal evidence. It is not supported by polls of voter interest and enthusiasm or by data on the number of campaign contributions. Basically, it is Senior arguing that Democrats need her advice (the pundit's fallacy). Mainly it is a link to another op-ed by Jonathan Martin which begins "When a half-dozen Democratic donors gathered at the Whitby Hotel in Manhattan last week," so the sample size is 6. Also the "Democrats" in question are rich Democrats. Senior does not mention the possibility that rich people are out of touch with the forgotten man.

Is it the New York Times' official position that 6 rich people in Manhattan deserve more attention than, say, the 940,000 people who donated to Warren or the 1.4 million people who donated to Sanders in the third quarter (of the year *before* election year) ? If that is malaise, I don't think I could handle enthusiasm. I guess the idea is that 6 rich people in New York are more sophisticated than the small donors, because the 6 rich people are pragmatists who consider the pulse of the nation, not their enthusiasm, and so are more in touch with ordinary voters than regular people are.

Then "They're staring at their 2020 lineup and wondering whether it's a guaranteed recipe for buyer's remorse. Joe Biden is too old, Pete Buttigieg is too young, Kamala Harris is too uncertain, Bernie Sanders too unpalatable, Elizabeth Warren too **unelectable**." Warren unelectable, as shown by the polls, all of which show her leading Trump. Or by the rising enthusiasm for her campaign (she does better with people who have paid more attention -- that's hard data -- a lot of people won't pay much attention over the next year and a month, but they will pay more than they have -- people who put their money where their mouth is bet that she is electable). This is nonsense. The New York Times is in touch with rich Democrats who are ambivalent about Sanders and Warren, because they don't want to pay higher taxes. I don't like writing like a vulgar Marxist, but sometimes you people make it hard not to.

Then some concessions that both sides have their faults and conservatives are not wrong about everything. Quickly, on Republicans vs Democrats, there is a definite assertion of wrongdoing followed by a statement showing that Senior doesn't have any evidence: "Of course Democratic politicians — all politicians — distort, gerrymander evidence, even lie and apply their greasy thumbs to the scales. (What was Bill Clinton doing on that plane with Loretta Lynch in 2016?)" or, to summarize, "Of course ... ?" I trust any reader can see the problem: when one is stating the obvious, one does not need to end one's sentence with a question mark. Also, Clinton derangement syndrome.

Then some real Ballance. Senior argues that Fox news is different in kind from the New York Times (correct). Then to be sures. She demonstrates an amazing lack of critical facility when discussing her employer

"And you have partisan news outlets with zero interest in reporting the basic facts of Trump's corruption or the catastrophic consequences of his impulses. We've gone from Pax Americana to Fox Americana in the blink of an eye."

The lady dost protest too little. Even when asserting imbalance, she assumes that there must be some weight in each pan. She argues that the New York Times isn't as far biased left as Fox is biased right. She doesn't even consider the possibility that, afraid of "unconscious bias", they overcompensate. Let's go down her list.

"Whereas the more traditional news media, whatever their unconscious biases, do try to hold Democrats to account. Sure, let's stipulate that there are more liberals than conservatives at these organizations. Maybe even a lot more. But it was mainstream newspapers that broke the Whitewater story, which led to an independent investigation of Bill Clinton. It was mainstream newspapers that kept Hillary Clinton's emails on the front page in the run-up to the 2016 election. This newspaper covered Hunter Biden's business dealings in Ukraine too — in May. These pages also ran an editorial about it. That was in 2015."

"Whitewater", including omission of the critical fact that the Whitewater Development Corporation is older than the Madison Guaranty S&L (something I learned in the 21st century). Deliberate deception of readers by omission of a critical fact, used to create the appearance of a scandal. Yes, they tried to hold Clinton to some sort of "account". They also cooked the books. The episode was disgraceful. But not as damaging as keeping "Hillary Clinton's emails on the front page in the run-up to the 2016 election" and reporting the final conclusion of no wrongdoing only on the 16th. A catastrophic failure of editorial judgment based on the terror of conservatives accusing them of liberal bias and the problem that the facts had an overwhelming liberal bias. Then Hunter Biden's business dealings, because private citizen Hunter Biden is so important. Joe Biden's exemplary devotion to the public interest, even when it conflicted with his son's interests, was not mentioned in the appalling article in which the focus was not on Trump's impeachable conduct but on the hint of a possibility of alleged wrongdoing by Biden.

If Senior's aim was to show how the New York Times has sacrificed its journalistic standards in a hopeless effort to please conservatives, the paragraph would make sense. But the repeated disgraceful betrayals of journalism are presented as exculpatory evidence.

## Sunday, September 22, 2019

### unsubscribe

Mom has ceased to use e-mail. The problem is that her AOL inbox is always full of spam. There oughta be a law, actually two.

update: point 3 below was added later.

update 2: point 4 below was added later.

1) By law, if there is an e-mail list, there must be a prominently displayed one-click unsubscribe button. It must appear before the body of the e-mail. It must be in the largest font used anywhere in the e-mail. I am here blogging because I am sick and tired of scrolling down to find "unsubscribe" at the end of a long unsolicited e-mail. More importantly, it must be one click and you're done. I have found that clicking unsubscribe often takes me to a page where I am invited to subscribe, and if I just click through I am not unsubscribed. To unsubscribe I have to scroll down again. Also, it should not be allowed to ask people why they unsubscribed.

The penalty should be $10 per violation. This will bankrupt all non-compliant spammers. The Justice Department or any user with a valid complaint can bring suit. If the suit is started by a harmed person, that person gets 10% of the fine for the public service. I mean, don't members of Congress get spam ? I guess they also send a lot of spam, but they can exempt themselves from the law.

2) No more than 1000 unsolicited e-mails a year allowed. This is a limit on any entity which sends e-mails, which is defined as the beneficial owner of the sending e-mail account. Spam does not have to be legal. Spam doesn't even have to be. We are bombarded with ads on the web all the time. Most are not spam e-mail and most don't create trouble. This is law 2, because I fear my mom accidentally subscribed to the spam mailing lists.

3) "No reply" e-mails are not allowed. The Postal Service will not deliver a letter without a return address. E-mail always includes the address of the sender. Many senders write "do not reply to this e-mail". That should be a civil offence and also a tort costing $1000 for every e-mail which contains that text or text to that effect. If someone sends an e-mail, that person should be required to consider replies. For example, "don't e-mail me again" should be an order which must be obeyed, also if it is sent as a reply e-mail. Failure to obey such an instruction should cost $10,000 per infraction. If the e-mailer says it would be an extremely burdensome expense to hire people just to read replies to our e-mail, it should be politely explained to them that this is exactly the point. They should not be able to burden others with e-mail without being burdened by the replies. This one was almost too obvious to include (it is an update). I think it is perfectly reasonable for me to be able to reply to an e-mail which says "do not reply". I also think the reply should be "you now owe me $1000.00; credit my account number N at bank with routing number M" (N and M not for blogger, because even though my bank has security which makes online banking impractical for me, kidz theze dayz probably know how to use those numbers to take my money). Sending an e-mail which includes the order "do not reply to this e-mail" is unacceptably rude and should be illegal. Why is it allowed ?

4) Unsubscription must be instant. If an e-mail is sent more than 137 milliseconds after the unsubscribe message was received, the sender must pay a fine of $1,000,000,000. It is perfectly possible to manage this these days. It is also possible to enforce this regulation. Be it so.

This post isn't supposed to be an anti-Microsoft rant, but just one little paragraph. Their problem is that they make profits too easily. With Windows and Office, it's as if they patented the alphabet or Arabic numerals or something. They could just coast forever. But they won't. So they attempted to engulf the whole software industry, got nailed by the Justice Department, and were saved by the Supreme Court in the Microsoft-relevant case of Bush v. Gore. They aren't doing that any more (Bill Gates decided to fight viruses, bacteria, and protozoans, not people, and so long as he focuses his ruthless determination on single-celled organisms I support him).

But they won't admit that they are just an intellectual property scam, so they keep "improving" their software, making it worse and worse. This forces customers to learn how to use the new software, which can't be as familiar as the old software, and so must be less user friendly (especially the "user friendly" aspects -- remember that damn paper clip with eyes ?). Also it is slow. There is a race as the hardware gets better and better and the software gets worse and worse. I am quite sure this is deliberate (and logical). The latest software makes the latest hardware run so slowly that it is barely tolerable. So you can steal it and install it on your old computer, but you don't want to, because then your old computer is intolerably slow. The ancestor of the computer on which I am typing, which lived on this very desk, was killed by a (legal) upgrade from Windows 3.1 to Windows 95. There is a law, call it More's law, that no matter how powerful the hardware gets, it always takes the same amount of time for personal computers to start up. Notice that it takes no time for smart phones to start up. Why ? OK, also the format of, say, *.doc files is changed (unless people know about "save as", which most don't), so you need the latest applications to edit documents, so you need the latest operating system, so Windows is rich.

OK fine, make sure software runs only on new computers bought with Windows pre-installed. But please slow it down by using it to mine Bitcoin for Microsoft or something and leave the user interface alone.