less_retarded_wiki

by drummyfish, generated on 10/21/23, available under CC0 1.0 (public domain)


21st_century

21st Century

21st century, known as the Age Of Shit, is already one of the worst centuries in history despite having only been around for a short time.


3d_modeling

3D Modeling

The topic of 3D modeling will be part of the article about 3D models.


3d_model

3D Model

In the world of computers (especially in computer graphics, but also e.g. in physics simulations etc.) a 3D model is a representation of a three dimensional object, for example of a real life object such as a car, tree or a dog, but possibly also of something more abstract like a fractal or a function plot surface. 3D models can be displayed using various 3D rendering techniques and are used mostly to simulate the real world on computers (e.g. in games), as the real world is, as we know, three dimensional. 3D models can be created in several ways, e.g. manually with 3D modeling software (such as Blender) by 3D artists, by 3D scanning of real world objects, or automatically by procedural generation.

3D models can be represented in many ways -- the mainstream "game" 3D models that most people are used to seeing are polygonal (made of triangles), boundary-representation (recording only the surface, not the volume), textured (with "pictures" on their surface) 3D models, but keep in mind many different ways are possible and used too, for example various volume representations, voxel models, point clouds, implicit surfaces, spline surfaces, constructive solid geometry, wireframe etc. Models may also bear additional information and features, e.g. bone rigs for animation, animation key frames, density information, LODs and so on.
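
For a concrete idea, the typical polygonal boundary representation can be stored e.g. as an indexed mesh, sketched here in C (the type and field names are just illustrative, not any standard format):

typedef struct
{
  float vertices[128 * 3]; // vertex coordinates: x0, y0, z0, x1, y1, z1, ...
  int vertexCount;         // how many vertices are actually used
  int triangles[256 * 3];  // triangles as triples of indices into vertices
  int triangleCount;       // how many triangles are actually used
  float uvs[128 * 2];      // texturing (UV) coordinates, one pair per vertex
} Model3D;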

TODO: classification, operations (subdivision, booleans, ...), texturing, animation, formats, ...

3D Modeling: Learning It And Doing It Right

WORK IN PROGRESS

Do you want to start 3D modeling? Or do you already know a bit about it and just want some advice to get better? Then let us share a few words of advice here.

Nowadays as a FOSS user you will most likely do 3D modeling with Blender -- we recommend it for starting to learn 3D modeling as it is powerful, free, gratis, has many tutorials etc. Do NOT use anything proprietary no matter what anyone tells you! Once you know a bit about the art, you may play around with alternative programs or approaches (such as writing programs that generate 3D models etc.). However as a beginner just start with Blender, which is the software we'll suppose you're using from now on in this article.

Start extremely simple and learn bottom-up, i.e. learn about fundamentals and low level concepts first and start with very simple models (e.g. a simple untextured low-poly shape of a house, a box with a roof), then keep creating more complex models in small steps. Do NOT fall into the trap of "quick and easy magic 3D modeling" such as sculpting or some "smart apps" without knowing what's going on at the low level, you'll end up creating extremely ugly, inefficient models in bad formats, like someone wanting to create space rockets without learning anything about math or physics first. Remember to practice, practice, practice -- eventually you learn by doing, so try to make small projects and share your results on sites such as opengameart to get feedback and some mental satisfaction and reward for your effort. The following is an outline of possible steps you may take towards becoming an alright 3D artist:

  1. Learn what a 3D model actually is, basic technical details about how a computer represents it and roughly how 3D rendering works. It is EXTREMELY important to have at least some idea about these fundamentals before moving on to the following steps.
  2. Manually create a few extremely simple low-poly untextured models, e.g. that of a simple house, laptop, hammer, bottle etc. Keep the vertex and triangle count very low (under 100), make the model by MANUALLY creating every vertex and triangle and focus only on learning this low level geometry manipulation well (how to create a vertex, how to split an edge, how to rotate a triangle, ...), making the model conform to good practice, and get familiar with the tools you're using, i.e. learn the key binds, locking movement direction to principal axes, learn manipulating your 3D view, setting up the free/side/front/top view with reference images etc. Make the model nice! I.e. make it have correctly facing triangles (turn backface culling on to check this), avoid intersecting triangles, unnecessary triangles and vertices, remove all duplicate vertices (don't have multiple vertices with the same position), connect all that should be connected, avoid badly shaped triangles (e.g. extremely acute/long ones) etc. Also learn about normals and make them nice! I.e. try automatic normal generation (fiddle e.g. with angle thresholds for sharp/smooth edges), see how they affect the model look, try manually marking some edges sharp, try out smoothing groups etc. Save your final models in the OBJ format -- one of the simplest and most common formats supporting all you need at this stage (a tiny example OBJ file is shown after this list). All this will be a lot to learn, that's why you must not try to create a complex model at this stage. You can keep yourself "motivated" e.g. by aiming for creating a low-poly model collection you can share at opengameart or somewhere :)
  3. Learn texturing -- just take the models you have and try to put a simple texture on them by drawing a simple image, then unwrapping the UV coordinates and MANUALLY editing the UV map to fit on the model. Again the goal is to get familiar with the tools and concepts now; experiment with helpers such as unwrapping by "projecting from 3D view", using "smart" UV unwrap etc. Make the UV map nice! Just as model geometry, UV maps also have good practice -- e.g. you should utilize as many texture pixels as possible (otherwise you're wasting space in the image), watch out for color bleeding, the mapping should have a kind of "uniform pixel density" (or possibly increased density on triangles where more detail is supposed to be), some pixels of the texture may be mapped to multiple triangles if possible (to utilize them efficiently) etc. Only make a simple diffuse texture (don't do PBR, material textures etc., that's too advanced for now). Try out texture painting and manual texture creation in a 2D image program, get familiar with both.
  4. Learn modifiers and advanced tools. Modifiers help you e.g. with the creation of symmetric models: you only model one side and the other one gets mirrored. The subdivide modifier will automatically create a higher poly version of your model (but you need to help it by telling it which edges are sharp etc.). Boolean operations allow you to apply set operations like unification or subtraction of shapes (but usually create a messy geometry you have to repair!). There are many tools, experiment and learn about their pros and cons, try to incorporate them into your modeling.
  5. Learn retopology and possibly sculpting. Topology is an extremely important concept -- it says what the structure of triangles/polygons is, how they are distributed, how they are connected, which curves their edges follow etc. Good topology has certain rules (e.g. ideally only being composed of quads, being denser where the shape has more detail and sparser where it's flat, having edges so that animation won't deform the model badly etc.). Topology is important for efficiency (you utilize your polygon budget well), texturing and especially animation (nice deformation of the model). Creating more complex models is almost always done in the following two steps: first a rough, high-detail shape is created (e.g. by sculpting), then it is retopologized, i.e. manually recreated with a clean, efficient topology.
  6. Learn about materials and shaders. At this point you may learn about how to create custom shaders, how to create transparent materials, apply multiple textures, how to make realistic skin, PBR shaders etc. You should at least be aware of basic shading concepts and commonly encountered techniques such as Phong shading, subsurface scattering, screen space effects etc. because you'll encounter them in shader editors and you should e.g. know what performance penalties to expect.
  7. Learn animation. First learn about keyframes and interpolation and try to animate basic transformations of a model, e.g. animate a car driving through a city by keyframing its position and rotation. Then learn about animating the model's geometry -- first the simple, old way of morphing between different shapes (shape keys in Blender). Finally learn the hardest type of animation: skeletal animation. Learn about bones, armatures, rigging, inverse kinematics etc.
  8. Now you can go crazy and learn all the uber features such as hair, physics simulation, NURBS surfaces, boob physics etc.
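
And here is the promised tiny example of the OBJ format: a complete hand-written model of a single textured square (two triangles); this is just made up to show the format's flavor, real models will of course be much bigger:

# a complete tiny model: a square made of two triangles
# vertex positions:
v -1.0 0.0 -1.0
v 1.0 0.0 -1.0
v 1.0 0.0 1.0
v -1.0 0.0 1.0
# texture (UV) coordinates:
vt 0.0 0.0
vt 1.0 0.0
vt 1.0 1.0
vt 0.0 1.0
# one normal, pointing up:
vn 0.0 1.0 0.0
# triangles as vertex/UV/normal indices (1-based):
f 1/1/1 2/2/1 3/3/1
f 1/1/1 3/3/1 4/4/1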

Don't forget to stick to LRS principles! This is important so that your models are friendly to good technology. I.e. even if "modern" desktops don't really care about polygon count anymore, still take the effort to optimize your model so as to not use more polygons than necessary! Your models may potentially be used on small, non-consumerist computers with software renderers and low amounts of RAM. Low-poly is better than high-poly (you can still prepare your model for automatic subdivision so that obtaining a higher poly model from it automatically is possible). Don't use complex stuff such as PBR or skeletal animation unless necessary -- you should mostly be able to get away with a simple diffuse texture and simple keyframe morphing animation, just like in old games! If you do use complex stuff, make it optional (e.g. make a normal map but don't rely on it being used in the end).

Good luck with your modeling!


3d_rendering

3D Rendering

In computer graphics 3D rendering is concerned with computing images that represent a projected view of 3D objects through a virtual camera.

There are many methods and algorithms for doing so differing in many aspects such as computation complexity, implementation complexity, realism of the result, representation of the 3D data, limitations of viewing and so on. If you are just interested in the realtime 3D rendering used in gaymes nowadays, you are probably interested in GPU-accelerated 3D rasterization with APIs such as OpenGL and Vulkan.

LRS has a 3D rendering library called small3dlib.

Methods

As most existing 3D "frameworks" are harmful, a LRS programmer is likely to write his own 3D rendering system that suits his program best, therefore we should list some common methods of achieving 3D. Besides that, it's just pretty interesting to see what's in store.

Rendering spectrum: The book Real-Time Rendering mentions that methods for 3D rendering can be seen as lying on a spectrum, one extreme of which is appearance reproduction and the other physics simulation. Methods closer to trying to imitate the appearance try to simply focus on imitating the look of an object on the monitor that the actual 3D object would have in real life, without being concerned with how that look arises in real life -- these may e.g. use image data such as photographs; these methods may rely on lightfields, photo textures etc. The physics simulation methods try to replicate the behavior of light in real life -- their main goal is to solve the rendering equation, usually only approximately -- and so, through internally imitating the same processes, come to similar visual results that arise in real world: these methods rely on creating 3D geometry (e.g. that made of triangles or voxels), computing light reflections and global illumination. Most methods lie somewhere in between these two extremes: for example billboards and particle systems may use a texture to represent an object while at the same time using 3D quads (very simple 3D models) to correctly deform the textures by perspective and solve their visibility. The classic polygonal 3D models are also usually somewhere in between: the 3D geometry and shading are trying to simulate the physics, but e.g. a photo texture mapped on such 3D model is the opposite appearance-based approach (PBR further tries to shift the use of textures more towards the physics simulation end).

A table of some common 3D rendering methods follows, including the most simple, most advanced and some unconventional ones. Note that here we talk about methods and techniques rather than algorithms, i.e. general approaches that are often modified and combined into a specific rendering algorithm. For example the traditional triangle rasterization is sometimes combined with raytracing to add e.g. realistic reflections. The methods may also be further enriched with features such as texturing, antialiasing and so on. The table below should help you choose the base 3D rendering method for your specific program.

The methods may be tagged with the following: IO (image order: the method iterates over image pixels and determines each pixel's color), OO (object order: the method iterates over 3D objects/shapes and draws them one by one), 2.5D (pseudo 3D: the method is limited in some way, e.g. by a fixed camera pitch), off (offline: the method is normally too slow for realtime rendering).

method                               notes
3D raycasting                        IO off, shoots rays from camera
2D raycasting                        IO 2.5D, e.g. Wolf3D
AI image synthesis                   "just let AI magic do it"
beamtracing                          IO off
billboarding                         OO
BSP rendering                        2.5D, e.g. Doom
conetracing                          IO off
"dungeon crawler"                    OO 2.5D, e.g. Eye of the Beholder
ellipsoid rasterization              OO, e.g. Ecstatica
flat-shaded 1 point perspective      OO 2.5D, e.g. Skyroads
reverse raytracing (photon tracing)  OO off, inefficient
image based rendering                generally using images as 3D data
light fields                         image-based, similar to holography
mode 7                               IO 2.5D, e.g. F-Zero
parallax scrolling                   2.5D, very primitive
pathtracing                          IO off, Monte Carlo, high realism
portal rendering                     2.5D, e.g. Duke3D
prerendered view angles              2.5D, e.g. Iridion II (GBA)
raymarching                          IO off, e.g. with SDFs
raytracing                           IO off, recursive 3D raycasting
segmented road                       OO 2.5D, e.g. Outrun
shear warp rendering                 IO, volumetric
splatting                            OO, rendering with 2D blobs
texture slicing                      OO, volumetric, layering textures
triangle rasterization               OO, traditional in GPUs
voxel space rendering                OO 2.5D, e.g. Comanche
wireframe rendering                  OO, just lines

TODO: Rescue On Fractalus!

TODO: find out how build engine/slab6 voxel rendering worked and possibly add it here (from http://advsys.net/ken/voxlap.htm seems to be based on raycasting)

TODO: VoxelQuest has some innovative voxel rendering, check it out (https://www.voxelquest.com/news/how-does-voxel-quest-work-now-august-2015-update)

3D Rendering Basics For Nubs

If you're a complete noob and are asking what the essence of 3D is or just how to render simple 3Dish pictures for your game without needing a PhD, here's the very basics. Yes, you can use some 3D engine such as Godot that has all the 3D rendering preprogrammed, but you'll surrender to bloat, you won't really know what's going on and your ability to tinker with the rendering or optimize it will be basically zero... AND you'll miss out on all the fun :) So here we just foreshadow some concepts you should start with if you want to program your own 3D rendering.

The absolute basic thing in 3D is probably perspective, i.e. the concept which says that "things further away look smaller". This is basically the number one thing you need to know and with which you can make simple 3D pictures, even though there are many more effects and concepts that "make pictures look 3D" and which you can potentially study later (lighting, shadows, focus and blur, stereoscopy, parallax, visibility/obstruction etc.). { It's probably possible to make something akin to "3D" even without perspective, just with orthographic projection, but that's just getting into details now. Let's just suppose we need perspective. ~drummyfish }

If you don't have rotating camera and other fancy things, perspective is actually mathematically very simple, you basically just divide the object's size by its distance from the viewer, i.e. its Z coordinate (you may divide by some multiple of Z coordinate, e.g. by 2 * Z to get different field of view) -- the further away it is, the bigger number its size gets divided by so the smaller it becomes. This "dividing by distance" ultimately applies to all distances, so in the end even the details on the object get scaled according to their individual distance, but as a first approximation you may just consider scaling objects as a whole. Just keep in mind you should only draw objects whose Z coordinate is above some threshold (usually called a near plane) so that you don't divide by 0! With this "dividing by distance" trick you can make an extremely simple "3Dish" renderer that just draws sprites on the screen and scales them according to the perspective rules (e.g. some space simulator where the sprites are balls representing planets). There is one more thing you'll need to handle: visibility, i.e. nearer objects have to cover the further away objects -- you can do this by simply sorting the objects by distance and drawing them back-to-front (painter's algorithm).

Here is some "simple" C code that demonstrates perspective and draws a basic animated wireframe cuboid as ASCII in terminal:

#include <stdio.h>

#define SCREEN_W 50       // ASCII screen width
#define SCREEN_H 22       // ASCII screen height
#define LINE_POINTS 64    // how many points for drawing a line
#define FOV 8             // affects "field of view"
#define FRAMES 30         // how many animation frames to draw

char screen[SCREEN_W * SCREEN_H];

void showScreen(void)
{
  for (int y = 0; y < SCREEN_H; ++y)
  {
    for (int x = 0; x < SCREEN_W; ++x)
      putchar(screen[y * SCREEN_W + x]);

    putchar('\n');
  }
}

void clearScreen(void)
{
  for (int i = 0; i < SCREEN_W * SCREEN_H; ++i)
    screen[i] = ' ';
}

// Draws point to 2D ASCII screen, [0,0] means center.
void drawPoint2D(int x, int y, char c)
{
  x = SCREEN_W / 2 + x;
  y = SCREEN_H / 2 + y;

  if (x >= 0 && x < SCREEN_W && y >= 0 && y < SCREEN_H)
    screen[y * SCREEN_W + x] = c;
}

// Divides coord. by distance taking "FOV" into account => perspective.
int perspective(int coord, int distance)
{
  return (FOV * coord) / distance;
}

void drawPoint3D(int x, int y, int z, char c)
{
  if (z <= 0)
    return; // at or beyond camera, don't draw

  drawPoint2D(perspective(x,z),perspective(y,z),c);
}

int interpolate(int a, int b, int n)
{
  return a + ((b - a) * n) / LINE_POINTS;
}

void drawLine3D(int x1, int y1, int z1, int x2, int y2, int z2, char c)
{
  for (int i = 0; i < LINE_POINTS; ++i) // draw a few points to form a line
    drawPoint3D(interpolate(x1,x2,i),interpolate(y1,y2,i),interpolate(z1,z2,i),c);
}

int main(void)
{
  int shiftX, shiftY, shiftZ;

  #define N 12  // side length
  #define C '*'

  // cuboid points:
  //      X                   Y            Z
  #define PA -2 * N + shiftX, N + shiftY,  N + shiftZ
  #define PB 2 * N + shiftX,  N + shiftY,  N + shiftZ
  #define PC 2 * N + shiftX,  N + shiftY,  2 * N + shiftZ
  #define PD -2 * N + shiftX, N + shiftY,  2 * N + shiftZ
  #define PE -2 * N + shiftX, -N + shiftY, N + shiftZ
  #define PF 2 * N + shiftX,  -N + shiftY, N + shiftZ
  #define PG 2 * N + shiftX,  -N + shiftY, 2 * N + shiftZ
  #define PH -2 * N + shiftX, -N + shiftY, 2 * N + shiftZ

  for (int i = 0; i < FRAMES; ++i) // render animation
  {
    clearScreen();

    shiftX = -N + (i * 4 * N) / FRAMES; // animate
    shiftY = -N / 3 + (i * N) / FRAMES;
    shiftZ = 0; 

    // bottom:
    drawLine3D(PA,PB,C); drawLine3D(PB,PC,C); drawLine3D(PC,PD,C); drawLine3D(PD,PA,C);

    // top:
    drawLine3D(PE,PF,C); drawLine3D(PF,PG,C); drawLine3D(PG,PH,C); drawLine3D(PH,PE,C);

    // sides:
    drawLine3D(PA,PE,C); drawLine3D(PB,PF,C); drawLine3D(PC,PG,C); drawLine3D(PD,PH,C);

    drawPoint3D(PA,'A'); drawPoint3D(PB,'B'); // corners
    drawPoint3D(PC,'C'); drawPoint3D(PD,'D');
    drawPoint3D(PE,'E'); drawPoint3D(PF,'F');
    drawPoint3D(PG,'G'); drawPoint3D(PH,'H');

    showScreen();

    puts("press key to animate");
    getchar();
  }

  return 0;
}

One frame of the animation should look like this:

                 E*******************************F
                 * *                       ***   *
                 *  **                 ***       *
                 *   H***************G*          *
                 *   *               *           *
                 *   *               *           *
                 *   *               *           *
                 *   *               *           *
                 *   *               *           *
                 *   *               *           *
                 *   D***************C           *
                 *  **                ***        *
                 *  *                    *       *
                 * *                       **    *
                 ***                         * * *
                 A*******************************B

press key to animate

Mainstream Realtime 3D

You may have come here just to learn about the typical realtime 3D rendering used in today's games because aside from research and niche areas this kind of 3D is what we normally deal with in practice. This is what this section is about.

Nowadays "game 3D" means a GPU accelerated 3D rasterization done with rendering APIs such as OpenGL, Vulkan, Direct3D or Metal (the last two being proprietary and therefore shit) and higher level engines above them, e.g. Godot, OpenSceneGraph etc. The methods seem to be evolving to some kind of rasterization/pathtracing hybrid, but rasterization is still the basis.

This mainstream rendering uses an object order approach (it blits 3D objects onto the screen rather than determining each pixel's color separately) and works on the principle of triangle rasterization, i.e. 3D models are composed of triangles (or higher polygons which are however eventually broken down into triangles) and these triangles are projected onto the screen according to the position of the virtual camera and the laws of perspective. Projecting the triangles means finding the 2D screen coordinates of each of the triangle's three vertices -- once we have these coordinates, we draw (rasterize) the triangle to the screen just as a "normal" 2D triangle (well, with some asterisks).

Furthermore things such as z-buffering (for determining correct overlap of triangles) and double buffering are used, which makes this approach very memory (RAM/VRAM) expensive -- of course mainstream computers have more than enough memory but smaller computers (e.g. embedded) may suffer and be unable to handle this kind of rendering. Thankfully it is possible to adapt and imitate this kind of rendering even on "small" computers -- even those that don't have a GPU, i.e. with pure software rendering. For this we e.g. replace z-buffering with painter's algorithm (triangle sorting), drop features like perspective correction, MIP mapping etc. (of course quality of the output will go down).
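
To illustrate the z-buffering principle, a software rasterizer may write all its pixels through a function similar to this C sketch (names and resolution are made up here; the z-buffer has to be reset to a very large value before each frame, and x/y are assumed to be within screen bounds):

#define SCREEN_W 320
#define SCREEN_H 240

unsigned char frameBuffer[SCREEN_W * SCREEN_H]; // pixel colors
float zBuffer[SCREEN_W * SCREEN_H];             // depth of the pixel drawn at each position

// Writes a pixel only if it's closer than what's already drawn there.
void putPixel(int x, int y, unsigned char color, float depth)
{
  int index = y * SCREEN_W + x;

  if (depth < zBuffer[index]) // is the new pixel closer?
  {
    frameBuffer[index] = color;
    zBuffer[index] = depth;   // remember the new closest depth
  }
}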

Additionally there's a lot of bloat added in, such as complex screen space shaders, pathtracing (popularly known as raytracing), megatexturing, shadow rendering, postprocessing, compute shaders etc. This may make it difficult to get into "modern" 3D rendering. Remember to keep it simple.

On PCs the whole rendering process is hardware-accelerated with a GPU (graphics card). A GPU is special hardware capable of performing many operations in parallel (as opposed to a CPU which mostly computes sequentially with a low level of parallelism) -- this is ideal for graphics because we can for example perform mapping and drawing of many triangles at once, greatly increasing the speed of rendering (FPS). However this hugely increases the complexity of the whole rendering system, we have to have a special API and drivers for communication with the GPU and we have to upload data (3D models, textures, ...) to the GPU before we want to render them. Debugging gets a lot more difficult. So again, this is bloat, consider avoiding GPUs.

GPUs nowadays are no longer just focusing on graphics, but are kind of a general device that can be used for more than just 3D rendering (e.g. crypto mining, training AI etc.) and can no longer even perform 3D rendering completely by themselves -- for this they have to be programmed. I.e. if we want to use a GPU for rendering, not only do we need a GPU but also some extra code. This code is provided by "systems" such as OpenGL or Vulkan which consist of an API (an interface we use from a programming language) and the underlying implementation in a form of a driver (e.g. Mesa3D). Any such rendering system has its own architecture and details of how it works, so we have to study it a bit if we want to use it.

The important part of a system such as OpenGL is its rendering pipeline. The pipeline is the "path" through which data go during the rendering process. Each rendering system and potentially even each of its versions may have a slightly different pipeline (but generally all mainstream pipelines somehow achieve rasterizing triangles, the difference is in the details of how they achieve it). The pipeline consists of stages that follow one after another (e.g. the mentioned mapping of vertices and drawing of triangles constitute separate stages). A very important fact is that some (not all) of these stages are programmable with so called shaders. A shader is a program written in a special language (e.g. GLSL for OpenGL) running on the GPU that processes the data in some stage of the pipeline (therefore we distinguish different types of shaders based on where in the pipeline they reside). In early GPUs the stages were not programmable but they later became so in order to give greater flexibility -- shaders allow us to implement all kinds of effects that would otherwise be impossible.

Let's see what a typical pipeline might look like, similarly to something we might see e.g. in OpenGL. We normally simulate such a pipeline also in software renderers. Note that the details such as the coordinate system handedness and presence, order, naming or programmability of different stages will differ in any particular pipeline, this is just one possible scenario:

  1. Vertex data (e.g. 3D model space coordinates of triangle vertices of a 3D model) are taken from a vertex buffer (a GPU memory to which the data have been uploaded).
  2. Stage: vertex shader: Each vertex is processed with a vertex shader, i.e. one vertex goes into the shader and one vertex (processed) goes out. Here the shader typically maps the vertex 3D coordinates to the screen 2D coordinates (or normalized device coordinates), usually by applying the model, view and projection transformations.
  3. Possible optional stages that follow are tessellation and geometry processing (tessellation shaders and geometry shader). These offer the possibility of advanced vertex processing (e.g. generation of extra vertices which vertex shaders are unable to do).
  4. Stage: vertex post processing: Usually not programmable (no shaders here). Here the GPU does things such as clipping (handling vertices outside the screen space), primitive assembly and perspective divide (transforming from homogeneous coordinates to traditional Cartesian coordinates).
  5. Stage: rasterization: Usually not programmable, the GPU here turns triangles into actual pixels (or fragments), possibly applying backface culling, perspective correction and things like the stencil test and depth test (though if fragment shaders are allowed to modify depth, this may be postponed until later).
  6. Stage: pixel/fragment processing: Each pixel (fragment) produced by rasterization is processed here by a pixel/fragment shader. The shader is passed the pixel/fragment along with its coordinates, depth and possibly other attributes, and outputs a processed pixel/fragment with a specific color. Typically here we perform shading and texturing (pixel/fragment shaders can access texture data which are again stored in texture buffers on the GPU).
  7. Now the pixels are written to the output buffer which will be shown on the screen. This can potentially be preceded by other operations such as depth tests, as mentioned above.
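
For a rough idea of what the early stages compute, here is a C sketch imitating what a very simple vertex shader plus the perspective divide and viewport transform may do with one vertex (greatly simplified and made up for illustration: the camera sits at the origin looking along +Z, the usual matrix transformations are left out, and z is assumed positive):

#define SCREEN_W 640
#define SCREEN_H 480

// Maps one vertex from 3D camera space to 2D screen coordinates.
void vertexToScreen(float x, float y, float z, int *screenX, int *screenY)
{
  float ndcX = x / z, // perspective divide: further away => smaller,
        ndcY = y / z; // result roughly in normalized range <-1,1>

  // viewport transform: map normalized coordinates to screen pixels
  *screenX = (int) ((ndcX + 1.0f) * 0.5f * SCREEN_W);
  *screenY = (int) ((1.0f - ndcY) * 0.5f * SCREEN_H); // flip Y (screen Y goes down)
}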

TODO: example of specific data going through the pipeline


42

42

"HAHAHAHAHAHAHAHAHAHHHHAAA BAZINGA" --Sheldon fan

42 is an even integer with prime factorization of 2 * 3 * 7. This number was made kind of famous (and later overused in pop culture to the point of completely destroying the joke) by Douglas Adams' book The Hitchhiker's Guide to the Galaxy in which it appears as the answer to the ultimate question of life, the Universe and everything (the point of the joke was that this number was the ultimate answer computed by a giant supercomputer over millions of years, but it was ultimately useless as no one knew the question to which this number was the answer).

If you make a 42 reference in front of a TBBT fan, he will shit himself.


4chan

4chan

{ haha https://lolwut.info/comp/4chan/4chan-g.html ~drummyfish }

4chan (https://4chan.org/) is the most famous image board. Like most image boards, 4chan has a nice, oldschool minimalist look, even though it contains shitty captchas for posting and the site's code is proprietary. The site tolerates a great amount of free speech up to the point of being regularly labeled a "right-wing extremist site" (although bans for stupid reasons such as harmless pedo jokes are very common, speaking from experience). Being a "rightist paradise" it is commonly seen as a rival to reddit, aka the pseudoleftist paradise -- both forums hate each other to death. The discussion style is pretty nice, there are many nice stories and memes (e.g. the famous greentexts) coming from 4chan, but it can also be a hugely depressing place just due to the sheer number of retards with incorrect opinions.

The site consists of multiple boards, each with a given discussion topic and rules. The most (in)famous boards are likely politically incorrect AKA /pol/, where most of the American school shooters hang around, and random AKA /b/ which is just a shitton of meme shitposting, porn, toxicity, fun, trolling and retardedness.

For us the most important part of 4chan is the technology board known as /g/ (for technoloGEE). Browsing /g/ can bring all kinds of emotion, it's a place of relative freedom and somewhat beautiful chaos where all people from absolute retards to geniuses argue about important and unimportant things, brands, tech news and memes, and constantly advise each other to kill themselves. Sometimes the place is pretty toxic and not good for mental health, actually it is more of a rule than an exception.

As of 2022 /g/ became unreadable, ABANDON SHIP. The board became flooded with capitalists, cryptofascists, proprietary shills, productivity freaks and other uber retards, it's really not worth reading anymore. You can still read good old threads on archives such as https://desuarchive.org/g/page/280004/.

See Also


aaron_swartz

Aaron Swartz

"I think all censorship should be deplored." --Aaron Swartz

TODO


abstraction

Abstraction

Abstraction is an important concept in programming, mathematics and other fields of science, philosophy and art, which in simple words can be described as "viewing an issue from a distance", thinking in higher-level concepts, i.e. paying less attention to fine detail so that one can see the bigger picture. In programming for example we distinguish programming languages of high and low levels of abstraction, depending on how close they are "to the hardware" (e.g. assembly being low level, JavaScript being high level); in art high abstraction means portraying and capturing things such as ideas, feelings and emotions with shapes that may seem "distant", not resembling anything concrete or familiar. We usually talk about different levels of abstraction, depending on the "distance" we take in viewing the issue at hand -- this concept may very well be demonstrated on sciences: particle physics researches the world at the lowest level of abstraction, in extreme close-up, for example by examining individual atoms that make up our brains, while biology resides at a higher level of abstraction, viewing the brain at the level of individual cells, and finally psychology shows a very high level of abstraction because it looks at the brain from a great distance and just studies its behavior.

In mainstream programming education it is generally taught to "abstract as much as possible" because that's aligned with the capitalist way of technology -- high abstraction is easy to handle for incompetent programming monkeys, it helps prevent them from doing damage by employing billions of safety mechanisms, it also perpetuates the cult of never ending layering of the abstraction sandwich, creating bloat, bullshit jobs, it makes computers slower, constantly outdated and so drives software consumerism. This is extremely wrong. LRS advocates employing only as little abstraction as needed, so as to support minimalism, i.e. too much abstraction is bad. For example a widely used general purpose programming language should basically only have as much abstraction as to allow portability, it should definitely NOT succumb to high abstraction such as object obsessed programming.

In a more detailed view abstraction is not one-dimensional, we may abstract in different directions ("look at the issue from different angles"); for example functional, logic and object paradigms are different ways of abstracting from the low level, each one in different way. So the matter of abstracting is further complicated by trying to choose the right abstraction -- one kind of abstraction may work well for certain kinds of problems (i.e. solving these problems will become simple when applying this abstraction) but badly for other kinds of problems.

Let's take a look at a possible division of a computer to different levels of abstraction, from lowest to highest (keep in mind it's also possible to define the individual levels differently):


acronym

Acronym

Acronym is an abbreviation of a multiple word term, usually made by appending the starting letters of each word to form a new word.

Here is a list of some acronyms:

See Also


ai

Artificial Intelligence

Artificial intelligence (AI) is an area of computer science whose effort lies in making computers simulate the thinking of humans and possibly other biologically living beings. This may include making computers play games such as chess, compose music, paint pictures, understand and process audio, images and text on a high level of abstraction (e.g. translation between natural languages), make predictions about complex systems such as the stock market or weather, or even exhibit general human-like behavior. Even though today's focus in AI is on machine learning and especially neural networks, there are many other usable approaches and models such as "hand crafted" state tree searching algorithms that can simulate and even outperform the behavior of humans in certain specialized areas.

There's a concern, still a matter of discussion, about the dangers of developing a powerful AI, as that could possibly lead to a technological singularity in which a super intelligent AI might take control over the whole world without humans being able to seize the control back. Even though this is still likely far in the future and many people say the danger is not real, the question seems to be one of when rather than if.

By about 2020, "AI" has become a capitalist buzzword. They try to put machine learning into everything just for that AI label -- and of course, for a bloat monopoly.

By 2023 neural network AI has become extremely advanced in processing visual, textual and audio information and is rapidly marching on. Networks such as stable diffusion are now able to generate images (or modify existing ones) with results mostly indistinguishable from real photos just from a short plain language textual description. Text to video AI is emerging and already giving nice results. AI is able to write computer programs from plain language text descriptions. Chatbots, especially the proprietary chatGPT, are scarily human-like and can already carry on a conversation mostly indistinguishable from real human conversation while showing extraordinary knowledge and intelligence -- the chatbot can for example correctly reason about advanced mathematical concepts on a level far above the average human. AI has become mainstream and is everywhere, normies are downloading "AI apps" on their phones that do funny stuff with their images while spying on them. In games such as chess or even strategy video games neural AI has already been far surpassing the best of humans by miles for years.


algorithm

Algorithm

Algorithm (from the name of Persian mathematician Muhammad ibn Musa al-Khwarizmi) is an exact step-by-step description of how to solve some type of a problem. Algorithms are basically what programming is all about: we tell computers, in very exact ways (with programming languages), how to solve problems -- we write algorithms. But algorithms don't have to be just computer programs, they are simply instructions for solving problems.

Cooking recipes are commonly given as an example of a non-computer algorithm, though they rarely contain branching ("if then...") and loops ("while a condition holds do ..."), the key features of algorithms. The so called wall-follower is a simple algorithm to get out of any maze: you just pick either the left-hand or right-hand wall and then keep following it. You may write a crazy algorithm basically for any kind of problem, e.g. for how to clean a room or how to get a girl to bed, but it has to be precise so that anyone can execute the algorithm just by blindly following the steps; if there is any ambiguity, it is not considered an algorithm; a vague, imprecise "hint" on how to find a solution (e.g. "to get to the airport head somewhere in this direction.") we rather call a heuristic. Heuristics are useful too and they may be utilized by an algorithm, e.g. to find a precise solution faster, but from a programmer's point of view algorithms, the PRECISE ways of finding solutions, are the basis of everything.
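
For example the right-hand variant of the wall-follower might be sketched in C as follows (a minimal sketch: isWall is a hypothetical function, assumed to be provided by the maze, that says whether a given square is a wall; each call makes one step):

int isWall(int x, int y); // hypothetical, says if given maze square is a wall

// direction vectors: 0 = up, 1 = right, 2 = down, 3 = left
int dirX[4] = {0, 1, 0, -1},
    dirY[4] = {-1, 0, 1, 0};

// Performs one step of the right-hand wall-follower.
void wallFollowerStep(int *x, int *y, int *dir)
{
  int right = (*dir + 1) % 4;   // the direction to our right hand

  if (!isWall(*x + dirX[right], *y + dirY[right]))
    *dir = right;               // right side is free: turn to keep hand on wall
  else
    while (isWall(*x + dirX[*dir], *y + dirY[*dir]))
      *dir = (*dir + 3) % 4;    // blocked: turn left until we can move

  *x += dirX[*dir];             // step forward
  *y += dirY[*dir];
}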

Interesting fact: contrary to intuition there are problems that are mathematically proven to be unsolvable by any algorithm, see undecidability, but for most practically encountered problems we can write an algorithm (though for some problems even our best algorithms can be unusably slow).

Algorithms are mostly (possibly not always, depending on the exact definition of the term) written as a series of steps (or instructions); these steps may be specific actions (such as adding two numbers or drawing a pixel to the screen) or conditional jumps to other steps ("if condition X holds then jump to step N, otherwise continue"). At the lowest level (machine code, assembly) computers can do just that: execute instructions (expressed as 1s and 0s) and perform conditional jumps. These jumps can be used to create branches (in programming known as if-then-else) and loops. Branches and loops are together known as control structures -- they don't express a direct action but control which steps in the algorithm will follow. All in all, any algorithm can be written with only these three constructs: sequence (steps executed one after another), selection (branches) and iteration (loops).

Note: in a wider sense algorithms may be expressed in other ways than sequences of steps (non-imperative ways, see declarative languages), even mathematical equations are often called algorithms because they imply the steps towards solving a problem. But we'll stick to the common meaning of algorithm given above.

Additional constructs can be introduced to make programming more comfortable, e.g. subroutines/functions (kind of small subprograms that the main program uses for solving the problem), macros (shorthand commands that represent multiple commands) or switch statements (selection but with more than two branches). Loops are also commonly divided into several types: counted loops, loops with condition at the beginning, loops with condition at the end and infinite loops (for, while, do while and while (1) in C, respectively; see the examples below) -- in theory only one generic type of loop is needed but for convenience programming languages normally offer different "templates" for commonly used loops. Similarly to mathematical equations, algorithms make use of variables, i.e. values which can change and which have a specific name (such as x or myVariable).
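
For example in C the four mentioned loop "templates" look like this (a trivial program, each loop just prints a few lines):

#include <stdio.h>

int main(void)
{
  for (int i = 0; i < 3; ++i) // counted loop: repeats exactly 3 times
    puts("for");

  int j = 0;

  while (j < 3)               // condition at the beginning: may run zero times
  {
    puts("while");
    j++;
  }

  do                          // condition at the end: always runs at least once
  {
    puts("do while");
    j--;
  } while (j > 0);

  while (1)                   // infinite loop: runs until broken out of
  {
    puts("while (1)");
    break;                    // here we quit immediately so the program ends
  }

  return 0;
}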

Practical programming is based on expressing algorithms via text, but visual programming is also possible: flowcharts are a way of visually expressing algorithms, you have probably seen some. Decision trees are special cases of algorithms that have no loops, you have probably seen some too. Even though some languages (mostly educational ones such as Snap) are visual and similar to flowcharts, it is not practical to create big algorithms in this way -- serious programs are written as text in programming languages.

Example

Let's write a simple algorithm that counts the number of divisors of a given number x and checks if the number is prime along the way. (Note that we'll do it in a naive, educational way -- it can be done better.) Let's start by writing the steps in plain English:

  1. Read the number x from the input.
  2. Set the divisor counter to 0.
  3. Set currently checked number to 1.
  4. While currently checked number is lower than or equal to x:
    1. If currently checked number divides x, increase the divisor counter by 1.
    2. Increase currently checked number by 1.
  5. Write out the divisor counter.
  6. If divisor counter is equal to 2, write out that the number is a prime.

Notice that x, divisor counter and currently checked number are variables. Step 4 is a loop (iteration) and steps 4.1 and 6 are branches (selection). The flowchart of this algorithm is:

               START
                 |
                 V
               read x
                 |
                 V
       set divisor count to 0
                 |
                 V
       set checked number to 1
                 |
    .----------->|
    |            |
    |            V                no
    |    checked number <= x ? ------.
    |            |                   |
    |            | yes               |
    |            V                   |
    |     checked number    no       |
    |       divides x ? -------.     |
    |            |             |     |
    |            | yes         |     |
    |            V             |     |
    |     increase divisor     |     |
    |       count by 1         |     |
    |            |             |     |
    |            |             |     |
    |            |<------------'     |
    |            |                   |
    |            V                   |
    |     increase checked           V
    |       number by 1     print divisor count
    |            |                   |
    '------------'                   |
                                     V             no
                             divisor count = 2 ? -----.
                                     |                |
                                     | yes            |
                                     V                |
                           print "number is prime"    |
                                     |                |
                                     |<---------------'
                                     V
                                    END

This algorithm would be written in Python as:

x = int(input("enter a number: "))

divisors = 0

for i in range(1,x + 1):
  if x % i == 0: # i divides x?
    divisors = divisors + 1

print("divisors: " + str(divisors))
 
if divisors == 2:
  print("It is a prime!")

in C as:

#include <stdio.h>                                                              
                                                                                 
int main(void)
{
  int x, divisors = 0;
                                                                
  scanf("%d",&x); // read a number

  for (int i = 1; i <= x; ++i)
    if (x % i == 0) // i divides x?
      divisors = divisors + 1;

  printf("number of divisors: %d\n",divisors);
 
  if (divisors == 2)
    puts("It is a prime!");

  return 0;
} 

and in comun as (for simplicity only works for numbers up to 9):

<- "0" -  # read X and convert to number
0         # divisor count
1         # checked number

@@
  $0 $3 > ?     # checked num. > x ?
    !@
  .

  $2 $1 % 0 = ? # checked num. divides x ?
    $1 ++ $:2   # increase divisor count
  .

  ++      # increase checked number
.

0 "divisors: " -->   # write divisor count
$1 "0" + -> 10 ->

$1 2 = ?
  0 "It is a prime" --> 10 ->
.

This algorithm is however not very efficient and could be optimized -- for example there is no need to check divisors greater than the square root of x (each divisor above the square root pairs up with one below it, so both can be counted at once), so we could lower the number of loop iterations and make the algorithm finish faster.
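
A sketch of this optimization in C may look like this (we count both divisors of each pair at once; the extra check prevents counting the square root of a perfect square twice):

#include <stdio.h>

int main(void)
{
  int x, divisors = 0;

  scanf("%d",&x); // read a number

  for (int i = 1; i * i <= x; ++i) // only check up to the square root
    if (x % i == 0)                // i divides x, so x / i does too
      divisors += (i * i == x) ? 1 : 2;

  printf("number of divisors: %d\n",divisors);

  if (divisors == 2)
    puts("It is a prime!");

  return 0;
}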

Study of Algorithms

Algorithms are the essence of computer science, there's a lot of theory and knowledge about them.

Turing machine, a kind of mathematical bare-minimum computer, created by Alan Turing, is the traditional formal tool for studying algorithms, though many other models of computation exist. From theoretical computer science we know not all problems are computable, i.e. there are problems unsolvable by any algorithm (e.g. the halting problem). Computational complexity is a theoretical study of resource consumption by algorithms, i.e. how fast and memory efficient algorithms are (see e.g. P vs NP). Mathematical programming is concerned, among other things, with optimizing algorithms so that their time and/or space complexity is as low as possible, which gives rise to algorithm design methods such as dynamic programming (practical optimization is a more pragmatic approach to making algorithms more efficient). Formal verification is a field that tries to mathematically (and sometimes automatically) prove correctness of algorithms (this is needed for critical software, e.g. in planes or medicine). Genetic programming and some other methods of artificial intelligence try to automatically create algorithms (algorithms that create algorithms). Quantum computing is concerned with creating new kinds of algorithms for quantum computers (a new type of still-in-research computers). Programming language design is the art and science of creating languages that express computer algorithms well.

Specific Algorithms

Following are some common algorithms classified into groups.

See Also


aliasing

Aliasing

Aliasing is a certain typically undesirable phenomenon that distorts signals (such as sounds or images) when they are sampled discretely (captured at single points, usually at periodic intervals) -- this can happen e.g. when capturing sound with digital recorders or when rendering computer graphics. There exist antialiasing methods for suppressing or even eliminating aliasing. Aliasing can often be seen on small checkerboard patterns as a moiré pattern (spatial aliasing), or maybe more famously on rotating wheels or helicopter rotor blades that in a video look like they're standing still or rotating the other way (temporal aliasing, caused by capturing images at intervals given by the camera's FPS).

A simple example showing how sampling at discrete points can quite dramatically alter the recorded result:

|      .|-'-'.' '           .--.           |   |''''
|  . '  |      ' .        .'    '.         |   |    
'|- - -O+ - -O- - .|     |  O  O  |        '---+---.
 |     \|_ _ /    ||     |  \__/  |            |   |
_ _'_._ |      . '|       '.    .'         ____|   |
       ' ' ' '              ''''
  original image    taking even columns  taking odd columns

The following diagram shows the principle of aliasing with a mathematical function:

^       original                     sampling period                       
|   |               |               |<------------->|
|   |             _ |           _   |         _     |
| .'|'.         .' '|         .' '. |       .' '.   |
|/__|__\_______/____|\_______/_____\|______/_____\__|___
|   |   \     /     | \     /       \     /       \ |
|   |    '._.'      |  '._.'        |'._.'         '|_.'
|   |               |               |               |
|   :               :               :               :
V   :               :               :               :
    :               :               :               :
^   :               :               :               :
|   :               :               :               :
|---o---...____     :               :               :
|   |          '''''o...____        :               :
|___|_______________|______ ''''----o_______________:___
|                                     '''----___    |        
|                                               ''''o---
|     reconstructed                                              
|
V

The top signal is a sine function of a certain frequency. We are sampling this signal at periodic intervals indicated by the vertical lines (this is how e.g. digital sound recorders record sounds from the real world). Below we see that the samples we've taken make it seem as if the original signal was a sine wave of a much lower frequency. It is in fact impossible to tell from the recorded samples what the original signal looked like.

Let's note that signals can also be two or more dimensional, e.g. images can be viewed as 2D signals. These are of course affected by aliasing as well.

The explanation above shows why a helicopter's rotating blades seem to stand still in a video whose FPS is synchronized with the rotation -- at any moment the camera captures a frame (i.e. takes a sample), the blades are in the same position as before, hence they appear not to be moving in the video.

Of course this doesn't only happen with perfect sine waves. Fourier transform shows that any signal can be represented as a sum of different sine waves, so aliasing can appear anywhere.

Nyquist–Shannon sampling theorem says that aliasing can NOT appear if we sample with at least twice as high frequency as that of the highest frequency in the sampled signal. This means that we can eliminate aliasing by using a low pass filter before sampling which will eliminate any frequencies higher than the half of our sampling frequency. This is why audio is normally sampled with the rate of 44100 Hz -- from such samples it is possible to correctly reconstruct frequencies up to about 22000 Hz which is about the upper limit of human hearing.

Aliasing is also a common problem in computer graphics. For example when rendering textured 3D models, aliasing can appear in the texture if that texture is rendered at a smaller size than its resolution (when the texture is enlarged by rendering, aliasing can't appear because enlargement decreases the frequency of the sampled signal, so the condition of the sampling theorem gets satisfied). (Actually if we don't address aliasing somehow, having lower resolution textures can unironically have beneficial effects on the quality of graphics.) This happens because texture samples are normally taken at single points that are computed by the texturing algorithm. Imagine that the texture consists of high-frequency details such as a small checkerboard pattern of black and white pixels; it may happen that when the texture is rendered at a lower resolution, the texturing algorithm chooses to render only the black pixels. Then when the model moves a little bit it may happen that the algorithm will only choose the white pixels to render. This will result in the model blinking and alternating between being completely black and completely white (while it should rather be rendered as gray).

The same thing may happen in ray tracing if we shoot a single sampling ray for each screen pixel. Note that interpolation/filtering of textures won't fix texture aliasing. What can reduce texture aliasing is e.g. mipmapping, which stores the texture along with its lower resolution versions -- during rendering a lower resolution version of the texture is chosen if the texture is rendered at a smaller size, so that the sampling theorem is satisfied. However this is still not a silver bullet because the texture may e.g. be shrunk in one direction but enlarged in the other (this is addressed by anisotropic filtering). And even if we sufficiently suppress aliasing in textures, aliasing can still appear in geometry. This can be reduced by multisampling, e.g. sending multiple rays for each pixel and then averaging their results -- by this we increase our sampling frequency and lower the probability of aliasing.

Why doesn't aliasing happen in our eyes and ears? Because our senses don't sample the world discretely, i.e. in single points -- our senses integrate. E.g. a rod or a cone in our eyes doesn't just see exactly one point in the world but rather an averaged light over a small area (which is ideally right next to another small area seen by another cell, so there is no information to "hide" in between them), and it also doesn't sample the world at specific moments like cameras do, its excitation by light falls off gradually which averages the light over time, preventing temporal aliasing (instead of aliasing we get motion blur).

So all in all, how to prevent aliasing? As said above, we always try to satisfy the sampling theorem, i.e. make our sampling frequency at least twice as high as the highest frequency in the signal we're sampling, or at least get close to this situation and lower the probability of aliasing. This can be done either by increasing the sampling frequency (which can be done in a smart way, e.g. some methods try to detect where sampling should be denser), or by preprocessing the input signal with a low pass filter or otherwise ensuring there won't be too high frequencies (e.g. using lower resolution textures).
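
As a tiny example, when shrinking an image to half its size we shouldn't just take every second pixel (point sampling, which may alias) but rather average each 2x2 block of pixels, which acts as a primitive low pass filter. A sketch in C (assuming grayscale images stored as plain byte arrays, with even width and height):

// Downscales a w * h grayscale image to half size by averaging 2x2 blocks.
void downscaleHalf(const unsigned char *in, unsigned char *out, int w, int h)
{
  for (int y = 0; y < h / 2; ++y)
    for (int x = 0; x < w / 2; ++x)
      out[y * (w / 2) + x] = // average of the 2x2 block:
        (in[(2 * y) * w + 2 * x]     + in[(2 * y) * w + 2 * x + 1] +
         in[(2 * y + 1) * w + 2 * x] + in[(2 * y + 1) * w + 2 * x + 1]) / 4;
}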


altruism

Altruism

Not to be confused with autism.


anal_bead

Anal Bead

To most people anal beads are just sex toys they stick in their butts, however anal beads with remotely controlled vibration can also serve as a well hidden one-way communication device. Use of an anal bead for cheating in chess was the topic of a great cheating scandal in 2022 (Niemann vs Carlsen).


analog

Analog

Analog is the opposite of digital.


analytic_geometry

Analytic Geometry

Analytic geometry is a part of mathematics that solves geometric problems with algebra; for example instead of finding an intersection of a line and a circle with ruler and compass, analytic geometry finds the intersection by solving an equation. In other words, instead of using pen and paper we use numbers. This is very important in computing as computers of course just work with numbers and aren't normally capable of drawing literal pictures and reading results from them -- that would be laughable (or awesome?). Analytic geometry finds use especially in fields such as physics simulations (collision detections) and computer graphics, in methods such as raytracing where we need to compute intersections of rays with various mathematically defined shapes in order to render 3D images. Of course the methods are used in many other fields, for example rocket science and many other areas of physics. Analytic geometry reflects the fact that geometric and algebraic problems are often analogous, i.e. it is also the case that many problems we encounter in arithmetic can be seen as geometric problems and vice versa (e.g. solving an equation is the same as finding an intersection of some N-dimensional shapes).

Fun fact: approaches in the opposite direction also exist, i.e. solving mathematical problems physically rather than by computation. For example back in the day when there weren't any computers to compute very difficult integrals and computing them by hand would be immensely hard, people literally cut function plots out of paper and weighed them in order to find the integral. Awesome oldschool hacking.

Anyway, how does it work? Typically we work in a 2D or 3D Euclidean space with Cartesian coordinates (but of course we can generalize to more dimensions etc.). Here, geometric shapes can be described with equations (or inequalities); for example a zero-centered circle in 2D with radius r has the equation x^2 + y^2 = r^2 (Pythagorean theorem). This means that the circle is a set of all points [x,y] such that when substituted to the equation, the equation holds. Other shapes such as lines, planes, ellipses, parabolas have similar equations. Now if we want to find intersections/unions/etc., we just solve systems of multiple equations/inequalities and find solutions (coordinates) that satisfy all equations/inequalities at once. This allows us to do basically anything we could do with pen and paper such as defining helper shapes and so on. Using these tools we can compute things such as angles, distances, areas, collision points and much more.

Analytic geometry is closely related to linear algebra.

Examples

Nub example:

Find the intersection of two lines in 2D: one is a horizontal line with y position 2, the other is a 45 degree line going through the [0,0] point in the positive x and positive y direction, like this:

  y
  :        _/ line 2
  :      _/
_2:_____/_______ line 1
  :  _/
  :_/
--:----------x
_/:
  :

The equation of line 1 is just y = 2 (it consists of all points [x,2] where for x we can plug in any number to get a valid point on the line).

The equation of line 2 is x = y (all points that have the same x and y coordinate lie on this line).

We find the intersection by finding such point [x,y] that satisfies both equations. We can do this by plugging the first equation, y = 2, into the second equation, x = y, to get the x coordinate of the intersection: x = 2. By plugging this x coordinate into either of the two line equations we also get the y coordinate: 2. I.e. the intersection lies at coordinates [2,2].

Advanced nub example:

Let's say we want to find, in 2D, where a line L intersects a circle C. L goes through points A = [-3,0.5] and B = [3,2]. C has center at [0,0] and radius r = 2.

The equation for the circle C is x^2 + y^2 = 2^2, i.e. x^2 + y^2 = 4. This is derived from Pythagorean theorem, you can either check that or, if lazy, just trust this. Equations for common shapes can be looked up.

One possible form of an equation of a 2D line is a "slope + offset" equation: y = k * x + q, where k is the tangent (slope) of the line and q is an offset. To find the specific equation for our line L we need to first find the numbers k and q. This is done as follows.

The tangent (slope) k is (B.y - A.y) / (B.x - A.x). This follows from the definition of the tangent; check it yourself if you don't understand. So for us k = (2 - 0.5) / (3 - -3) = 0.25.

The number q (offset) is computed by simply substituting some point that lies on the line to the equation and solving for q. We can substitute either A or B, it doesn't matter. Let's go with A: A.y = k * A.x + q, with specific numbers this is 0.5 = 0.25 * -3 + q from which we derive that q = 1.25.

Now we have computed both k and q, so we have the equations of both of our shapes: the line L is y = 0.25 * x + 1.25 and the circle C is x^2 + y^2 = 4.

Feel free to check the equations, substitute a few points and plot them to see they really represent the shapes (e.g. if you substitute a specific x value into the line equation you will get the specific y for it).

Now to find the intersections we have to solve the above system of equations, i.e. find such couples (coordinates) [x,y] that will satisfy both equations at once. One way to do this is to substitute the line equation into the circle equation. By this we get:

x^2 + (0.25 * x + 1.25)^2 = 4

This is a quadratic equation, let's get it into the standard format so that we can solve it:

x^2 + 0.0625 * x^2 + 0.625 * x + 1.5625 = 4

1.0625 * x^2 + 0.625 * x - 2.4375 = 0

Note that this makes perfect sense: a quadratic equation can have either one, two or no solution (in the realm of real numbers), just as there can either be one, two or no intersection of a line and a circle.

Solving a quadratic equation is simple so we skip the details. Here we get two solutions: x1 = 1.24881 and x2 = -1.83704. These are the x positions of our intersections. We further find the y coordinates by simply substituting these into the line equation (y1 = 0.25 * 1.24881 + 1.25, y2 = 0.25 * -1.83704 + 1.25), i.e. we get the final result: the intersections are at [1.24881, 1.5622] and [-1.83704, 0.79074].
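
The whole procedure is easy to turn into a program; the following minimal sketch in C (names and output format are just illustrative) computes the intersections of a general line y = k * x + q with a zero-centered circle of radius r, here with the numbers from the example above:

#include <stdio.h>
#include <math.h>

int main(void)
{
  double k = 0.25, q = 1.25, r = 2; // line y = k * x + q, circle radius

  // substituting the line equation into the circle equation gives a
  // quadratic equation a * x^2 + b * x + c = 0 with these coefficients:
  double a = 1 + k * k;
  double b = 2 * k * q;
  double c = q * q - r * r;

  double d = b * b - 4 * a * c; // discriminant

  if (d < 0)
    puts("no intersection"); // the quadratic eq. has no real solution
  else
  {
    double x1 = (-1 * b + sqrt(d)) / (2 * a),
           x2 = (-1 * b - sqrt(d)) / (2 * a);

    printf("[%lf,%lf] [%lf,%lf]\n",x1,k * x1 + q,x2,k * x2 + q);
  }

  return 0;
}

Run, this should print the two points derived above (up to small rounding differences).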

See Also


anarchism

Anarchism

Anarchism (from Greek an, no and archos, ruler) is a socialist political philosophy rejecting any social hierarchy and oppression. Anarchism doesn't mean without rules, but without rulers; despite popular misconceptions anarchism is not chaos -- on the contrary, it strives for a stable, ideal society of equal people that live in peace. It means order without power. The symbols of anarchism include the letter A in a circle and a black flag that for different branches of anarchism is diagonally split from bottom left to top right and the top part is filled with a color specific for that branch.

A great many things about anarchism are explained in the text An Anarchist FAQ, which is free licensed and can be accessed e.g. at https://theanarchistlibrary.org/library/the-anarchist-faq-editorial-collective-an-anarchist-faq-full.

Anarchism is a wide term and encompasses many flavors such as anarcho communism, anarcho pacifism, anarcho syndicalism, anarcho primitivism or anarcho mutualism. Some of the branches disagree on specific questions, e.g. about whether violence is ever justifiable, or propose different solutions to issues such as organization of society, however all branches of anarchism are socialist and all aim for elimination of social hierarchy such as social classes created by wealth, jobs and weapons, i.e. anarchism opposes state (e.g. police having power over citizens) and capitalism (employers exploiting employees, corporations exploiting consumers etc.).

There exist fake, pseudoanarchist ideologies such as "anarcho" capitalism (which includes e.g. so called crypto "anarchism") that deceive by their name despite by their very definition NOT fitting the definition of anarchism (just like Nazis called themselves socialists despite being the opposite). Also such shit as "anarcha" feminism is just fascist bullshit. The propaganda also tries to deceive the public by calling various violent criminals anarchists, even though they very often can't fit the definition of a true anarchist.

LRS is an anarchist movement, specifically anarcho pacifist and anarcho communist one.


anarch

Anarch

Anarch is a LRS/suckless, free as in freedom first person shooter game similar to Doom, written by drummyfish. It has been designed to follow the LRS principles very closely and set an example of how games, and software in general, should be written. It also tries to be compatible with principles of less retarded society, i.e. it promotes anarchism, anti-capitalism, pacifism etc.

The repo is available at https://codeberg.org/drummyfish/Anarch or https://gitlab.com/drummyfish/anarch. Some info about the game can also be found at the libregamewiki: https://libregamewiki.org/Anarch.

{ Though retrospectively I can of course see many mistakes and errors about the game, I am overall quite happy about how it turned out, it got some attention among the niche of suckless lovers and many people have written me they liked the game and its philosophy. Many people have ported it to their favorite platforms, some have even written me their own expansion of the game lore, tricks they found etc. If you're among them, thank you :) ~drummyfish }

h@\hMh::@@hhh\h@rrrr//rrrrrrrrrrrrrrrrrrrr@@@@hMM@@@M@:@hhnhhMnr=\@hn@n@h@-::\:h
hMhh@@\\@@@\\h:M/r/////rrrrrrrrrrrrrrr//r@@@@@MMh@@hhh\\\=rMr=M@hh\hn\:\:h::\@\:
@nh==hhhMM@hrh\M/r/////rrrrrrrrrrrrrrr//@@@@@@hhM@h\MhhhMM\@@@@@M\hh\\\Mhh\\\\hh
:hh=@Mh/;;;@hr:M,///;;/////rrr//rrrrrr//@@@@@@hh\h@@hM:==h\@@::\\\:M\@\h\M:\:=@h
\=MhM@hr  `hMhhM///@@@@@@@@@@@@@@@@@@@//@@@@@@rMM@n\M=:@M\\\\Mh\\\hr\n\--h-::r:r
:Mh@M@@`  `rh@\@///@@@@@@@@@@@@@@@@@@@@@@@@@@@Mr\@@\h@:\h\h@\Mhh@@\M@@@@-n\rn@:h
:MhhMn@//r;;@/hM@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@MhMhh:M@MhMhMh@\\rM/@h@nn=-MrnM@:h
:nhhhhh\\//\::@M@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@rMM@@nh@M\=nh@@@M@..=hM@n@-@@@@@:h
\@\h@@rrrr/rr@=M@@@@@@@@@@@@@@nr@@@@@@@@@@@@@@Mrhn@\M@:nMh\@@@@@...h:::::@---::h
-M\h=h\`   rhM\M@@@@@@@@@@@@@@@=@@@@@@@@@@@@@@MhM@\hh@M@Mhh@-\MMhrr\\\:MMh::\\-\
h@hhh\h`  `rMh\M@@@@@@@@@@@@@@nr;;;;rn@@@@@@@@r@r///=@\@\r\\hM@nrrr@\n\h\M\\\\\:
hn===hhM=;hhhh\MrMnrr=rrr=r@rhhr;.r,/hr=r=r=h=r@=/-;/MhhMr:h\@h=...r\@hMhM:/\h\=
@n==M\h@=;hhh\\Mrr=r=r=rMr=hrMMr;;;,;========MM@r=./;@:MMM\h=r=rM/rh@@@M-n---:-h
:\=hMn@@@=\hhh:M===============;/. ,,==========@r-/--@:@M\\@@@n@Mn:hM@n@-=\hr=-h
\hhnM@=@::@MM/h================;;;;.,======\h==M=/;r,//;;r=r=r=r@\=r=r=r=@rnMn:r
:Mrrr=rr==@rr=rrr=rrr=/=r===r==/:; ..===r\\-h==@r-,;-=r/;/;;;;;;rnrrr=rrr=rrr=r;
rrrrrrrr@=rrrrrrrrrrr//r=r=r=r=r;. ,.r=r\---hr=@r===-r=r=;;;r;;;hh@:;;;;;;;;;;-;
r=rrr=rr\\@rr=rrr=r/;/:rr=rrr=rr;r,..=r\--.-h=r@r----=rrr=rrr--:,;;:,;;;,;;;,;--
rrrr:-@=====:,;,;-/;/:rrrrrrrrr;;....r\--.,\hrrrrrrrrrrrrrrrrrrrrr-----rrrrrrrrr
,;,:,; ;,;;;-;;;,;/:-rrrrrrrrrrrrrrrrr\-.,;\@rrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
;,;,;,.,;,;,;,;,;,:rrrrrrrrrrrrrrrrrr\--.;,\Mrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr
,;,;.-.-.-::-:::--rr/rrr/rrr/rrr/rrr/\-.:;::@rrr/rrr/rrr/rrr/rrr/rrr/rrr/rrr/rrr
-.-.r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/\---;::\@/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/
/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r\-.,;:,:@r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r/r
///////////////////////////////////\::-,;,-:@///////////////////////////////////
;///;///;///;///;///;///;///;///;//,::-:,.,-@///;///;///;///;///;///;///;///;///
//////////////////////////////////\----:-.,-h///////////////////////////////////
,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn
nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn
nn..nnn...nn...nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn...nnnnnnnnnnnnnn
nnn.nnn.n.nn.n.nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn.n.nnnnnnnnnnnnnn
nnn.nnn.n.nn.n.nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn.n.nnnnnnnnnnnnnn
nn...nn...nn...nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn...nnnnnnnnnnnnnn
nnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn

screenshot from the terminal version

Anarch has these features:

Technical Details

Anarch's engine uses raycastlib, a LRS library for advanced 2D ray casting which is often called a "pseudo 3D". This method was used by Wolf3D, but Anarch improves it to allow different levels of floor and ceiling which makes it look a little closer to Doom (which however used a different method called BSP rendering).

The music in the game is procedurally generated using bytebeat.
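
Bytebeat generates sound by feeding an ever increasing time variable to a short integer expression; e.g. the following tiny C program (using one of the classic publicly known bytebeat formulas, NOT Anarch's actual music) writes out raw 8bit audio samples which can be played e.g. by piping them to the aplay utility:

#include <stdio.h>

int main(void)
{
  // t is time in samples, each iteration outputs one 8bit sample
  for (int t = 0; t < 65536; ++t)
    putchar(t * (42 & (t >> 10)));

  return 0;
}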

All images in the game (textures, sprites, ...) are 32x32 pixels, compressed by using a 16 color subpalette of the main 256 color palette, and are stored in source code itself as simple arrays of bytes -- this eliminates the need for using files and allows the game to run on platforms without a file system.
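
To illustrate the general idea (just a hypothetical sketch, not necessarily Anarch's exact layout), an image in such a format might be stored and read like this in C:

// hypothetical format: 16 byte subpalette (indices into the main 256
// color palette) followed by 32 * 32 pixels stored as 4 bit subpalette
// indices, two pixels per byte
unsigned char getImagePixel(const unsigned char *image, int x, int y)
{
  int pixelIndex = y * 32 + x;
  unsigned char b = image[16 + pixelIndex / 2];

  // upper half of the byte holds the even pixel, lower half the odd one
  return image[(pixelIndex % 2) ? (b & 0x0f) : (b >> 4)];
}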

The game uses a tiny custom-made 4x4 bitmap font to render texts.

Saving/loading is optional, in case a platform doesn't have persistent storage. Without saving all levels are simply available from the start.

In the suckless fashion, mods are recommended to be made and distributed as patches.


ancap

"Anarcho" Capitali$m

Not to be confused with anarchism.

So called "anarcho capitalism" (ancap for short, not to be confused with anpac or any form of anarchism) is probably the worst, most retarded and most dangerous idea in the history of ever, and that is the idea of supporting capitalism absolutely unrestricted by a state or anything else. No one with at least 10 brain cells and/or anyone who has spent at least 3 seconds observing the world could come up with such a stupid, stupid idea. We, of course, completely reject this shit.

It has to be noted that "anarcho" capitalism is not real anarchism, despite its name. The great majority of anarchists strictly reject this ideology as any form of capitalism is completely incompatible with anarchism -- anarchism is defined as opposing any social hierarchy and oppression, while capitalism is almost purely based on many types of hierarchies (internal corporate hierarchies, hierarchies between companies, hierarchies of social classes of different wealth etc.) and oppression (employee by employer, consumer by corporation etc.). Why do they call it anarcho capitalism then? Well, partly because they're stupid and don't know what they're talking about (otherwise they couldn't come up with such an idea in the first place) and secondly, like any capitalists, they want to deceive and ride on the train of the anarchist brand -- this is not new, Nazis also called themselves socialists despite being the complete opposite.

The colors on their flag are black and yellow (this symbolizes shit and piss).

It is kind of another bullshit kind of "anarchism" just like "anarcha feminism" etc.

The Worst Idea In History

As if capitalism wasn't extremely bad already, "anarcho" capitalists want to get rid of the last mechanisms that are supposed to protect the people from corporations -- states. We, as anarchists ourselves, of course see states as eventually harmful, but they cannot go before we get rid of capitalism first. Why? Well, imagine all the bad things corporations would want to do but can't because there are laws preventing them -- in "anarcho" capitalism they can do them.

Firstly this means anything is allowed, any unethical, unfair business practice, including slavery, physical violence, blackmailing, rape, worst psychological torture, nuclear weapons, anything that makes you the winner in the jungle system. Except that this jungle is not like the old, self-regulating jungle in which you could only reach limited power, this jungle offers, through modern technology, potentially limitless power with instant worldwide communication and surveillance technology, with mass production, genetic engineering, AI and weapons capable of destroying the planet.

Secondly the idea of getting rid of a state in capitalism doesn't even make sense because if we get rid of the state, the strongest corporation will become the state, only with the difference that a state is at least supposed to work for the people while a corporation is only by its very definition supposed to care solely about its own endless profit to the detriment of people. Therefore if we scratch the state, McDonalds or Coca Cola or Micro$oft -- whoever is the strongest -- hires a literal army and physically destroys all its competition, then starts ruling the world and making its own laws -- laws that only serve the further growth of that corporation such as that everyone is forced to work 16 hour shifts every day until he falls dead. Don't like it? They kill your whole family, no problem. 100% of civilization will experience the worst kind of suffering, maybe except for the CEO of McDonald's, the world corporation, until the planet's environment is destroyed and everyone hopefully dies, as death is what we'll wish for.

All in all, "anarcho" capitalism is advocated mostly by children who don't know a tiny bit about anything, children who are brainwashed daily in schools by capitalist propaganda, who have no education besides the endless stream of ads from their smartphones and no capability of thinking on their own. However, these children are who will run the world soon. It is sad, it's not really their fault, but through them the system will probably come into existence. Sadly "anarcho" capitalism is already a real danger and a very likely future. It will likely be the beginning of our civilization's greatest agony. We don't know what to do against it other than provide education.

God be with us.


anpac

Anarcho Pacifism

Anarcho pacifism (anpac) is a form of anarchism that completely rejects any violence. Anarcho pacifists argue that since anarchism opposes hierarchy and oppression, we have to reject violence which is a tool of oppression and establishing hierarchy. This would make it the one true purest form of anarchism. Anarcho pacifists use a black and white flag.

Historically anarcho pacifists such as Leo Tolstoy were usually religiously motivated to reject violence, however this stance may also come from logic and from beliefs other than religious ones, e.g. the simple belief that violence will only spawn more violence ("an eye for an eye will only make the whole world blind"), or from pure unconditional love of life.

We, LRS, advocate anarcho pacifism. We see how violence can be a short term solution, one that may even prevent harm to many, however from the long term perspective we only see the complete delegitimisation of violence as leading to a truly mature society. We realize a complete, 100% non violent society may never be achieved, but with enough education and work it will be possible to establish a society with an absolute minimum of violence, a society in which firstly people grow up in a completely non violent environment so that they never accept violence, and secondly have all needs secured so that they don't even have a reason for using violence. We should at least try to get as close to this ideal as possible.


antialiasing

Antialiasing

Antialiasing (AA) means preventing aliasing, i.e. distortion of signal (images, audio, video, ...) caused by discrete sampling. Most people think antialiasing stands for "smooth edges in video game graphics", however that's a completely inaccurate understanding of antialiasing: yes, one of the most noticeable effects of 3D graphics antialiasing for a common human is that of having smooth edges, but smooth edges are not the primary goal, they are not the only effect and they are not even the most important effect of antialiasing. Understanding antialiasing requires understanding what aliasing is, which is not a completely trivial thing to do (it's not the most difficult thing in the world either, but most people are just afraid of mathematics, so they prefer to stick with the "antialiasing = smooth edges" simplification).

The basic sum up is the following: aliasing is a negative effect which may arise when we try to sample (capture) continuous signals potentially containing high frequencies (the kind of "infinitely complex" data we encounter in real world such as images or sounds) in discrete (non-continuous) ways by capturing the signal values at specific points in time (as opposed to capturing integrals of intervals), i.e. in ways native and natural to computers. Note that the aliasing effect is mathematical and is kind of a "punishment" for our "cheating" which we do by trying to simplify capturing of very complex signals, i.e. aliasing has nothing to do with noise or recording equipment imperfections, and it may occur not only when recording real world data but also when simulating real world, for example during 3D graphics rendering (which simulates capturing real world with a camera). A typical example of such an aliasing effect is a video of car wheels rotating very fast (with high frequency) with a relatively low FPS camera, which then seem to be rotating very slowly and in opposite direction -- a high frequency signal (fast rotating wheels) caused a distortion (illusion of wheels rotating slowly in opposite direction) due to simplified discrete sampling (recording video as a series of photographs taken at specific points in time in relatively low FPS). Similar undesirable effects may appear e.g. on high resolution textures when they're scaled down on a computer screen (so called Moiré effect), but also in sound or any other data. Antialiasing exploits the mathematical Nyquist–Shannon sampling theorem that says that aliasing cannot occur when the sampling frequency is high enough relatively to the highest frequency in the sampled data, i.e. antialiasing tries to prevent aliasing effects typically by either preventing high frequency from appearing in the sampled data (e.g. blurring textures, see MIP mapping) or by increasing the sampling frequency (e.g. multisampling). As a side effect of better sampling we also get things such as smoothly rendered edges etc.
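
To get a concrete feel for the wheel example: a frequency f sampled at frequency fs shows up as the "folded back" frequency f - fs * round(f / fs) (a standard consequence of the sampling theorem). E.g. a wheel spinning at 8 revolutions per second filmed by a 9 FPS camera appears to spin at 8 - 9 * round(8 / 9) = 8 - 9 = -1 revolutions per second, i.e. slowly backwards; capturing the true rotation would require sampling at no less than 2 * 8 = 16 FPS.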

Note that the word anti in antialiasing means that some methods may not prevent aliasing completely, they may just try to suppress it somehow. For example the FXAA (fast approximate antialiasing) method is a postprocessing algorithm which takes an already rendered image and tries to make it look as if it was properly rendered in ways preventing aliasing, however it cannot be 100% successful as it doesn't know the original signal, all it can do is try to give us a good enough approximation.

How to do antialiasing? There are many ways, depending on the kind of data (e.g. the number of dimensions of the signal or what frequencies you expect in it) or required quality (whether you want to prevent aliasing completely or just suppress it). As stated above, most methods make use of the Nyquist–Shannon sampling theorem which states that aliasing cannot occur if the sampling frequency is at least twice as high as the highest frequency in the sampled signal. I.e. if you can make sure your sampling frequency is high enough relatively to the highest frequency in the signal, you will completely prevent aliasing -- you can do this by either processing the input signal with a low pass filter (e.g. blurring an image) or by increasing your sampling frequency (e.g. rendering at higher resolution). Some specific antialiasing methods include:


antivirus_paradox

Antivirus Paradox

{ I think this paradox must have had another established name even before antiviruses, but I wasn't able to find anything. If you know it, let me know. ~drummyfish }

Antivirus paradox is the paradox of someone whose job it is to eliminate a certain undesirable phenomenon actually having an interest in keeping this phenomenon existing so as to keep his job. A typical example is an antivirus company having an interest in the existence of dangerous viruses and malware so as to keep their business running; in fact antivirus companies themselves secretly create and release viruses and malware.

Cases of this behavior are common, e.g. the bind-torture-kill serial killer used to work as a seller of home security alarms who installed alarms for people afraid of being invaded by the bind-torture-killer, and then used his knowledge of the alarms to break into the houses -- a typical capitalist business. It is also a known phenomenon that many firefighters are passionate arsonists because society simply rewards them for fighting fires (as opposed to rewarding them for the lack of fires).

In capitalism and similar systems requiring people to have jobs this paradox prevents progress, i.e. actual elimination of undesirable phenomena, hence capitalism and similar systems are anti-progress. And not only that, the system pressures people into artificially creating new undesirable phenomena (e.g. lack of women in tech and similar bullshit) just to create new bullshit jobs that "fight" these phenomena. In a truly good society where people are not required to have jobs and in which people aim to eliminate work this paradox largely disappears.


apple

Apple

See also http://techrights.org/wiki/Apple%27s_Dark_Side.

Apple is a terrorist organization and one of the biggest American computer fashion corporations, infamously founded by Steve Job$, it creates and sells overpriced, abusive, highly consumerist proprietary electronic devices.


app

App

App is a retarded capitalist name for application; it is used by soydevs, corporations and normalfaggots (similarly to how "coding" is used for programming). This word is absolutely unacceptable and is only to be used to mock these retards.

Anything called an "app" is expected to be bloat, badly designed and, at best, of low quality (and, at worst, malicious).


approximation

Approximation

Approximating means calculating or representing something with lesser than best possible precision -- estimating -- purposefully allowing some margin of error in results and using simpler mathematical models than the most accurate ones: this is typically done in order to save resources (CPU cycles, memory etc.) and reduce complexity so that our projects and analysis stay manageable. Simulating real world on a computer is always an approximation as we cannot capture the infinitely complex and fine nature of the real world with a machine of limited resources, but even within this we need to consider how much, in what ways and where to simplify.

Using approximations however doesn't have to imply decrease in precision of the final result -- approximations very well serve optimization. E.g. approximate metrics help in heuristic algorithms such as A*. Another use of approximations in optimization is as a quick preliminary check for the expensive precise algorithms: e.g. using bounding spheres helps speed up collision detection (if bounding spheres of two objects don't collide, we know they can't possibly collide and don't have to expensively check this).
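
A minimal sketch of such a bounding sphere check in C (a made up example, names are just illustrative):

// returns 1 if the bounding spheres of two objects overlap (i.e. the
// objects MAY collide and we have to run the expensive precise check),
// else 0 (the objects can't possibly collide)
int boundingSpheresCollide(
  double x1, double y1, double z1, double r1,  // sphere 1
  double x2, double y2, double z2, double r2)  // sphere 2
{
  double dx = x1 - x2, dy = y1 - y2, dz = z1 - z2;

  // comparing squared distances avoids computing a square root
  return dx * dx + dy * dy + dz * dz <= (r1 + r2) * (r1 + r2);
}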

Examples of approximations:


arch

Arch Linux

"BTW I use Arch"

Arch Linux is a rolling-release Linux distribution for the "tech-savvy", mostly fedora-wearing weirdos.

Arch is shit at least for two reasons: it has proprietary packages (such as discord) and it uses systemd. Artix Linux is a fork of Arch without systemd.


art

Art

Art is an endeavor that seeks discovery and creation of beauty and primarily relies on intuition. While the most immediate examples of art that come to mind are for example music and painting, even the most scientific and rigorous effort like math and programming becomes art when pushed to the highest level, to the boundaries of current knowledge where intuition becomes important for further development.

See Also


ascii_art

ASCII Art

ASCII art is the art of manually creating graphics and images only out of fixed-width ASCII characters. This means no unicode or extended ASCII characters are allowed, of course. ASCII art is also, strictly speaking, separate from mere ASCII rendering, i.e. automatically rendering a bitmap image with ASCII characters in place of pixels, and ASCII graphics that utilizes the same techniques as ASCII art but can't really be called art (e.g. computer generated diagrams). Pure ASCII art should make no use of color.

This kind of art used to be a great part of the culture of earliest Internet communities for a number of reasons imposed largely by the limitations of old computers -- it could be created easily with a text editor and saved in pure text format, it didn't take much space to store or send over a network and it could be displayed on text-only displays and terminals. The principle itself predates computers, people were already making this kind of image with typewriters. Nevertheless the art survives even to present day and lives on in the hacker culture, in Unix communities, on the Smol Internet etc. An ASCII diagram may very well be embedded e.g. in a comment in a source code to explain some spatial concept -- that's pretty KISS. We, LRS, highly advocate use of ASCII art whenever it's good enough.

Here is a simple 16-shade ASCII palette (but watch out, whether it works will depend on your font): #OVaxsflc/!;,.- . Another one can be e.g.: WM0KXkxocl;:,'. .
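
Such a palette is useful e.g. for the mentioned (non-art) ASCII rendering of images; a tiny sketch in C (just illustrative):

#include <stdio.h>

// prints a grayscale image (one byte per pixel, 0 = black, 255 = white)
// using the 16-shade ASCII palette from above
void printImage(const unsigned char *pixels, int width, int height)
{
  const char *palette = "#OVaxsflc/!;,.- "; // dark to light

  for (int y = 0; y < height; ++y)
  {
    for (int x = 0; x < width; ++x)
      putchar(palette[pixels[y * width + x] / 16]); // 256 / 16 shades

    putchar('\n');
  }
}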

            _,,_  
           /    ';_  
    .     (  0 _/  "-._
    |\     \_ /_==-"""'
    | |:---'   (
     \ \__."    ) Steamer
      '--_ __--'    Duck!
          |L_
          

      []  [][][][][]
      [][][]      [][]
      [][]          []
      []    XX    XX[]
      []      XXXX  []
      [][]          []
      [][][]      [][]
      []  [][][][][]
        
          SAF FTW
          
^
|
|   _.--._                  _.--._
| .'      '.              .'      '.
|/__________\____________/__________\______
|            \          /            \
|             '.      .'              '.
|               `'--'`                  `'-
|
V

   ("\/") ("\/") ("\/")
   \    / \    / \    /
    \  /   \  /   \  /
     \/     \/     \/

See Also


ascii

ASCII

ASCII (American standard code for information interchange) is a relatively simple standard for digital encoding of text that's one of the most basic and probably the most common format used for this purpose. For its simplicity and inability to represent characters of less common alphabets it is nowadays quite often replaced with more complex encodings such as UTF-8, which are however almost always backwards compatible with ASCII (interpreting UTF-8 as ASCII will give somewhat workable results), and ASCII itself is also normally supported everywhere. ASCII is the suckless/LRS/KISS character encoding, recommended and good enough for most programs.

The ASCII standard assigns a 7 bit code to each basic text character which gives it room for 128 characters -- these include lowercase and uppercase English alphabet, decimal digits, other symbols such as a question mark, comma or brackets, plus a few special control characters that represent instructions such as carriage return which are however often obsolete nowadays. Due to most computers working with 8 bit bytes, most platforms store ASCII text with 1 byte per character; the extra bit creates room for extending ASCII by another 128 characters (or creating a variable width encoding such as UTF-8). These extensions include unofficial ones such as VISCII (ASCII with additional Vietnamese characters) and more official ones, most notably ISO 8859: a group of standards by ISO for various languages, e.g. ISO 8859-1 for western European languages, ISO 8859-5 for Cyrillic languages etc.

The ordering of characters has been kind of cleverly designed to make working with the encoding easier, for example digits start with 011 and the rest of the bits correspond to the digit itself (0000 is 0, 0001 is 1 etc.). Corresponding upper and lower case letters only differ in the 6th bit, so you can easily convert between upper and lower case by flipping that bit, e.g. as letter ^ 0x20. { I think there are a few missed opportunities though, e.g. in not putting digits right before letters. That way it would be very easy to print hexadecimal (and all bases up to a lot) simply as putchar('0' + x). ~drummyfish }
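
These tricks look e.g. like this in C:

#include <stdio.h>

int main(void)
{
  putchar('a' ^ 0x20); // toggles the case bit: prints A
  putchar('A' ^ 0x20); // prints a
  putchar('\n');

  int value = '7' - '0'; // numeric value of a digit character

  printf("%d\n",value); // prints 7

  return 0;
}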

ASCII was approved as an ANSI standard in 1963 and has since then undergone a number of revisions. The current one is summed up by the following table:

dec hex oct bin symbol
000 00 000 0000000 NUL: null
001 01 001 0000001 SOH: start of heading
002 02 002 0000010 STX: start of text
003 03 003 0000011 ETX: end of text
004 04 004 0000100 EOT: end of transmission
005 05 005 0000101 ENQ: enquiry
006 06 006 0000110 ACK: acknowledge
007 07 007 0000111 BEL: bell
008 08 010 0001000 BS: backspace
009 09 011 0001001 TAB: tab (horizontal)
010 0a 012 0001010 LF: new line
011 0b 013 0001011 VT: tab (vertical)
012 0c 014 0001100 FF: new page
013 0d 015 0001101 CR: carriage return
014 0e 016 0001110 SO: shift out
015 0f 017 0001111 SI: shift in
016 10 020 0010000 DLE: data link escape
017 11 021 0010001 DC1: device control 1
018 12 022 0010010 DC2: device control 2
019 13 023 0010011 DC3: device control 3
020 14 024 0010100 DC4: device control 4
021 15 025 0010101 NAK: negative acknowledge
022 16 026 0010110 SYN: sync idle
023 17 027 0010111 ETB: end of block
024 18 030 0011000 CAN: cancel
025 19 031 0011001 EM: end of medium
026 1a 032 0011010 SUB: substitute
027 1b 033 0011011 ESC: escape
028 1c 034 0011100 FS: file separator
029 1d 035 0011101 GS: group separator
030 1e 036 0011110 RS: record separator
031 1f 037 0011111 US: unit separator
032 20 040 0100000 : space
033 21 041 0100001 !
034 22 042 0100010 "
035 23 043 0100011 #
036 24 044 0100100 $
037 25 045 0100101 %
038 26 046 0100110 &
039 27 047 0100111 '
040 28 050 0101000 (
041 29 051 0101001 )
042 2a 052 0101010 *
043 2b 053 0101011 +
044 2c 054 0101100 ,
045 2d 055 0101101 -
046 2e 056 0101110 .
047 2f 057 0101111 /
048 30 060 0110000 0
049 31 061 0110001 1
050 32 062 0110010 2
051 33 063 0110011 3
052 34 064 0110100 4
053 35 065 0110101 5
054 36 066 0110110 6
055 37 067 0110111 7
056 38 070 0111000 8
057 39 071 0111001 9
058 3a 072 0111010 :
059 3b 073 0111011 ;
060 3c 074 0111100 <
061 3d 075 0111101 =
062 3e 076 0111110 >
063 3f 077 0111111 ?
064 40 100 1000000 @
065 41 101 1000001 A
066 42 102 1000010 B
067 43 103 1000011 C
068 44 104 1000100 D
069 45 105 1000101 E
070 46 106 1000110 F
071 47 107 1000111 G
072 48 110 1001000 H
073 49 111 1001001 I
074 4a 112 1001010 J
075 4b 113 1001011 K
076 4c 114 1001100 L
077 4d 115 1001101 M
078 4e 116 1001110 N
079 4f 117 1001111 O
080 50 120 1010000 P
081 51 121 1010001 Q
082 52 122 1010010 R
083 53 123 1010011 S
084 54 124 1010100 T
085 55 125 1010101 U
086 56 126 1010110 V
087 57 127 1010111 W
088 58 130 1011000 X
089 59 131 1011001 Y
090 5a 132 1011010 Z
091 5b 133 1011011 [
092 5c 134 1011100 \
093 5d 135 1011101 ]
094 5e 136 1011110 ^
095 5f 137 1011111 _
096 60 140 1100000 `: backtick
097 61 141 1100001 a
098 62 142 1100010 b
099 63 143 1100011 c
100 64 144 1100100 d
101 65 145 1100101 e
102 66 146 1100110 f
103 67 147 1100111 g
104 68 150 1101000 h
105 69 151 1101001 i
106 6a 152 1101010 j
107 6b 153 1101011 k
108 6c 154 1101100 l
109 6d 155 1101101 m
110 6e 156 1101110 n
111 6f 157 1101111 o
112 70 160 1110000 p
113 71 161 1110001 q
114 72 162 1110010 r
115 73 163 1110011 s
116 74 164 1110100 t
117 75 165 1110101 u
118 76 166 1110110 v
119 77 167 1110111 w
120 78 170 1111000 x
121 79 171 1111001 y
122 7a 172 1111010 z
123 7b 173 1111011 {
124 7c 174 1111100 |
125 7d 175 1111101 }
126 7e 176 1111110 ~
127 7f 177 1111111 DEL

See Also


assembly

Assembly

Assembly (also ASM) is, for any given hardware computing platform (ISA, basically a CPU architecture), the lowest level programming language that expresses typically a linear, unstructured sequence of CPU instructions -- it maps (mostly) 1:1 to machine code (the actual binary CPU instructions) and basically only differs from the actual machine code by utilizing a more human readable form (it gives human friendly nicknames, or mnemonics, to different combinations of 1s and 0s). Assembly is converted by an assembler into the machine code. Assembly is similar to bytecode, but bytecode is meant to be interpreted or used as an intermediate representation in compilers while assembly represents actual native code run by hardware. In ancient times when there were no higher level languages (like C or Fortran) assembly was used to write computer programs -- nowadays most programmers no longer write in assembly (majority of zoomer "coders" probably never even touch anything close to it) because it's hard (takes a long time) and not portable, however programs written in assembly are known to be extremely fast as the programmer has absolute control over every single instruction (of course that is not to say you can't fuck up and write a slow program in assembly).

Assembly is NOT a single language, it differs for every architecture, i.e. every model of CPU has potentially different architecture, understands a different machine code and hence has a different assembly (though there are some standardized families of assembly like x86 that work on a wide range of CPUs); therefore assembly is not portable (i.e. the program won't generally work on a different type of CPU or under a different OS)! And even the same kind of assembly language may have several different syntax formats which may differ in comment style, order of writing arguments and even instruction abbreviations (e.g. x86 can be written in Intel and AT&T syntax). For the reason of non-portability (and also for the fact that "assembly is hard") you shouldn't write your programs directly in assembly but rather in a bit higher level language such as C (which can be compiled to any CPU's assembly). However you should know at least the very basics of programming in assembly as a good programmer will come in contact with it sometimes, for example during hardcore optimization (many languages offer an option to embed inline assembly in specific places), debugging, reverse engineering, when writing a C compiler for a completely new platform or even when designing one's own new platform. You should write at least one program in assembly -- it gives you a great insight into how a computer actually works and you'll get a better idea of how your high level programs translate to machine code (which may help you write better optimized code) and WHY your high level language looks the way it does.

The most common assembly languages you'll encounter nowadays are x86 (used by most desktop CPUs) and ARM (used by most mobile CPUs) -- both are used by proprietary hardware and though an assembly language itself cannot (as of yet) be copyrighted, the associated architectures may be "protected" (restricted) e.g. by patents. RISC-V on the other hand is an "open" alternative, though not yet so widespread. Other assembly languages include e.g. AVR (8bit CPUs used e.g. by some Arduinos) and PowerPC.

To be precise, a typical assembly language is actually more than a set of nicknames for machine code instructions, it may offer helpers such as macros (something akin to the C preprocessor), pseudoinstructions (commands that look like instructions but actually translate to e.g. multiple instructions), comments, directives, named labels for jumps (as writing literal jump addresses would be extremely tedious) etc.

Assembly is extremely low level, so you get no handholding or much programming "safety" (apart from e.g. CPU operation modes), you have to do everything yourself -- you'll be dealing with things such as function call conventions, interrupts, syscalls and their conventions, how many CPU cycles different instructions take, memory segments, endianness, raw addresses/goto jumps, call frames etc.

Note that just replacing assembly mnemonics with binary machine code instructions is not yet enough to make an executable program! More things have to be done such as linking libraries and converting the result to some executable format such as elf which contains things like a header with metainformation about the program etc.

Typical Assembly Language

Assembly languages are usually unstructured, i.e. there are no control structures such as if or while statements: these have to be manually implemented using labels and jump (goto) instructions. There may exist macros that mimic control structures. The typical look of an assembly program is however still a single column of instructions with arguments, one per line, each representing one machine instruction.

The working of the language reflects the actual hardware architecture -- most architectures are based on registers so usually there is a small number (something like 16) of registers which may be called something like R0 to R15, or A, B, C etc. Sometimes registers may even be subdivided (e.g. in x86 there is an eax 32bit register and half of it can be used as the ax 16bit register). These registers are the fastest available memory (faster than the main RAM memory) and are used to perform calculations. Some registers are general purpose and some are special: typically there will be e.g. the FLAGS register which holds various 1bit results of performed operations (e.g. overflow, zero result etc.). Some instructions may only work with some registers (e.g. there may be kind of a "pointer" register used to hold addresses along with instructions that work with this register, which is meant to implement arrays). Values can be moved between registers and the main memory (with instructions called something like move, load or store).

Writing instructions works similarly to how you call a function in a high level language: you write its name and then its arguments, but in assembly things are more complicated because an instruction may for example only allow certain kinds of arguments -- it may e.g. allow a register and an immediate constant (kind of a number literal/constant), but not e.g. two registers. You have to read the documentation for each instruction. While in a high level language you may write general expressions as arguments (like myFunc(x + 2 * y,myFunc2())), here you can only pass specific values.

There are also no complex data types, assembly only works with numbers of different size, e.g. 16 bit integer, 32 bit integer etc. Strings are just sequences of numbers representing ASCII values, it is up to you whether you implement null terminated strings or Pascal style strings. Pointers are just numbers representing addresses. It is up to you whether you interpret a number as signed or unsigned (some instructions treat numbers as unsigned, some as signed).

Instructions are typically written as three-letter abbreviations and follow some unwritten naming conventions so that different assembly languages at least look similar. Common instructions found in most assembly languages are for example:

How To

On Unices the objdump utility from GNU binutils can be used to disassemble compiled programs, i.e. view the instructions of the program in assembly (other tools like ndisasm can also be used). Use it e.g. as:

objdump -d my_compiled_program

Let's now write a simple Unix program in x86 assembly (AT&T syntax). Write the following source code into a file named e.g. program.s:

.global   _start         # include the symbol in object file

str:
.ascii    "it works\n"   # the string data

.text 
_start:                  # execution starts here
  mov     $5,   %rbx     # store loop counter in rbx

.loop:
  # make a Linux "write" syscall:
                         # args to syscall will be passed in regs.
  mov     $1,   %rax     # says syscalls type (1 = write)
  mov     $1,   %rdi     # says file to write to (1 = stdout)
  mov     $str, %rsi     # says the address of the string to write
  mov     $9,   %rdx     # says how many bytes to write
  syscall                # makes the syscall

  sub     $1,   %rbx     # decrement loop counter
  cmp     $0,   %rbx     # compare it to 0
  jne     .loop          # if not equal, jump to start of the loop

  # make an "exit" syscall to properly terminate:
  mov     $60,  %rax     # says syscall type (60 = exit)
  mov     $0,   %rdi     # says return value (0 = success)
  syscall                # makes the syscall

The program just writes out "it works" five times: it uses a simple loop and a Unix system call for writing a string to standard output (i.e. it won't work on Windows and similar shit).

Now assembly source code can be manually assembled into an executable by running assemblers like as or nasm to obtain the intermediate object file and then linking it with ld, but to assemble the above written code simply we may just use the gcc compiler which does everything for us:

gcc -nostdlib -no-pie -o program program.s

Now we can run the program with

./program

And we should see

it works
it works
it works
it works
it works

As an exercise you can objdump the final executable and see that the output basically matches the original source code. Furthermore try to disassemble some primitive C programs and see how a compiler e.g. makes if statements or functions into assembly.

Example

Let's take the following C code:

#include <stdio.h>

char incrementDigit(char d)
{
  return // remember this is basically an if statement
    d >= '0' && d < '9' ?
    d + 1 :
    '?';
}

int main(void)
{
  char c = getchar();
  putchar(incrementDigit(c));
  return 0;
}

We will now compile it to different assembly languages (you can do this e.g. with gcc -S my_program.c). This assembly will be pretty long as it will contain boilerplate and implementations of getchar and putchar from standard library, but we'll only be looking at the assembly corresponding to the above written code. Also note that the generated assembly will probably differ between compilers, their versions, flags such as optimization level etc. The code will be manually commented.

{ I used this online tool: https://godbolt.org. ~drummyfish }

The x86 assembly may look like this:

incrementDigit:
  pushq   %rbp                   # save base pointer
  movq    %rsp, %rbp             # move base pointer to stack top
  movl    %edi, %eax             # move argument to eax
  movb    %al, -4(%rbp)          # and move it to local var.
  cmpb    $47, -4(%rbp)          # compare it to '0'
  jle     .L2                    # if <=, jump to .L2
  cmpb    $56, -4(%rbp)          # else compare to '9'
  jg      .L2                    # if >, jump to .L2
  movzbl  -4(%rbp), %eax         # else get the argument
  addl    $1, %eax               # add 1 to it
  jmp     .L4                    # jump to .L4
.L2:
  movl    $63, %eax              # move '?' to eax (return val.)
.L4:
  popq    %rbp                   # restore base pointer
  ret
  
main:
  pushq   %rbp                   # save base pointer
  movq    %rsp, %rbp             # move base pointer to stack top
  subq    $16, %rsp              # make space on stack
  call    getchar                # push ret. addr. and jump to func.
  movb    %al, -1(%rbp)          # store return val. to local var.
  movsbl  -1(%rbp), %eax         # move with sign extension
  movl    %eax, %edi             # arg. will be passed in edi
  call    incrementDigit
  movsbl  %al, %eax              # sign extend return val.
  movl    %eax, %edi             # pass arg. in edi again
  call    putchar
  movl    $0, %eax               # values are returned in eax
  leave
  ret

The ARM assembly may look like this:

incrementDigit:
  sub   sp, sp, #16              // make room on stack
  strb  w0, [sp, 15]             // store argument from w0 to local var.
  ldrb  w0, [sp, 15]             // load back to w0
  cmp   w0, 47                   // compare to '0'
  bls   .L2                      // branch to .L2 if <=
  ldrb  w0, [sp, 15]             // load argument again to w0
  cmp   w0, 56                   // compare to '9'
  bhi   .L2                      // branch to .L2 if >
  ldrb  w0, [sp, 15]             // load argument again to w0
  add   w0, w0, 1                // add 1 to it
  and   w0, w0, 255              // mask out lowest byte
  b     .L3                      // branch to .L3
.L2:
  mov   w0, 63                   // set w0 (ret. value) to '?'
.L3:
  add   sp, sp, 16               // shift stack pointer back
  ret
  
main:
  stp   x29, x30, [sp, -32]!     // shift stack and store x regs
  mov   x29, sp
  bl    getchar
  strb  w0, [sp, 31]             // store w0 (ret. val.) to local var. 
  ldrb  w0, [sp, 31]             // load it back to w0
  bl    incrementDigit
  and   w0, w0, 255              // mask out lowest byte
  bl    putchar
  mov   w0, 0                    // set ret. val. to 0
  ldp   x29, x30, [sp], 32       // restore x regs
  ret

The RISC-V assembly may look like this:

incrementDigit:
  addi    sp,sp,-32              # shift stack (make room)
  sw      s0,28(sp)              # save frame pointer
  addi    s0,sp,32               # shift frame pointer
  mv      a5,a0                  # get arg. from a0 to a5
  sb      a5,-17(s0)             # save it to local var.
  lbu     a4,-17(s0)             # get it to a4
  li      a5,47                  # load '0' to a5
  bleu    a4,a5,.L2              # branch to .L2 if a4 <= a5
  lbu     a4,-17(s0)             # load arg. again
  li      a5,56                  # load '9' to a5
  bgtu    a4,a5,.L2              # branch to .L2 if a4 > a5
  lbu     a5,-17(s0)             # load arg. again
  addi    a5,a5,1                # add 1 to it
  andi    a5,a5,0xff             # mask out the lowest byte
  j       .L3                    # jump to .L3
.L2:
  li      a5,63                  # load '?'
.L3:
  mv      a0,a5                  # move result to ret. val.
  lw      s0,28(sp)              # restore frame pointer
  addi    sp,sp,32               # pop stack
  jr      ra                     # jump to addr in ra
  
main:
  addi    sp,sp,-32              # shift stack (make room)
  sw      ra,28(sp)              # store ret. addr on stack
  sw      s0,24(sp)              # store stack frame pointer on stack
  addi    s0,sp,32               # shift frame pointer
  call    getchar
  mv      a5,a0                  # copy return val. to a5
  sb      a5,-17(s0)             # move a5 to local var
  lbu     a5,-17(s0)             # load it again to a5
  mv      a0,a5                  # move it to a0 (func. arg.)
  call    incrementDigit
  mv      a5,a0                  # copy return val. to a5
  mv      a0,a5                  # get it back to a0 (func. arg.)
  call    putchar
  li      a5,0                   # load 0 to a5
  mv      a0,a5                  # move it to a0 (ret. val.)
  lw      ra,28(sp)              # restore return addr.
  lw      s0,24(sp)              # restore frame pointer
  addi    sp,sp,32               # pop stack
  jr      ra                     # jump to addr in ra


assertiveness

Assertiveness

Assertiveness is a euphemism for being a dick.


atan

Arcus Tangent

Arcus tangent, written as atan or tan^-1, is the inverse function to the tangent function. For given argument x (any real number) it returns a number y (from -pi/2 to pi/2) such that tan(y) = x.

Approximation: Near 0 atan(x) can very roughly be approximated simply by x. For a large argument atan(x) can be approximated by pi/2 - 1/x (as atan's limit is pi/2). The following formula { created by me ~drummyfish } approximates atan with a polynomial for a non-negative argument with error smaller than 2%:

atan(x) ~= (x * (2.96088 + 4.9348 * x))/(3.2 + 3.88496 * x + pi * x^2)
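
In C the formula may be implemented e.g. like this (extended to negative arguments by exploiting the fact that atan is an odd function):

// approximates atan (argument and result in radians) with the above
// formula; error should stay within about 2%
double atanApprox(double x)
{
  if (x < 0)
    return -1 * atanApprox(-1 * x); // atan(-x) = -atan(x)

  return (x * (2.96088 + 4.9348 * x)) /
    (3.2 + 3.88496 * x + 3.14159265 * x * x);
}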

            | y
       pi/2 +                  
            |       _..---''''''
            |   _.''
            | .'
-----------.+'-+--+--+--+--+--> x
        _.' |0 1  2  3  4  5
     _-'    |
.--''       |
      -pi/2 +
            |

plot of atan(x)


atheism

Atheism

"In this moment I am euphoric ..." --some retarded atheist

An atheist is someone who doesn't believe in god or any other similar supernatural beings.

An especially annoying kind is the reddit atheist who will DESTROY YOU WITH FACTS AND LOGIC^(TM). These atheists are 14 year old children who think they've discovered the secret of the universe and have to let the whole world know they're atheists who will destroy you with their 200 IQ logic and knowledge of all 10 cognitive biases and argument fallacies, while in fact they reside at the mount stupid and many times involuntarily appear on other subreddits such as r/iamverysmart and r/cringe. They masturbate to Richard Dawkins, love to read soyentific studiiiiiies about how race has no biological meaning and think that religion is literally Hitler. They like to pick easy targets such as flatearthers and cyberbully them on YouTube with the power of SCIENCE and their enormously large thesaurus (they will never use a word that's among the 100000 most common English words). They are so cringe you want to kill yourself, but their discussions are sometimes entertaining to read with a bowl of popcorn.

Such a specimen of atheist is one of the best quality examples of a pseudosceptic. See also this: https://www.debunkingskeptics.com/Contents.htm.

On a bit more serious note: we've all been there, most people in their teens think they're literal Einsteins and then later in life cringe back on themselves. However, some don't grow out of it and stay arrogant, ignorant fucks for their whole lives. The principal mistake of the stance they retain is they try to apply "science" (or whatever it means in their world) to EVERYTHING and reject any other approach to solving problems -- of course, science (the real one) is great, but it's just a tool, and just like you can't fix every problem with a hammer, you can't approach every problem with science. In your daily life you make a million of unscientific decisions and it would be bad to try to apply science to them; you cross the street not because you've read a peer-reviewed paper about it being the most scientifically correct thing to do, but because you feel like doing it, because you believe the drivers will stop and won't run you over. Beliefs, intuition, emotion, non-rationality and even spirituality are and have to be part of life, and it's extremely stupid to oppose these concepts just out of principle. With that said, there's nothing wrong about being a well behaved man who just doesn't feel a belief in any god in his heart, just you know, don't be an idiot.

Among the greatest minds it is hard to find true atheists, even though they typically have a personal and not easy to describe faith. Newton was a Christian. Einstein often used the word "God" instead of "nature" or "universe"; even though he said he didn't believe in the traditional personal God, he also said that the laws of physics were like books in a library which must have obviously been written by someone or something we can't comprehend. Nikola Tesla said he was "deeply religious, though not in the orthodox sense". There are also very hardcore religious people such as Larry Wall, the inventor of the Perl language, who even planned to be a Christian missionary. The "true atheists" are mostly second grade "scientists" who make a career out of the pose and make a living by writing books about atheism rather than being scientists.

See Also


audiophilia

Audiophilia

Audiophilia is a mental disorder, similar to other diseases such as distrohopping and chronic ricing, that makes one scared of low or normal quality audio. Audiophiles are scared of lossy compression and so harm society by wasting storage space. Audiophilia, similarly to e.g. the business with mechanical keyboards, is the astrology of technology, an arbitrarily invented bullshit business creating an artificial need that makes people wanna buy golden cables and similar shit in belief that it will make their life happier, perpetuating consumerism and capitalism.


autoupdate

Autoupdate

Autoupdate is a malicious software feature that frequently remotely modifies software on the user's device without asking, sometimes silently and many times in a forced manner without the possibility to refuse this modification (typically in proprietary software). This is a manifestation of update culture. These remote software modifications are called "updates" to make the user think they are a good thing, but in fact they usually introduce more bugs, bloat, security vulnerabilities, annoyance (forced reboots etc.) and malware (even in "open source", see e.g. the many projects on GitHub that introduced intentional malware targeted at Russian users during the Russia-Ukraine war).


avpd

Avoidant Personality Disorder

TODO

In many cases avoiding the problem really is the objectively best solution.


axiom_of_choice

Axiom Of Choice

In mathematics (specifically set theory) axiom of choice is a possible axiom which basically states we can arbitrarily choose elements of sets and which is famous for being controversial and problematic because it causes trouble both when we accept it and when we reject it. Note that this topic can go to a great depth and lead to philosophical debates, there is a huge rabbit hole and mathematicians can talk about this for hours; here we'll only state the very basic and quite simplified things, mostly for those who aren't professional mathematicians but need some overview of mathematics (e.g. programmers).

Indeed, what really IS the axiom of choice? It is an axiom, i.e. something that we can't prove but can either accept or reject as a basic fact so that we can use it to prove things. Informally it says that given any collection of sets (even an infinite collection of infinitely large sets), we can make an arbitrary selection of one element from each set. More mathematically it says: if we have a collection of sets, there always exists a function f such that for any set S from the collection f(S) is an element of S.

This doesn't sound weird, does it? Well, in many normal situations it isn't. For example if we have finitely many sets, we can simply write out each element of the set, we don't need to define any selection function, so we don't need axiom of choice to make our choice of elements here. But also if we have infinitely many sets that are well ordered (we can compare elements), for example infinitely many sets of natural numbers, we can simply define a function that takes e.g. the smallest number from each set -- here we don't need axiom of choice either. The issues start if we have e.g. infinitely many sets of real numbers (which can't be well ordered without the axiom of choice, consider that e.g. open intervals don't have a lowest number) -- here we can't say how a function should select one element from each set, so we have to either accept axiom of choice (we say it simply can be done "somehow", e.g. by writing each element out on an infinitely large paper) or reject it (we say it can't be done). Here it is again the case that what's normally completely non-problematic starts to get very weird once you involve infinity.

Why is it problematic? Once you learn about axiom of choice, your first question will probably be why it should pose any problems if it just seems like an obvious fact. Well, it turns out it leads to strange things. If we accept axiom of choice, then some weird things happen, most famously e.g. the Banach-Tarski paradox which uses the axiom of choice to prove that you can disassemble a sphere into finitely many pieces, then move and rotate them so that they create TWO new spheres, each one identical to the original (i.e. you duplicate the original sphere). But if we reject the axiom of choice, other weird things happen, for example we can't prove that every vector space has a basis -- it seems quite elementary that every vector space should have a basis, but this can't be proven without the axiom of choice and in fact accepting this implies the axiom of choice is true. Besides this a great number of proofs simply don't work without the axiom of choice. So essentially either way things get weird, whether we accept axiom of choice or not.

So what do mathematicians do? How do they deal with this and why don't they kill themselves? Well, in reality most of them are pretty chill and don't really care, they try to avoid it if they can (their proof is kind of stronger if it relies on fewer axioms) but they accept it if they really need it for a specific proof. Many elementary things in mathematics actually rely on axiom of choice, so there's no fuss when someone uses it, it's very normal. Turns out axiom of choice is more of something they argue over a beer, they usually disagree about whether it is INTUITIVELY true or false, but that doesn't really affect their work.


backgammon

Backgammon

Backgammon is an old, very popular board game of both skill and chance (dice rolling). It often involves betting and is especially popular in countries of the Near East such as Egypt, Syria etc. (where it is kind of what chess is to our western world or what shogi and go are to Asia). Similarly to chess, go, shogi and other traditional board games backgammon is considered by us to be one of the best games as it is owned by no one, highly free, cheap, simple yet deep and entertaining and can be played even without a computer, just with a bunch of rocks; compared to the other mentioned board games backgammon is unique by involving an element of chance and being played on what is essentially a 1 dimensional board; it is also relatively simple and therefore noob-friendly and possibly more relaxed (if you lose you can just blame it on rolling bad numbers).

TODO


backpropagation

Backpropagation

{ Dunno if this is completely correct, I'm learning this as I'm writing it. There may be errors. ~drummyfish }

Backpropagation, or backprop, is an algorithm, based on the chain rule of derivation, used in training neural networks; it computes the partial derivatives (i.e. the gradient) of the network's error function so that we can perform a gradient descent, i.e. update the weights towards lowering the network's error. It computes the analytical derivative (theoretically you could estimate a derivative numerically, but that's less accurate and can be too computationally expensive). Backpropagation is one of the most common methods for training neural networks but it is NOT the only possible one -- there are many more such as evolutionary programming. It is called backpropagation because it works backwards and propagates the error from the output towards the input, due to how the chain rule works, and it's efficient thanks to reusing already computed values.

Details

Consider the following neural network:

     w000     w100
  x0------y0------z0
    \    /  \    /  \
     \  /    \  /    \
      \/w010  \/w110  \_E
      /\w001  /\w101  /
     /  \    /  \    /
    /    \  /    \  /
  x1------y1------z1
     w011     w111

It has an input layer (neurons x0, x1), a hidden layer (neurons y0, y1) and an output layer (neurons z0, z1). For simplicity there are no biases (biases can easily be added as input neurons that are always on). At the end there is a total error E computed from the network's output against the desired output (training data).

Let's say the total error is computed as the squared error: E = squared_error(z0) + squared_error(z1) = 1/2 * (z0 - z0_desired)^2 + 1/2 * (z1 - z1_desired)^2.

We can see each non-input neuron as a function. E.g. the neuron z0 is a function z0(x) = z0(a(z0s(x))) where z0s is the weighted sum of the neuron's inputs (here z0s = w100 * y0 + w110 * y1) and a is the activation function (here we suppose the sigmoid function, whose handy property is that its derivative can be computed from its own output).

If you don't know what the fuck is going on see neural networks first.

What is our goal now? To find the partial derivative of the whole network's total error function (at the current point defined by the weights), or in other words the gradient at the current point. I.e. from the point of view of the total error (which is just a number output by this system), the network is a function of 8 variables (weights w000, w001, ...) and we want to find a derivative of this function with respect to each of these variables (that's what a partial derivative is) at the current point (i.e. with current values of the weights). This will, for each of these variables, tell us how much (at what rate and in which direction) the total error changes if we change that variable by a certain amount. Why do we need to know this? So that we can do a gradient descent, i.e. this information is kind of a direction in which we want to move (change the weights and biases) towards lowering the total error (making the network compute results which are closer to the training data). So all in all the goal is to find derivatives (just numbers, slopes) with respect to w000, w001, w010, ... w111.

Could we do this without backpropagation? Yes -- we can use numerical algorithms to estimate derivatives, the simplest one would be to just try to change each weight, one by one, by some small number, let's say dw, and see how much such change changes the output error. I.e. we would sample the error function in all directions which could give us an idea of the slope in each direction. However this would be pretty slow, we would have to reevaluate the whole neural network as many times as there are weights. Backpropagation can do this much more efficiently.

Backpropagation is based on the chain rule, a rule of derivation that equates the derivative of a function composition (functions inside other functions) to a product of derivatives. This is important because by converting the derivatives to a product we will be able to reuse the individual factors and so compute very efficiently and quickly.

Let's write the derivative of f(x) with respect to x as D{f(x),x}. The chain rule says that:

D{f(g(x)),x} = D{f(g(x)),g(x)} * D{g(x),x}

Notice that this can be applied to any number of composed functions, the product chain just becomes longer.
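
For example (a small worked instance of the rule, assuming basic derivatives): D{sin(x^2),x} = D{sin(x^2),x^2} * D{x^2,x} = cos(x^2) * 2 * x.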

Let's get to the computation. Backpropagation works by going "backwards" from the output towards the input. So, let's start by computing the derivative against the weight w100. It will be a specific number; let's call it 'w100. Derivative of a sum is equal to the sum of derivatives:

'w100 = D{E,w100} = D{squared_error(z0),w100} + D{squared_error(z1),w100} = D{squared_error(z0),w100} + 0

(The second part of this sum became 0 because with respect to w100 it is a constant.)

Now we can continue and utilize the chain rule:

'w100 = D{E,w100} = D{squared_error(z0),w100} = D{squared_error(z0(a(z0s))),w100} = D{squared_error(z0),z0} * D{a(z0s),z0s} * D{z0s,w100}

We'll now skip the intermediate steps, they should be easy if you can do derivatives. The final result is:

'w100 = (z0 - z0_desired) * (z0 * (1 - z0)) * y0

Now we have computed the derivative against w100. In the same way we can compute 'w101, 'w110 and 'w111 (the weights leading to the output layer).

Now let's compute the derivative with respect to w000, i.e. the number 'w000. We will proceed similarly but the computation will be different because the weight w000 affects both output neurons (z0 and z1). Again, we'll use the chain rule.

'w000 = D{E,w000} = D{E,y0} * D{a(y0s),y0s} * D{y0s,w000}

D{E,y0} = D{squared_error(z0),y0} + D{squared_error(z1),y0}

Let's compute the first part of the sum:

D{squared_error(z0),y0} = D{squared_error(z0),z0s} * D{z0s,y0}

D{squared_error(z0),z0s} = D{squared_error(z0),z0} * D{a(z0s),z0s}

Note that this last equation uses already computed values which we can reuse. Finally:

D{z0s,y0} = D{w100 * y0 + w110 * y1,y0} = w100

And we get:

D{squared_error(z0),y0} = D{squared_error(z0),z0} * D{a(z0s),z0s} * w100

And so on until we get all the derivatives.

Once we have them, we multiply them all by some value (learning rate, a distance by which we move in the computed direction) and subtract them from the current weights by which we perform the gradient descent and lower the total error.

Note that here we've only used one training sample, i.e. the error E was computed from the network against a single desired output. If more examples are used in a single update step, they are usually somehow averaged.
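
Here is a minimal C sketch of the whole thing for the network drawn above (note this is our own illustrative code: the specific weight values, the sigmoid activation function and the learning rate are arbitrary choices, not something prescribed by the algorithm):

#include <stdio.h>
#include <math.h> // compile e.g. with: gcc backprop.c -lm

double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

// w0[i][j]: weight from input neuron i to hidden neuron j (the article's w0ij),
// w1[j][k]: weight from hidden neuron j to output neuron k (the article's w1jk)
double w0[2][2] = {{0.1,0.2},{0.3,0.4}},
       w1[2][2] = {{0.5,0.6},{0.7,0.8}};

double x[2] = {1.0,0.5}, desired[2] = {0.0,1.0}; // single training sample

double y[2], z[2]; // hidden and output neuron outputs

double forward(void) // evaluates the network, returns the total error E
{
  for (int j = 0; j < 2; ++j)
    y[j] = sigmoid(w0[0][j] * x[0] + w0[1][j] * x[1]);

  for (int k = 0; k < 2; ++k)
    z[k] = sigmoid(w1[0][k] * y[0] + w1[1][k] * y[1]);

  return 0.5 * ((z[0] - desired[0]) * (z[0] - desired[0]) +
                (z[1] - desired[1]) * (z[1] - desired[1]));
}

int main(void)
{
  for (int step = 0; step < 500; ++step)
  {
    double e = forward();

    if (step % 100 == 0)
      printf("error: %f\n",e);

    double dz[2], dy[2]; // derivatives of E against the neuron sums

    for (int k = 0; k < 2; ++k) // output layer, the 'w1.. derivation above
      dz[k] = (z[k] - desired[k]) * z[k] * (1 - z[k]);

    for (int j = 0; j < 2; ++j) // hidden layer, reusing dz (the chain rule)
      dy[j] = (dz[0] * w1[j][0] + dz[1] * w1[j][1]) * y[j] * (1 - y[j]);

    for (int j = 0; j < 2; ++j) // gradient descent step, 0.5 = learning rate
      for (int k = 0; k < 2; ++k)
        w1[j][k] -= 0.5 * dz[k] * y[j];

    for (int i = 0; i < 2; ++i)
      for (int j = 0; j < 2; ++j)
        w0[i][j] -= 0.5 * dy[j] * x[i];
  }

  return 0;
}

Running it should show the printed error steadily decreasing towards zero as the weights converge.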


bbs

BBS

{ I am too young to remember this shit so I'm just writing what I've read on the web. ~drummyfish }

Bulletin board system (BBS) is, or rather used to be, a kind of server that hosts a community of users, who connect to it via terminal and exchange messages and files, play games and otherwise interact -- BBSes were mainly popular before the invention of the web, i.e. from about 1978 to the mid 1990s, however some still exist today. BBSes are powered by special BBS software and the people who run them are called sysops.

Back then people connected to BBSes via dial-up modems and connecting was much more complicated than connecting to a server today: you had to literally dial the number of the BBS and you could only connect if the BBS had a free line. Early BBSes weren't normally connected through the Internet but rather through other networks like UUCP working through phone lines. I.e. a BBS would have a certain number of modems that defined how many people could connect at once. It was also expensive to make calls into other countries so BBSes were more of a local thing, people would connect to their local BBSes. Furthermore these things often ran on non-multitasking systems like DOS so allowing multiple users meant the need for having multiple computers. The boomers who used BBSes talk about great adventure and a sense of intimacy, connecting to a BBS meant the sysop would see you connecting, he might start chatting with you etc. Nowadays the few existing BBSes use protocols such as telnet, nevertheless there are apparently about 20 known dial-up ones in North America. Some BBSes evolved into more modern communities based e.g. on public access Unix systems -- for example SDF.

A BBS was usually focused on a certain topic such as technology, fantasy roleplay, dating, warez etc., they would typically greet the users with a custom themed ANSI art welcome page upon login -- it was pretty cool.

{ There's some documentary on BBS that's supposed to give you an insight into this shit, called literally BBS: The documentary. It's about 5 hours long tho. ~drummyfish }

The first BBS was CBBS (computerized bulletin board system) created by Ward Christensen and Randy Suess in 1978 during a blizzard storm -- it was pretty primitive, e.g. it only allowed one user to be connected at the time. After publication of their invention, BBSes became quite popular and the number of them grew to many thousands -- later there was even a magazine solely focused on BBSes (BBS Magazine). BBSes would later group into larger networks that allowed e.g. interchange of mail. The biggest such network was FidoNet which at its peak hosted about 35000 nodes.

{ Found some list of BBSes at http://www.synchro.net/sbbslist.html. ~drummyfish }

See Also


beauty

Beauty

Beauty is the quality of being extremely appealing and pleasing. Though the word will likely invoke associations with traditional art, in technology, engineering, mathematics and other sciences beauty is, despite its relative vagueness and subjectivity, an important aspect of design, and in fact this "mathematical beauty" often takes quite clearly defined forms -- for example simplicity is mostly considered beautiful. Beauty is similar to and many times synonymous with elegance.

Beauty can perhaps be seen as a heuristic, a touch of intuition that guides the expert in exploration of previously unknown fields, as we have come to learn that the greatest discoveries tend to be very beautiful (however there is also an opposite side: some people, such as Sabine Hossenfelder, criticize e.g. the pursuit of beautiful theories in modern physics as this approach seems to have led to stagnation). Indeed, beginners and noobs are mostly concerned with learning hard facts, learning standards and getting familiar with already known ways of solving known problems, they often aren't able to recognize what's beautiful and what's ugly. But as one gets more and more experienced and finds himself near the borders of current knowledge, there is suddenly no guidance but intuition, beauty, to suggest ways forward, and here one starts to get the feel for beauty. At this point the field, even if highly exact and rigorous, has become an art.

What is beautiful then? As stated, there is a lot of subjectivity, but generally the following attributes are correlated with beauty:

Examples of beautiful things include:


bilinear

Bilinear Interpolation

Bilinear interpolation (also bilinear filtering) is a simple way of creating a smooth transition (interpolation) between discrete samples (values) in 2D, it is a generalization of linear interpolation to 2 dimensions. It is used in many places, popularly e.g. in 3D computer graphics for texture filtering; bilinear interpolation allows upscaling textures to higher resolutions (i.e. computing new pixels between existing pixels) while keeping their look smooth and "non-blocky" (even though blurry). On the scale of quality vs simplicity it is kind of a middle way between a simpler nearest neighbour interpolation (which creates the "blocky" look) and more complex bicubic interpolation (which uses yet smoother curves but also requires more samples). Bilinear interpolation can further be generalized to trilinear interpolation (in computer graphics trilinear interpolation is used to also additionally interpolate between different levels of a texture's mipmap) and perhaps even bilinear extrapolation. Many frameworks/libraries/engines have bilinear filtering built-in (e.g. GL_LINEAR in OpenGL).

####OOOOVVVVaaaaxxxxssssffffllllcccc////!!!!;;;;,,,,....----    
####OOOOVVVVaaaaxxxxxssssffffllllcccc////!!!!;;;;,,,,.....----  
####OOOOVVVVaaaaaxxxxssssfffflllllcccc////!!!!!;;;;,,,,....-----
###OOOOOVVVVaaaaaxxxxsssssfffflllllcccc////!!!!!;;;;,,,,,....---
###OOOOVVVVVaaaaaxxxxsssssfffffllllccccc/////!!!!!;;;;,,,,,.....
##OOOOOVVVVVaaaaaxxxxxsssssffffflllllcccc/////!!!!!;;;;;,,,,,...
##OOOOOVVVVVaaaaaxxxxxsssssfffffflllllccccc/////!!!!!;;;;;,,,,,.
#OOOOOOVVVVVaaaaaxxxxxxsssssfffffflllllccccc//////!!!!!;;;;;;,,,
OOOOOOVVVVVVaaaaaaxxxxxssssssfffffflllllcccccc//////!!!!!;;;;;;,
OOOOOOVVVVVVaaaaaaxxxxxxssssssffffffllllllcccccc//////!!!!!!;;;;
OOOOOVVVVVVVaaaaaaxxxxxxsssssssfffffflllllllcccccc///////!!!!!!;
OOOOOVVVVVVaaaaaaaxxxxxxxsssssssffffffflllllllccccccc//////!!!!!
OOOOVVVVVVVaaaaaaaaxxxxxxxsssssssfffffffflllllllccccccc////////!
OOOVVVVVVVVaaaaaaaaxxxxxxxxssssssssffffffffllllllllcccccccc/////
OOVVVVVVVVVaaaaaaaaaxxxxxxxxsssssssssfffffffflllllllllcccccccc//
OVVVVVVVVVVaaaaaaaaaxxxxxxxxxssssssssssffffffffflllllllllccccccc
VVVVVVVVVVaaaaaaaaaaaxxxxxxxxxxssssssssssfffffffffffllllllllllcc
VVVVVVVVVVaaaaaaaaaaaxxxxxxxxxxxxsssssssssssffffffffffffllllllll
VVVVVVVVVVaaaaaaaaaaaaxxxxxxxxxxxxxsssssssssssssffffffffffffflll
VVVVVVVVVaaaaaaaaaaaaaaaxxxxxxxxxxxxxxsssssssssssssssfffffffffff
VVVVVVVVaaaaaaaaaaaaaaaaaxxxxxxxxxxxxxxxxxxsssssssssssssssssffff
VVVVVVVaaaaaaaaaaaaaaaaaaaaaxxxxxxxxxxxxxxxxxxxxssssssssssssssss
VVVVVVaaaaaaaaaaaaaaaaaaaaaaaaaxxxxxxxxxxxxxxxxxxxxxxxxxxsssssss
VVVaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaxxxxxxxxxxxxxxxxxxxxxxxxxxx
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaxxxxxxxxxxxxxxx
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaVVVVVVVVVVVVVVVVVVV
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVV
aaaaaaaaaaaaaaaaaaaaaaaaVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVOOOOOO
aaaaaaaaaaaaaaaaaaaaaVVVVVVVVVVVVVVVVVVVVVVVVVVOOOOOOOOOOOOOOOOO
aaaaaaaaaaaaaaaaaaaaVVVVVVVVVVVVVVVVVVVVOOOOOOOOOOOOOOOOOOOOO###

The above image is constructed by applying bilinear interpolation to the four corner values.

The principle is simple: first linearly interpolate in one direction (e.g. horizontal), then in the other (vertical). Mathematically the order in which we take the dimensions doesn't matter (but it may matter practically due to rounding errors etc.).

Example: let's say we want to compute the value x between the four following given corner values:

1 . . . . . . 5
. . . . . . . .
. . . . . . . . 
. . . . . . . .
. . . . . . . .
. . . . x . . .
. . . . . . . .
8 . . . . . . 3

Let's say we first interpolate horizontally: we'll compute one value, a, on the top (between 1 and 5) and one value, b, at the bottom (between 8 and 3). When computing a we interpolate between 1 and 5 by the horizontal position of x (4/7), so we get a = 1 + 4/7 * (5 - 1) = 23/7. Similarly b = 8 + 4/7 * (3 - 8) = 36/7. Now we interpolate between a and b vertically (by the vertical position of x, 5/7) to get the final value x = 23/7 + 5/7 * (36/7 - 23/7) = 226/49 ~= 4.6. If we first interpolate vertically and then horizontally, we'd get the same result (the value between 1 and 8 would be 6, the value between 5 and 3 would be 25/7 and the final value 226/49 again).

Here is C code to compute all the in-between values in the above grid, using fixed point (no float):

#include <stdio.h>

#define GRID_RESOLUTION 8

int interpolateLinear(int a, int b, int t)
{
  return a + (t * (b - a)) / (GRID_RESOLUTION - 1);
}

int interpolateBilinear(int topLeft, int topRight, int bottomLeft, int bottomRight,
  int x, int y)
{
#define FPP 16 // we'll use fixed point to prevent rounding errors
    
#if 1 // switch between the two versions, should give same results:
  // horizontal first, then vertical
  int a = interpolateLinear(topLeft * FPP,topRight * FPP,x);
  int b = interpolateLinear(bottomLeft * FPP,bottomRight * FPP,x);
  return interpolateLinear(a,b,y) / FPP;
#else
  // vertical first, then horizontal
  int a = interpolateLinear(topLeft * FPP,bottomLeft * FPP,y);
  int b = interpolateLinear(topRight * FPP,bottomRight * FPP,y);
  return interpolateLinear(a,b,x) / FPP;
#endif
}

int main(void)
{
  for (int y = 0; y < GRID_RESOLUTION; ++y)
  {
    for (int x = 0; x < GRID_RESOLUTION; ++x)
      printf("%d ",interpolateBilinear(1,5,8,3,x,y));

    putchar('\n');
  }
    
  return 0;
}

The program outputs:

1 1 2 2 3 3 4 5 
2 2 2 3 3 4 4 5 
3 3 3 3 4 4 4 5 
4 4 4 4 4 4 4 5 
5 5 5 5 5 5 5 4 
6 6 6 6 5 5 5 4 
7 7 7 6 6 5 5 4 
8 8 7 6 6 5 4 3


billboard

Billboard

In 3D computer graphics billboard is a flat image placed in the scene that rotates so that it's always facing the camera. Billboards used to be greatly utilized instead of actual 3D models in old games thanks to being faster to render (and possibly also easier to create than full 3D models), but we can still encounter them even today and even outside retro games, e.g. particle systems are normally rendered with billboards (each particle is one billboard). Billboards are also commonly called sprites, even though that's not exactly accurate.

There are two main types of billboards: ones that rotate completely freely so as to always fully face the camera, and ones that only rotate about the vertical axis (so that e.g. viewing them from high above still reveals them to be flat images).

Some billboards also choose their image based on from what angle they're viewed (e.g. an enemy in a game viewed from the front will use a different image than when viewed from the side, as seen e.g. in Doom). Also some billboards intentionally don't scale and keep the same size on the screen, for example health bars in some games.

In older software billboards were implemented simply as image blitting, i.e. the billboard's scaled image would literally be copied to the screen at the appropriate position (this would implement the freely rotating billboard). Nowadays when rendering 3D models is no longer really considered harmful to performance and drawing pixels directly is less convenient, billboards are more and more implemented as so called textured quads, i.e. they are really a flat square 3D model that may pass the same pipeline as other 3D models (even though in some frameworks they may actually have different vertex shaders etc.) and that's simply rotated to face the camera in each frame (in modern frameworks there are specific functions for this).

Fun fact: in the old games such as Doom the billboard images were made from photographs of actual physical models from clay. It was easier and better looking than using the primitive 3D software that existed back then.

Implementation Details

The following are some possibly useful things for implementing billboards.

The billboard's position on the screen can be computed by projecting its center point in world coordinates with modelview and projection matrices, just as we project vertices of 3D models.

The billboard's size on the screen shall, due to perspective, be multiplied by 1 / (tan(FOV / 2) * z), where FOV is the camera's field of view and z is the billboard's distance from the camera's projection plane (which is NOT equal to the mere distance from the camera's position, that would create a fisheye lens effect -- the distance from the projection plane can be obtained from the above mentioned projection matrix). (If the camera's FOV differs in the horizontal and vertical directions, the billboard's size will also change differently in each direction.)
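
In code this may look as follows (a trivial sketch with our own hypothetical helper name; FOV is assumed to be in radians):

#include <math.h>

// compute billboard's on-screen size from its world size, camera FOV
// and distance z from the camera's projection plane
float billboardScreenSize(float worldSize, float fov, float z)
{
  return worldSize / (tanf(fov / 2.0f) * z);
}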

For billboards whose image depends on the viewing angle we naturally need to compute the angle. We may do this either in 2D or 3D -- most games resort to the simpler 2D case (only considering the viewing angle in a single plane parallel to the floor), in which case we may simply use the combination of dot product and cross product between the normalized billboard's direction vector and a normalized vector pointing from the billboard's position towards the camera's position (dot product gives the cosine of the angle, the sign of the cross product's vertical component gives the rest of the information needed for determining the exact angle). Once we have the angle, we quantize (divide) it, i.e. drop its precision depending on how many directional images we have, and then e.g. with a switch statement pick the correct image to display -- a sketch of this follows below. For the 3D case (possibly different images from different 3D positions) we may first transform the sprite's 3D facing vector to camera space with the appropriate matrix, just like we transform 3D models, then this transformed vector will (again after quantization) directly determine the image we should use.
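
The 2D case may be sketched e.g. like this (a hypothetical helper function of ours, assuming both input vectors are normalized and 8 directional images):

#include <math.h>

#define IMAGES 8 // number of directional images

// dirX/dirY: billboard's facing direction, toCamX/toCamY: direction to camera
int directionImage(float dirX, float dirY, float toCamX, float toCamY)
{
  float dot = dirX * toCamX + dirY * toCamY;   // cosine of the angle
  float cross = dirX * toCamY - dirY * toCamX; // sign distinguishes left/right

  if (dot > 1.0f) dot = 1.0f;   // guard against rounding errors
  if (dot < -1.0f) dot = -1.0f;

  float angle = acosf(dot);     // in <0,pi>

  if (cross < 0.0f)
    angle = 2.0f * 3.14159265f - angle; // expand to <0,2 pi>

  // quantize the angle and pick the image:
  return ((int) ((angle * IMAGES) / (2.0f * 3.14159265f) + 0.5f)) % IMAGES;
}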

When implementing the freely rotating billboard as a 3D quad that's aligning itself with the camera projection plane, we can construct the model matrix for the rotation from the camera's normalized directional vectors: R is the camera's right vector, U is its up vector and F is its forward vector. The matrix simply transforms the quad's vertices to the coordinate system with bases R, U and F, i.e. rotates the quad in the same way as the camera. When using row vectors, the matrix is the following:

R.x R.y R.z 0
U.x U.y U.z 0
F.x F.y F.z 0
0   0   0   1


bill_gates

Bill Gate$

"Some people are so poor that all they've got is money."

William "Bill" Gaytes (28.10.1955 -- TODO) is a mass murderer and rapist (i.e. capitalist) who established and led the terrorist organization Micro$oft. He is one of the most rich and evil individuals in history who took over the world by force establishing the malware "operating system" Window$ as the common operating system, nowadays being dangerous especially by hiding behind his "charity organization" (see charitywashing) which has been widely criticized (see e.g. http://techrights.org/wiki/Gates_Foundation_Critique, even such mainstream media as Wikipedia present the criticism) but which nevertheless makes him look as someone doing "public good" in the eyes of the naive brainless NPC masses.

     \__.,.
   -."     "./
   /,/---"""\\
   ; __   __ ;
  (=[0 ]"[ 0]=)
  \| ""._,"" |/
   |  .___,  |
    \_,   ._/
      "---"
  dead or alive

ASCII portrait of Bill Gaytes

He is really dumb, only speaks one language and didn't even finish university. He also has no moral values, but that goes without saying for any rich businessman. He was owned pretty hard in chess by Magnus Carlsen on some shitty TV show.

When Bill was born, his father was just busy counting dollar bills, so he named him Bill. Bill was mentally retarded as a child and as such had to attend a private school. He never really understood programming but with a below average intelligence he had a good shot at succeeding in business. Thanks to his family connections he got to Harvard where he met Steve Ballmer. Later he dropped out of the school due to his low intelligence.

In 1975 he founded Micro$oft, a malware company named after his dick. By a sequence of extremely lucky events combined with a few dick moves by Bill the company then became successful: when around the year 1980 IBM was creating the IBM PC, they came to Bill because they needed an operating system. He lied to them that he had one and sold them a license even though at the time he didn't have any OS (lol). After that he went to a programmer named Tim Paterson and basically stole (bought for some penny) his OS named QDOS and gave it to IBM, while still keeping ownership of the OS (he only sold IBM a license to use it, not exclusive rights for it). He basically fucked everyone for money and got away with it, the American way. For this he is admired by Americans.

When Bill Gates and Steve Jobs saw how enormously rich they got by abusing the whole world, they got horny and had gay sex together, after which Bill legally changed his name to Bill Gaytes. This however gave Jobs ass cancer and he died.


binary

Binary

The word binary in general refers to having two choices; in computer science binary refers to the base 2 numeral system, i.e. a system of writing numbers with only two symbols, usually 1s and 0s. We can write any number in binary just as we can with our everyday decimal system, but binary is more convenient for computers because this system is easy to implement in electronics (a switch can be on or off, i.e. 1 or 0; systems with more digits were tried but proved unsuccessful as they failed miserably in reliability -- see e.g. ternary computers). The word binary is also by extension used for non-textual computer files such as native executable programs or asset files for games.

One binary digit can be used to store exactly 1 bit of information. So the number of places we have for writing a binary number (e.g. in computer memory) is called a number of bits or bit width. A bit width N allows for storing 2^N values (e.g. with 2 bits we can store 4 values: 0, 1, 2 and 3, in binary 00, 01, 10 and 11).

At the basic level binary works just like the decimal (base 10) system we're used to. While the decimal system uses powers of 10, binary uses powers of 2. Here is a table showing a few numbers in decimal and binary:

decimal binary
0 0
1 1
2 10
3 11
4 100
5 101
6 110
7 111
8 1000
... ...

Conversion to decimal: let's see an example that utilizes the facts mentioned above. Let's have a number that's written as 10135 in decimal. The first digit from the right (5) says the number of 10^(0)s (1s) in the number, the second digit (3) says the number of 10^(1)s (10s), the third digit (1) says the number of 10^(2)s (100s) etc. Similarly if we now have a number 100101 in binary, the first digit from the right (1) says the number of 2^(0)s (1s), the second digit (0) says the number of 2^(1)s (2s), the third digit (1) says the number of 2^(2)s (4s) etc. Therefore this binary number can be converted to decimal by simply computing 1 * 2^0 + 0 * 2^1 + 1 * 2^2 + 0 * 2^3 + 0 * 2^4 + 1 * 2^5 = 1 + 4 + 32 = 37.

100101 = 1 + 4 + 32 = 37
||||||
\\\\\\__number of 2^0s (= 1s)
 \\\\\__number of 2^1s (= 2s)
  \\\\__number of 2^2s (= 4s)
   \\\__number of 2^3s (= 8s)
    \\__number of 2^4s (= 16s)
     \__number of 2^5s (= 32s)

To convert from decimal to binary we can use a simple algorithm that's again derived from the above. Let's say we have a number X we want to write in binary. We will write digits from right to left. The first (rightmost) digit is the remainder after integer division of X by 2. Then we divide the number by 2. The second digit is again the remainder after division by 2. Then we divide the number by 2 again. This continues until the number is 0. For example let's convert the number 22 to binary: first digit = 22 % 2 = 0; 22 / 2 = 11, second digit = 11 % 2 = 1; 11 / 2 = 5; third digit = 5 % 2 = 1; 5 / 2 = 2; 2 % 2 = 0; 2 / 2 = 1; 1 % 2 = 1; 1 / 2 = 0. The result is 10110.
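
In C this algorithm may be sketched e.g. as follows (just a small illustrative program of ours):

#include <stdio.h>

void printBinary(unsigned int x)
{
  char digits[33]; // enough for 32 bits
  int pos = 0;

  do
  {
    digits[pos] = '0' + x % 2; // remainder after division by 2
    x /= 2;
    pos++;
  } while (x != 0);

  while (pos > 0) // digits were produced right to left, so print in reverse
  {
    pos--;
    putchar(digits[pos]);
  }
}

int main(void)
{
  printBinary(22); // prints 10110, as in the example above
  putchar('\n');

  return 0;
}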

TODO: operations in binary

In binary it is very simple and fast to divide and multiply by powers of 2 (1, 2, 4, 8, 16, ...), just as it is simple to divide and multiply by powers of 10 (1, 10, 100, 1000, ...) in decimal (we just shift the radix point, e.g. the binary number 1011 multiplied by 4 is 101100, we just added two zeros at the end). This is why as a programmer you should prefer working with powers of two (your programs can be faster if the computer can perform basic operations faster).

Binary can be very easily converted to and from hexadecimal and octal because 1 hexadecimal (octal) digit always maps to exactly 4 (3) binary digits. E.g. the hexadecimal number F0 is 11110000 in binary (1111 is always equivalent to F, 0000 is always equivalent to 0). This doesn't hold for the decimal base, hence programmers often tend to avoid base 10.

We can work with the binary representation the same way as with decimal, i.e. we can e.g. write negative numbers such as -110101 or rational numbers (or even real numbers) such as 1011.001101. However in a computer memory there are no other symbols than 1 and 0, so we can't use extra symbols such as - or . to represent such values. So if we want to represent more numbers than non-negative integers, we literally have to only use 1s and 0s and choose a specific representation/format/encoding of numbers -- there are several formats for representing e.g. signed (potentially negative) or rational (fractional) numbers, each with pros and cons. The most common number representations are two's complement (for signed integers) and fixed point and floating point (for fractional numbers).
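
For example in the common two's complement representation the 8 bit value of -5 is obtained by taking 5 (00000101), flipping all the bits (11111010) and adding 1, giving 11111011.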

As anything can be represented with numbers, binary can be used to store any kind of information such as text, images, sounds and videos. See data structures and file formats.

See Also


bit_hack

Bit Hack

Bit hacks or bit tricks are simple clever formulas for performing useful operations with binary numbers. Some operations, such as checking if a number is power of two or reversing bits in a number, can be done very efficiently with these hacks, without using loops, branching and other undesirably slow operations, potentially increasing speed and/or decreasing size and/or memory usage of code -- this can help us optimize. Many of these can be found on the web and there are also books such as Hacker's Delight which document such hacks.

Basics

Basic bit manipulation techniques are common and part of general knowledge so they won't be listed under hacks, but for sake of completeness and beginners reading this we should mention them here. The basic bit manipulation operators in C are & (bitwise AND), | (bitwise OR), ^ (bitwise XOR), ~ (bitwise NOT), << (shift left) and >> (shift right; watch out, shifting a negative signed value right is implementation-defined).
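
For illustration, a tiny test program of ours showing the operators in action (values printed in hexadecimal):

#include <stdio.h>

int main(void)
{
  unsigned char a = 0x0f, b = 0x35;

  printf("%02x\n",a & b);                   // 05: AND (1 where both have 1)
  printf("%02x\n",a | b);                   // 3f: OR (1 where at least one has 1)
  printf("%02x\n",a ^ b);                   // 3a: XOR (1 where the bits differ)
  printf("%02x\n",(unsigned char) ~a);      // f0: NOT (flip all bits)
  printf("%02x\n",(unsigned char)(a << 2)); // 3c: shift left by 2
  printf("%02x\n",b >> 2);                  // 0d: shift right by 2

  return 0;
}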

Specific Bit Hacks

{ Work in progress. I'm taking these from various sources such as the Hacker's Delight book or web and rewriting them a bit, always testing. Some of these are my own. ~drummyfish }

Unless noted otherwise we suppose C syntax and semantics and integer data types. Keep in mind all potential dangers, for example it may sometimes be better to write idiomatic code and let the compiler do the optimization that's best for the given platform; also of course readability will worsen etc. Nevertheless as a hacker you should know about these tricks, it's useful for low level code etc.

2^N: 1 << N

absolute value of x (two's complement):

  int t = x >> (sizeof(x) * 8 - 1);
  x = (x + t) ^ t;

average x and y without overflow: (x & y) + ((x ^ y) >> 1) { TODO: works with unsigned, not sure about signed. ~drummyfish }

clear (to 0) Nth bit of x: x & ~(1 << N)

clear (to 0) rightmost 1 bit of x: x & (x - 1)

conditionally add (subtract etc.) x and y based on condition c (c is 0 or 1): x + ((0 - c) & y), this avoids branches AND ALSO multiplication by c, of course you may replace + with other operators.

count 0 bits of x: Count 1 bits and subtract from data type width.

count 1 bits of x (8 bit): We add neighboring bits in parallel, then neighboring groups of 2 bits, then neighboring groups of 4 bits.

  x = (x & 0x55) + ((x >> 1) & 0x55);
  x = (x & 0x33) + ((x >> 2) & 0x33);
  x = (x & 0x0f) + (x >> 4);

count 1 bits of x (32 bit): Analogous to 8 bit version.

  x = (x & 0x55555555) + ((x >> 1) & 0x55555555);
  x = (x & 0x33333333) + ((x >> 2) & 0x33333333);
  x = (x & 0x0f0f0f0f) + ((x >> 4) & 0x0f0f0f0f);
  x = (x & 0x00ff00ff) + ((x >> 8) & 0x00ff00ff);
  x = (x & 0x0000ffff) + (x >> 16);

count leading 0 bits in x (8 bit):

  int r = (x == 0);
  if (x <= 0x0f) { r += 4; x <<= 4; }
  if (x <= 0x3f) { r += 2; x <<= 2; }
  if (x <= 0x7f) { r += 1; }

count leading 0 bits in x (32 bit): Analogous to 8 bit version.

  int r = (x == 0);
  if (x <= 0x0000ffff) { r += 16; x <<= 16; }
  if (x <= 0x00ffffff) { r += 8; x <<= 8; }
  if (x <= 0x0fffffff) { r += 4; x <<= 4; }
  if (x <= 0x3fffffff) { r += 2; x <<= 2; }
  if (x <= 0x7fffffff) { r += 1; }

divide x by 2^N: x >> N

divide x by 3 (unsigned at least 16 bit, x < 256): ((x + 1) * 85) >> 8, we use kind of a fixed point multiplication by reciprocal (1/3), on some platforms this may be faster than using the divide instruction, but not always (also compilers often do this for you). { I checked this particular trick and it gives exact results for any x < 256, however this may generally not be the case for other constants than 3. Still even if not 100% accurate this can be used to approximate division. ~drummyfish }

divide x by 5 (unsigned at least 16 bit, x < 256): ((x + 1) * 51) >> 8, analogous to divide by 3.

get Nth bit of x: (x >> N) & 0x01

is x a power of 2?: x && ((x & (x - 1)) == 0)

is x even?: (x & 0x01) == 0

is x odd?: (x & 0x01)

isolate rightmost 0 bit of x: ~x & (x + 1)

isolate rightmost 1 bit of x: x & (~x + 1) (in two's complement equivalent to x & -x)

log base 2 of x: Count leading 0 bits, subtract from data type width - 1.

maximum of x and y: x ^ ((0 - (x < y)) & (x ^ y))

minimum of x and y: x ^ ((0 - (x > y)) & (x ^ y))

multiply x by 2^N: x << N

multiply by 7 (and other numbers close to 2^N): (x << 3) - x

next higher or equal power of 2 from x (32 bit):

  x--;
  x |= x >> 1;
  x |= x >> 2;
  x |= x >> 4;
  x |= x >> 8;
  x |= x >> 16;
  x = x + 1 + (x == 0);

parity of x (8 bit):

  x ^= x >> 1;
  x ^= x >> 2;
  x = (x ^ (x >> 4)) & 0x01;

reverse bits of x (8 bit): We switch neighboring bits, then switch neighboring groups of 2 bits, then neighboring groups of 4 bits.

  x = ((x >> 1) & 0x55) | ((x & 0x55) << 1);
  x = ((x >> 2) & 0x33) | ((x & 0x33) << 2);
  x = ((x >> 4) & 0x0f) | (x << 4);

reverse bits of x (32 bit): Analogous to the 8 bit version.

  x = ((x >> 1) & 0x55555555) | ((x & 0x55555555) << 1);
  x = ((x >> 2) & 0x33333333) | ((x & 0x33333333) << 2);
  x = ((x >> 4) & 0x0f0f0f0f) | ((x & 0x0f0f0f0f) << 4);
  x = ((x >> 8) & 0x00ff00ff) | ((x & 0x00ff00ff) << 8);
  x = ((x >> 16) & 0x0000ffff) | (x << 16);

rotate x left by N (8 bit): (x << N) | (x >> (8 - N)) (watch out, in C: N < 8, if storing in wider type also do & 0xff)

rotate x right by N (8 bit): analogous to left rotation, (x >> N) | (x << (8 - N))

set (to 1) Nth bit of x: x | (1 << N)

set (to 1) the rightmost 0 bit of x: x | (x + 1)

set or clear Nth bit of x to b: (x & ~(1 << N)) | (b << N)

sign of x (returns 1, 0 or -1): (x > 0) - (x < 0)

swap x and y (without tmp var.): x ^= y; y ^= x; x ^= y; or x -= y; y += x; x = y - x;

toggle Nth bit of x: x ^ (1 << N)

toggle x between A and B: (x ^ A) ^ B

x and y have different signs?: (x > 0) != (y > 0), (x <= 0) != (y <= 0) etc. (differs on 0:0 behavior); in two's complement also (x ^ y) < 0

TODO: the ugly hacks that use conversion to/from float?

See Also


bit

Bit

Bit (for binary digit, symbol b, also shannon) is the lowest commonly used unit of information, equivalent to a choice between two equally likely options (e.g. an answer to the question "Was the coin flip heads?"), in computers used as the smallest unit of memory, with 1 bit being able to hold exactly one value that can be either 1 or 0. From bit a higher memory unit, byte (8 bits), is derived. In quantum computing the equivalent of a bit is qubit, in ternary computers the analogy is trit.

Can there exist a smaller quantity of information than 1 bit? Well, yes, for sure we can get zero information and it certainly also makes sense to speak of fractions of bits; for example one decimal digit carries log2(10) ~= 3.32 bits of information. Entropy is also measured in bits and can get smaller than 1 bit, e.g. for an unfair coin toss; an answer to the question "Will the Sun rise tomorrow?" gives less than 1 bit of information -- in fact it gives almost no information as we know the answer will most definitely be yes, though the certainty can never be absolute. Another idea: imagine there exist two people for whom we want to know, based on their sexes, whether they may reproduce together -- answer to this question takes 1 bit (yes or no) and to obtain it we have to know both of these people's sexes so we can say whether they differ. Now if we only know the sex of one of them, then in the context of the desired answer we might perhaps say we have a half of one bit of information, as if we also know the other one's sex (the other half of the bit), we get the whole 1 bit answer.
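
For example, using the standard Shannon entropy formula H = -sum(p_i * log2(p_i)): a biased coin that lands heads with probability 0.9 carries H = -(0.9 * log2(0.9) + 0.1 * log2(0.1)) ~= 0.47 bits per toss, indeed less than the 1 bit of a fair coin.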


black

Black

Black, a color whose politically correct name is afroamerican, is a color that we see in absence of any light.


blender

Blender

Blender is an "open-source" 3D modeling and rendering software -- one of the most powerful and "feature-rich" (read bloated) ones, even compared to proprietary competition -- used not only by the FOSS community, but also the industry (commercial games, movies etc.), which is an impressive achievement in itself, however Blender is also a capitalist software suffering from many not-so-nice features such as bloat.

After version 2.76 Blender started REQUIRING OpenGL 2.1 due to its "modern" EEVEE renderer, deprecating old machines and giving a huge fuck you to all users with incompatible hardware (for example the users of RYF software). This new version also stopped working with the free Nouveau driver, forcing the users to use NVidia's proprietary drivers. Blender of course doesn't at all care about this. { I've been forced to use the extremely low FPS software GL version of Blender after 2.8. ~drummyfish }

See Also


bloat

Bloat

Bloat is a very wide term that in the context of software and technology means overcomplication, unnecessary complexity and/or extreme growth in terms of source code size, overall complexity, number of dependencies, redundancy, unnecessary and/or useless features (e.g. feature creep) and resource usage, all of which lead to inefficient, badly designed technology with bugs (e.g. security vulnerabilities or crashes), as well as great obscurity, ugliness, loss of freedom and waste of human effort. Simply put bloat is burdening bullshit. Bloat is extremely bad and one of the greatest technological issues of today. Creating bloat is bad engineering at its worst and unfortunately it is what's absolutely taking over all technology nowadays, mostly due to capitalism causing commercialization, consumerism and incompetent people trying to take on jobs they are in no way qualified to do.

LRS, suckless and some other rather small groups are trying to address the issue and write software that is good, minimal, safe, efficient and well functioning. Nevertheless our numbers are very small and in this endeavor we are basically standing against the whole world and the most powerful tech corporations.

Some have attempted to measure bloat, e.g. the famous web bloat score (https://www.webbloatscore.com/) measures bloat of websites as its total size divided by the page screenshot size (e.g. YouTube at 18.5 vs suckless.org at 0.386). It has been observed that software gets slower faster than hardware gets faster, which is now known as Wirth's law; this follows from Moore's law (speed of hardware doubles every 24 months) being weaker than Gate's law (speed of software halves every 18 months); or in other words: the stupidity of soydevs outpaces the brilliancy of geniuses.

The issue of bloat may of course appear outside of the strict boundaries of computer technology, nowadays we may already observe e.g. science bloat -- science is becoming so overcomplicated (many times on purpose, e.g. by means of bullshit science) that 99% of people can NOT understand it, they have to BELIEVE "scientific authorities", which does not at all differ from the dangerous blind religious behavior. Any time a new paper comes out, chances are that not even SCIENTISTS from the same field but with a different specialization will understand it in depth and have to simply trust its results. This combined with a self-interest obsessed society gives rise to soyence and large scale brainwashing and spread of "science approved" propaganda.

Back to technology though, one of the most frequent questions you may hear a noob ask is "How can bloat limit software freedom if such software has a free license?" Bloat de-facto limits some of the four essential freedoms (to use, study, modify and share) required for software to be free. A free license grants these freedoms legally, but if some of those freedoms are subsequently limited by other circumstances, the software becomes effectively less free. It is important to realize that complexity itself goes against freedom because a more complex system will inevitably reduce the number of people being able to execute freedoms such as modifying the software (the number of programmers being able to understand and modify a trivial program is much greater than the number of programmers being able to understand and modify a highly complex million LOC program). As the number of people being able to execute the basic freedom drops, we're approaching the scenario in which the software is de-facto controlled by a small number of people who can (e.g. due to the cost) effectively study, modify and maintain the program -- and a program that is controlled by a small group of people (e.g. a corporation) is by definition proprietary. If there is a web browser that has a free license but you, a lone programmer, can't afford to study it, modify it significantly and maintain it, and your friends aren't able to do that either, when the only one who can practically do this is the developer of the browser himself and perhaps a few other rich corporations that can pay dozens of full time programmers, then such a browser cannot be considered free as it won't be shaped to benefit you, the user, but rather the developer, a corporation.

How much bloat can we tolerate? We are basically trying to get the most for the least price. The following diagram attempts to give an answer:

        external
       "richness"
           A   
   shiny   |    :                :
  bullshit | NO :      YES       : NO
           |    :                :                         ____. . .
   luxury  |    :                :             ___________/
           |    :                :    ________/
           |    :              __:___/
    very   |    :         ____/  :
   useful  |    :     ___/       :
           |    :  __/           :
           |    :_/              :
   useful  |   _:                :
           |  | :                :
           | /  :                :
    does   ||   :                :
   nothing +---------------------------------------------------> internal complexity
           trivial simple   solo       big      huge     gigantic
                         manageable

Typical Bloat

The following is a list of software usually considered a good, typical example of bloat. However keep in mind that bloat is a relative term, for example vim can be seen as a minimalist suckless editor when compared to mainstream software (IDEs), but at the same time it's pretty bloated when compared to strictly suckless programs.

Small Bloat

Besides the typical big programs that even normies admit are bloated there exists also a smaller bloat which many people don't see as such but which is nevertheless considered unnecessarily complex by some experts and/or idealists and/or hardcore minimalists, including us.

Small bloat is a subject of popular jokes such as "OMG he uses a unicode font -- BLOAT!!!". These are good jokes, it's nice to make fun out of one's own idealism. But watch out, this doesn't mean small bloat is merely a joke concept, it plays an important role in designing good technology. When we identify something as small bloat, we don't necessarily have to completely avoid and reject that concept, we may for example just try to make it optional. In context of today's PCs using a Unicode font is not really an issue for performance, memory consumption or anything else, but we should keep in mind it may not be so on much weaker computers or for example post-collapse computers, so we should try to design systems that don't depend on Unicode.

Small bloat includes for example:

Non-Computer Bloat

The concept of bloat can be applied even outside the computing world, e.g. to non-computer technology, art, culture, law etc. Here it becomes kind of synonymous with bullshit, but using the word bloat says we're approaching the issue as computer programmers. Examples include:

See Also


bloat_monopoly

Bloat Monopoly

Bloat monopoly is an exclusive control over or de-facto ownership of software or even a whole area of technology not by legal means but by means of bloat, or generally just abusing bloat in ways that lead to gaining monopolies, e.g. by establishing standards or even legal requirements (such as the EU mandatory content filters) which only the richest may conform to. Even if the given software is FOSS (that is its source code is public and everyone has basic legal rights to it), it can be malicious due to bloat, for example it can still be made practically controlled exclusively by the developer because the developer is the only one with sufficient resources and/or know-how to be able to execute the basic rights such as meaningful modifications of the software, which goes against the very basic principle of free software.

Example: take a look at the web and how Google is gaining control over it by getting the search engine monopoly. It is very clear the web along with web browsers has been becoming bloated to ridiculous levels -- this is not a coincidence, bloat is pushed by corporations such as Google to eliminate possible emerging competition. If practically all websites require JavaScript, CSS, HTTPS and similar nonsense, it is becoming much more difficult to crawl them and create a web index, leaving the possibility to crawl the web mostly to the rich, i.e. those who have enough money, time and know-how to do this. Alongside this there is the web browser bloat -- as websites have become extremely complex, it is also extremely complex to make and maintain a web browser, which is why there are only a few of them, all controlled (despite FOSS licenses) by corporations and malicious groups, one of which is Google itself. For these reasons Google loves bloat and encourages it, e.g. simply by ranking bloated webpages better in their search results, and of course by other means (sponsoring, lobbying, advertising, ...).

Bloat monopoly is capitalism's circumvention of free licenses and taking advantage of their popularity. With bloat monopoly capitalists can stick a FOSS license to their software, get an automatic approval (openwashing) of most "open-source" fanbois as well as their free work time, while really staying in control almost to the same degree as with proprietary software.

Examples of bloat monopoly include mainstream web browsers (furryfox, chromium, ...), Android, Linux, Blender etc. This software is characteristic by its difficulty to be even compiled, let alone understood, maintained and meaningfully modified by a lone average programmer, by its astronomical maintenance cost that is hard to pay for volunteers, and by aggressive update culture.


body_shaming

Body Shaming

Your body sucks.


brainfuck

Brainfuck

Brainfuck is an extremely simple, untyped esoteric programming language; simple by its specification (consisting only of 8 commands) but intentionally very hard to program in. It works similarly to a pure Turing machine. In a way it is kind of beautiful by its simplicity. It is very easy to write your own brainfuck interpreter.

There exist self-hosted brainfuck interpreters which is pretty fucked up.

The language is based on a 1964 language P'' which was published in a mathematical paper; it is very similar to brainfuck except for having no I/O.

Brainfuck has seen tremendous success in the esolang community as the lowest common denominator language: just as mathematicians use Turing machines in proofs, esolang programmers use brainfuck in similar ways -- many esolangs just compile to brainfuck or use brainfuck in proofs of Turing completeness etc. This is thanks to brainfuck being an actual, implemented and working language reflecting real computers, not just a highly abstract mathematical model with many different variants. For example if one wants to encode a program as an integer number, we can simply take the binary representation of the program's brainfuck implementation.

In LRS programs brainfuck may be seriously used as a super simple scripting language.

Brainfuck can be trivially translated to comun like this: remove all comments from brainfuck program, then replace +, -, >, <, ., ,, [ and ] with ++ , -- , $>0 , $<0 , ->' , $<0 <- , @' and . , respectively, and prepend $>0 .

Specification

The "vanilla" brainfuck operates as follows:

We have a linear memory of cells and a data pointer which initially points to the 0th cell. The size and count of the cells is implementation-defined, but usually a cell is 8 bits wide and there are at least 30000 cells.

A program consists of these possible commands (everything else is normally ignored as a comment):

> -- move the data pointer one cell to the right
< -- move the data pointer one cell to the left
+ -- increment the current cell by 1
- -- decrement the current cell by 1
. -- output the current cell's value as a character
, -- read one input character into the current cell
[ -- if the current cell is 0, jump past the matching ]
] -- if the current cell is not 0, jump back right after the matching [

Implementation

This is a very simple C implementation of brainfuck:

#include <stdio.h>

#define CELLS 30000

const char program[] = ",[.-]"; // your program here

int main(void)
{
  char tape[CELLS] = {0}; // cells must start out zeroed
  unsigned int cell = 0;
  const char *i = program;
  int bDir, bCount;
  
  while (*i != 0)
  {
    switch (*i)
    {
      case '>': cell++; break;
      case '<': cell--; break;
      case '+': tape[cell]++; break;
      case '-': tape[cell]--; break;
      case '.': putchar(tape[cell]); fflush(stdout); break;
      case ',': scanf("%c",tape + cell); break;
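      // brackets: first decide whether to jump at all, then scan for the matching bracket: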
      case '[':
      case ']':
        if ((tape[cell] == 0) == (*i == ']'))
          break;

        bDir = (*i == '[') ? 1 : -1;
        bCount = 0;
          
        while (1)
        {
          if (*i == '[')
            bCount += bDir;
          else if (*i == ']')
            bCount -= bDir;
          
          if (bCount == 0)
            break;
          
          i += bDir;
        }
        
        break;
      
      default: break;
    }
    
    i++;
  }
}

TODO: comun implementation

Programs

Here are some simple programs in brainfuck.

Print HI:

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ . + .

Read two 0-9 numbers (as ASCII digits) and add them:

,>,[<+>-]<------------------------------------------------.
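
Copy input to output (a cat program; note that behavior on end of input differs between implementations, here we suppose reading end of input gives 0):

,[.,]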

TODO: more

Variants

TODO


brain_software

Brain Software

Brain software, also brainware, is kind of a fun idea of software that runs on the human brain as opposed to a typical electronic computer. This removes the dependency on computers and highly increases freedom. Of course, this also comes with a huge drop of computational power :) However, aside from being a fun idea to explore, this kind of software and "architectures" may become interesting from the perspective of freedom and primitivism (especially when the technological collapse seems like a real danger).

Primitive tools helping the brain compute, such as pen and paper or printed out mathematical tables, may be allowed.

An example of brain software can be the game of chess. Chess masters can easily play the game without a physical chess board, only in their head, and they can play games with each other by just saying the moves out loud. They may even just play games with themselves, which makes chess a deep, entertaining game that can be 100% contained in one's brain. Such a game can never be taken away from the person, it can't be altered by corporations, it can't become unplayable on new hardware etc., making it free to the greatest extent.

One may think of a pen and paper computer with its own simple instruction set that allows general purpose programming. This instruction set may be designed to be well interpretable by human and it may be accompanied by tables printed out on paper for quick lookup of operation results -- e.g. a 4 bit computer might provide a 16x16 table with precomputed multiplication results which would help the person execute the multiplication instruction within mere seconds.

Yet another idea is to make a computer with architecture similar to the typical electronic computers but powered by human brains -- let's call this a human computer (not to be confused with people whose job was to perform computations!). Imagine that after a societal collapse we lose our computer technology (i.e. the ability to manufacture transistors and similar key components), but we retain our knowledge of computer architecture, algorithms and the usefulness of computers. As a temporary solution for performing computations we may create a "computer made of humans", a room with several men, each one performing the role of some computer component, for example an ALU, cache and memory controller. Again, a special instruction set and a set of tools (such as physical lookup tables for results of instructions) could be made to make such a human computer relatively fast. It might not run Doom, but it could possibly e.g. compute some mathematical constants to a high precision or perhaps help find optimal structures of cities, compute stresses in big buildings etc. In such conditions even a slow calculator could be immensely useful.


bs

BS

Bullshit.


bullshit

Bullshit


bytebeat

Bytebeat

Bytebeat is procedural chiptune/8bit style music generated by a short expression in a programming language; it was discovered/highlighted in 2011 by Viznut (author of countercomplex blog) and others, and the technique, capable of producing quite impressive music with single-line code, has since caught the attention of many programmers, especially in the demoscene. There has even been a paper written about bytebeat. Bytebeat can produce music similar (though a lot simpler) to that created e.g. with music trackers but with a lot less complexity and effort.

This is a beautiful hack for LRS/suckless programmers because it takes quite a tiny amount of code, space and effort to produce nice music, e.g. for games (done e.g. by Anarch).

8bit samples corresponding to unsigned char are typically used with bytebeat. The formulas take advantage of overflows that create rhythmical patterns, with other operations such as multiplication, division, addition, squaring, bitwise/logical operators and conditions adding more interesting effects.

Bytebeat also looks kind of cool when rendered as an image (outputting pixels instead of audio samples).

How To

Quick experiments with bytebeat can be performed with online tools that are easy to find on the web, these usually use JavaScript.

Nevertheless, traditionally we use C for bytebeat. We simply create a loop with a time variable (i) and inside the loop body we create our bytebeat expression with the variable to compute a char that we output.

A simple "workflow" for bytebeat "development" can be set up as follows. Firstly write a C program:

#include <stdio.h>

int main(void)
{
  for (int i = 0; i < 10000; ++i)
    putchar(
      i / 3 // < bytebeat formula here
    );

  return 0;
}

Now compile the program and play its output e.g. like this:

gcc program.c && ./a.out | aplay

Now we can just start experimenting and invent new music by fiddling with the formula indicated by the comment.

General tips/tricks and observations are these:

Copyright

It is not exactly clear whether, how and to what extent copyright can apply to bytebeat: on one hand we have a short formula that's uncopyrightable (just like mathematical formulas), on the other hand we have music, an artistic expression. Many authors of bytebeat "release" their creations under free licenses such as CC-BY-SA, but such licenses are of course not applicable if copyright can't even arise.

We believe copyright doesn't and SHOULDN'T apply to bytebeat. To ensure this, it is good to stick CC0 to any released bytebeat just in case.

Examples

A super-simple example can be just a plain i -- thanks to the 8bit overflow this by itself already produces a sawtooth wave (a simple buzzing tone).

The following more complex examples come from the LRS game Anarch (these are legally safe even in case copyright can apply to bytebeat as Anarch is released under CC0):

See Also


bytecode

Bytecode

TODO

Example

Let's consider a simple algorithm that tests the Collatz conjecture (which says that applying a simple operation from any starting number over and over will always lead to number 1). The algorithm in C would look as follows:

// Collatz conjecture
#include <stdio.h>

int next(int n)
{
  return n % 2 ? // is odd?
    3 * n + 1 :
    n / 2;
}

int main(void)
{
  int n = getchar() - '0'; // read input ASCII digit

  while (1)
  {
    printf("%d\n",n);

    if (n == 1)
      break;

    n = next(n);
  }

  return 0;
}

The program reads a number (one digit for simplicity) and then prints the sequence until reaching the final number 1; e.g. for the input digit 6 it prints 6, 3, 10, 5, 16, 8, 4, 2 and 1. Now let's rewrite the same algorithm in comun, a language which will allow us to produce bytecode:

# Collatz conjecture

next:
  $0 2 % ? # is odd?
    3 * 1 +
  ;
    2 /
  .
.

<-     # read input ASCII digit
"0" -  # convert it to number

@@
  # print:
  $0 10 / "0" + ->
  $0 10 % "0" + ->
  10 ->

  $0 1 = ?
    !@
  .
 
  next
.

The bytecode this compiles to is the following:

000000: DES  00 0111    # func
000001: JMA  00 0100... # 20 (#14)
000002: COC  00 0001
000003: MGE  00 0000
000004: CON' 00 0010    # 2 (#2)
000005: MOX  00 0000
000006: DES  00 0001    # if
000007: JNA  00 0000... # 16 (#10)
000008: COC  00 0001
000009: CON' 00 0011    # 3 (#3)
00000a: MUX  00 0000
00000b: CON' 00 0001    # 1 (#1)
00000c: ADX  00 0000
00000d: DES  00 0010    # else
00000e: JMA  00 0011... # 19 (#13)
00000f: COC  00 0001
000010: CON' 00 0010    # 2 (#2)
000011: DIX  00 0000
000012: DES  00 0011    # end if
000013: RET  00 0000
000014: INI  00 0000
000015: INP  00 0000
000016: CON' 00 0000... # 48 (#30)
000017: COC  00 0011
000018: SUX  00 0000
000019: DES  00 0100    # loop
00001a: MGE  00 0000
00001b: CON' 00 1010    # 10 (#a)
00001c: DIX  00 0000
00001d: CON' 00 0000... # 48 (#30)
00001e: COC  00 0011
00001f: ADX  00 0000
000020: OUT  00 0000
000021: MGE  00 0000
000022: CON' 00 1010    # 10 (#a)
000023: MOX  00 0000
000024: CON' 00 0000... # 48 (#30)
000025: COC  00 0011
000026: ADX  00 0000
000027: OUT  00 0000
000028: CON' 00 1010    # 10 (#a)
000029: OUT  00 0000
00002a: MGE  00 0000
00002b: CON' 00 0001    # 1 (#1)
00002c: EQX  00 0000
00002d: DES  00 0001    # if
00002e: JNA  00 0100... # 52 (#34)
00002f: COC  00 0011
000030: DES  00 0101    # break
000031: JMA  00 1000... # 56 (#38)
000032: COC  00 0011
000033: DES  00 0011    # end if
000034: CAL  00 0011    # 3 (#3)
000035: DES  00 0110    # end loop
000036: JMA  00 1010... # 26 (#1a)
000037: COC  00 0001
000038: END  00 0000

TODO: analyze the above, show other bytecodes (python, java, ...)


byte

Byte

Byte (symbol: B) is a basic unit of information, nowadays practically always consisting of 8 bits (in which case it is also called an octet), which allows it to store 2^8 = 256 distinct values (for example a number in range 0 to 255). It is usually the smallest unit of memory a CPU is able to operate on, and memory addresses are assigned in one byte steps. We use bytes to measure the size of memory and derive higher memory units from it, such as a kilobyte (kB, 1000 bytes), kibibyte (KiB, 1024 bytes), megabyte (MB, 10^6 bytes) etc. In programming a one byte variable is nowadays seen as very small and is used if we are really limited by memory constraints (e.g. embedded) or to mimic older 8bit computers ("retro games" etc.): one byte can be used to store very small numbers (while in mainstream processors numbers nowadays mostly have 4 or 8 bytes), text characters (ASCII, ...), very primitive colors (see RGB332, palettes, ...) etc.

Historically byte stood for the basic addressable unit of memory that could store one text character or another "basic value" and could therefore have a size different from 8 bits: e.g. ASCII machines might have had a 7bit byte, 16bit machines a 16bit byte etc.; in C (the C99 standard) char is the "byte" data type, its byte size is always 1 (sizeof(char) == 1), though its number of bits (CHAR_BIT) can be greater or equal to 8; if you need an exact 8bit byte use types such as int8_t and uint8_t from the standard stdint.h header. From now on we will implicitly talk about 8bit bytes.

Value of one byte can be written exactly with two hexadecimal digits with each digit always corresponding to higher/lower 4 bits, making mental conversions very easy; this is very convenient compared to decimal representation, so programmers prefer to write byte values in hexadecimal. For example a byte whose binary value is 11010010 is D2 in hexadecimal (1101 is always D and 0010 is always 2), while in decimal we get 210.
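
The following small C demo shows these conversions on the same example value:

#include <stdio.h>

int main(void)
{
  unsigned char b = 0xd2; // binary 11010010, decimal 210

  printf("%x\n",b);                  // prints d2
  printf("%d\n",b);                  // prints 210
  printf("%x %x\n",b >> 4,b & 0x0f); // prints d 2 (higher and lower 4 bits)

  return 0;
}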

Byte frequency/probability: it may be interesting and/or useful (e.g. for compression) to know how often different byte values appear in the data we process with computers -- indeed, this always DEPENDS; if we are working with plain ASCII text, we will never encounter values above 127, and on the other hand if we are processing photos from a polar expedition, we will likely mostly encounter byte values of 255 (as snow will cause most pixels to be completely white). In general we may expect values such as 0, 255, 1 and 2 to be most frequent, as many times these are e.g. assigned special meanings in data encodings, they may be cutoff values etc. Here is a table of measured byte frequencies in real data:

{ Measured by me. ~drummyfish }

type of data               least c.   2nd least c.  3rd least c.  3rd most c.   2nd most c.      most c.
GNU/Linux x86 executable   0x9e (0%)  0xb2 (0%)     0x9a (0%)     0x48 (2%)     0xff (3%)        0x00 (32%)
bare metal ARM executable  0xcf (0%)  0xb7 (0%)     0xa7 (0%)     0xff (2%)     0x01 (3%)        0x00 (15%)
UTF8 English txt book      0x00 (0%)  0x01 (0%)     0x02 (0%)     0x74 (t, 6%)  0x65 (e, 8%)     0x20 (' ', 14%)
C source code              0x00 (0%)  0x01 (0%)     0x02 (0%)     0x31 (1, 6%)  0x20 (' ', 12%)  0x2c (',', 16%)
raw 24bit RGB photo image  0x07 (0%)  0x09 (0%)     0x08 (0%)     0xdd (0%)     0x00 (1%)        0xff (25%)
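
A table like the above can be measured e.g. with the following simple C program (a quick sketch; it reads the data from standard input and prints the percentage of each byte value):

#include <stdio.h>

int main(void)
{
  unsigned long count[256] = {0}, total = 0;
  int b;

  while ((b = getchar()) != EOF) // count the occurrences of each value
  {
    count[b]++;
    total++;
  }

  for (int i = 0; i < 256; ++i) // print the frequency of each value
    printf("0x%02x: %lu (%d%%)\n",i,count[i],
      total ? (int) ((count[i] * 100) / total) : 0);

  return 0;
}
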

cancer

Cancer

Cancer is similar to shit but is even worse because it spreads itself and infects anything else it touches (it is a subset of shit).

See Also


capitalism

$$$Capitalism$$$

Capitalism is how you enslave people with their approval.

Capitali$m is the worst socioeconomic system we've yet seen in history,^source based on pure greed, a culture of slavery and artificially sustained conflict between everyone in society (so called competition), abandoning all morals and putting money and profit (so called capital) above everything else including preservation of life itself; capitalism fuels the worst in people and forces them to compete and suffer for basic resources, even in a world where abundance of resources is already possible to achieve -- of course, capitalism is a purely rightist idea. Capitalism goes against progress (see e.g. antivirus paradox), good technology and freedom, it supports immense waste of resources, wars, abuse of people and animals, destruction of environment, decline of morals, deterioration of art, invention of bullshit (bullshit jobs, bullshit laws, ...), utilizing and perfecting methods of torture, brainwashing, censorship and so on. In a sense capitalism can be seen as slavery 2.0 or universal slavery, a more sophisticated form of slavery, one which denies the label by calling itself the polar opposite ("freedom") and manipulates people into approving of and voluntarily partaking in their own enslavement (capitalist slaves are called wage slaves or wagies). However wage and consumption slavery is only a small part of the capitalist dystopia -- capitalism brings destruction to basically every part of civilization. It is also often likened to a cancer of society; one that is ever expanding, destroying everything with commercialism, materialism, waste and destruction, growing uncontrollably with the sole goal of never stopping its ever accelerating growth. Nevertheless, it's been truthfully stated that "it is now easier to imagine the end of all life than any substantial change in capitalism." Another famous quote is that "capitalism is the belief that the worst of men driven by the nastiest motives will somehow work for the benefit of everyone", which describes its principle quite well.

{ Some web bashing capitalism I just found: http://digdeeper.club/articles/capitalismcancer.xhtml, read only briefly, seems to contain some nice gems capturing the rape of people. ~drummyfish }

Capitalism is fundamentally flawed and CANNOT be fixed -- capitalists build on the idea that competition will drive society, that the market will be self sustaining, however capitalism itself works towards instating the rule of the winners who eliminate their competition, capitalism is self destabilizing, i.e. the driving force of capitalism is completely unsustainable and leads to catastrophic results, as those who get ahead in the competition also gain an advantage -- as it's said: money makes money, therefore money flows from the poor to the rich and creates a huge imbalance in which competition has to be highly forced, eventually completely arbitrarily and in very harmful ways (invention of bullshit jobs, creation of artificial needs and hugely complex state control and laws). It's as if we set up a race in which those who get ahead start to also go faster, and those become the ones who oversee and start to create the rules of the race -- expecting a sustained balance in such a race is just insanity. Society tries to "fight" this emerging imbalance with various laws and rules of market, but this effort is like trying to fight math itself -- the system is mathematically destined to be unstable, pretending we can win over the laws of nature themselves is just pure madness.

Capitalism produces the worst imaginable technology and rewards people for being cruel to each other. It points the direction of society towards a collapse and may very likely be the great filter of civilizations; in capitalism people de-facto own nothing and become wholly dependent on corporations which exploit this fact to abuse them as much as possible. This is achieved by slowly boiling the frog. No one owns anything, products become services (your car won't drive without Internet connection and permission from its manufacturer), all independency and decentralization is lost in favor of a highly fragile and interdependent economy and infrastructure of services, each one controlled by the monopoly corporation. Then only a slight break in the chain is enough to bring the whole civilization down in a spectacular domino effect.

The underlying issue of capitalism is competition and conflict -- competition is the root of all evil in any social system, and capitalism is the absolute glorification of competition, the amplification of this evil to maximum. It is implemented by setting and supporting the very stupid idea that everyone's primary and only goal is self-benefit, i.e. maximization of capital. This is combined with the fact that the environment of the free market is an evolutionary system which through natural selection extremely effectively and quickly optimizes the organisms (corporations) for achieving this given goal, i.e. generating maximum profit, to the detriment of all other values such as wellbeing of people, sustainability or morality. In other words capitalism has never promised a good society, it literally only states that everyone should try to benefit oneself as much as possible, i.e. it defines the fitness function purely as the ability to seize as many resources as possible, and then selects and rewards those who best implement this function, i.e. those we would call sociopaths or "dicks", and to those is given the power in society. Yes, this is how nature works, but it must NOT be how a technologically advanced civilization with unlimited power of destruction works. In other words we simply get what we set out to achieve: find the entities that are best at making profit at any cost. The inevitable decline of society cannot possibly be prevented by laws, any effort of trying to stop evolution by inventing artificial rules on the go is a battle against nature itself and is extremely naive; the immense power of the evolutionary system that's constantly at work to find ways to bypass or cancel laws standing in the way of profit and abuse of others will prevail, just as life will always find its way to survive and thrive even in the worst conditions on Earth. Trying to stop corporations with laws is like trying to stop a train by throwing sticks in its path. The problem is not that "people are dicks", it is that we choose to put in place a system that rewards the dicks, a system that fuels the worst in people and smothers the best in them.

Even though nowadays quite a lot of time has passed since the times of Marx and capitalism has evolved to a stage with countless disastrous issues Marx couldn't even foresee, it is useful to mention one of the basic and earliest issues identified by Marx, which is that economically capitalism is based on stealing the surplus value, i.e. on abuse of workers and consumers by owners of the means of production (factories, tools, machines etc.) -- a capitalist basically takes money for doing nothing, just for letting workers use tools he proclaims to own (a capitalist will proclaim to "own" land that he never even visited, machines he didn't make as they were developed over centuries, nowadays he even claims to own information and ideas) -- as Kropotkin put it: the working man cannot purchase with his wage the wealth he has produced. This allows a capitalist oppressor to make exponentially more money for nothing and enables the existence of monstrously rich and powerful individuals in a world where millions are starving -- consider for example that nowadays there are people who own hundreds of buildings and cars plus a handful of private planes and a few private islands. It is not possible for any single human to perform an equivalent of the effort that's needed to produce what such an individual owns; even if he worked 24 hours a day for his whole life, he wouldn't get even close to matching the kind of effort that's needed to build the hundreds of buildings he owns -- any such great wealth is always stolen from countless workers whose salary is less than what's adequate for their work and also from consumers who pay more than it really costs to manufacture the goods they buy. Millions of people are giving their money (resources) for free to someone who just proclaims to "own" tools and even natural resources that have been there for billions of years. The difference in wealth and the privileges this wealth provides divide society into antagonistic classes that are constantly at war -- traditionally these classes are said to be the bourgeoisie (business owners, the upper class) and the proletariat (workers, the lower class), though under modern capitalism the division of society is not so simple anymore -- there are more classes (for example small businesses work for larger businesses) but they are still all at war.

Nowadays capitalism is NOT JUST an economic system anymore. Technically perhaps, however in reality it takes over society to such a degree that it starts to redefine very basic social and moral values to the point of taking the role of a religion, or better said a brainwashing cult in which people are since childhood taught (e.g. by constant daily exposure to private media) to worship economy, brands, engage in cults of personality (see myths about godlike entrepreneurs) and productivity (i.e. not usefulness, morality, efficiency or similar values, just the pure ability to produce something for its own sake). Close minded people will try to counter argue in shallow ways such as "but religion has to have some supernatural entity called God" etc. Again, technically speaking this may be correct, but if we don't limit our views by arbitrary definitions of words, we see that the effects of capitalism on society are de facto of the same or even greater scale than those of religion, and they are certainly more negative. Capitalism itself works towards suppressing traditional religions (showing it is really competing with them and therefore aspiring for the same role) and their values, trying to replace them with the worship of money, success and self interest; it permeates society to the deepest levels by making every single area of society a subject of business and by acting on the minds of all people in the society every single day, which is an enormously strong pressure that shapes the mentality of people, again mostly negatively: towards a war mentality (constant competition with others), egoism, materialism, fascism, pure pursuit of profit etc.

From a certain point of view capitalism is not really a traditional socioeconomic system, it is the failure to establish one -- capitalism is the failure to prevent the establishment of capitalism, and it is also the punishment for this failure. It is the continuation of the jungle to the age when technology for mass production, mass surveillance etc. has sufficiently advanced -- capitalism will arise with technological progress unless we prevent it, just as cancer will grow unless we treat it in very early stages. This is what people mean when they say that capitalism simply works or that it's natural -- it's the least effort option, one that simply lets people behave like animals, except that these animals are now equipped with weapons of mass destruction, tools for implementing slavery, world wide surveillance etc. It is natural in the same way in which wars, murders, bullying and deadly diseases are. It is the most primitive system imaginable, it is uncontrolled, leads to suffering and self-destruction.

Under capitalism you are not a human being, you are a resource, at best a machine that's useful for some time but becomes obsolete and undesired once it outlives its usefulness and potential to be exploited. Under capitalism you are a slave that's forced to live the life of 3 Cs: conform, consume, compete. Or, as Encyclopedia dramatica puts it: work, buy, consume, die.

Who invented capitalism? Well, it largely developed on its own, society is just responsible for not stopping it. Capitalism as seen today has mostly evolved from the tradition of small trade, slavery, markets, competition, evil, war and abuse of people due to societal hierarchy (e.g. of peasants by noblemen, the poor by the rich etc.), combined with the technological progress of the industrial revolution (18th - 19th century) which allowed mass production and mass abuse of workers, as well as the information revolution (20th - 21st century) which allowed mass surveillance, unlimited corporate control, acceleration of bullshit business and extreme mass brainwashing, reaching capitalist singularity. Adam Smith (18th century), a mentally retarded egoist who tried to normalize and promote self-interest and torture of others for self-benefit, is often called the "father of capitalism" (which is about the same honor as being called the father of holocaust) -- this man is also to be largely credited for the future extermination of all life.

On capitalism and Jews: rightists believe the issues caused by capitalism are really caused by Jews and that somehow getting rid of Jews will fix society -- actually this is not entirely accurate; white rightists want to remove Jews so that they (the white race) can take their place in ruling the society, so they don't actually want to fix or remove capitalism (on the contrary, they love its presence and its mechanisms), they just want to become the masters instead of slaves. It is definitely true Jews are overrepresented in high positions of a capitalist society, but that's just because Jews as a race really developed the best "skills" to succeed in capitalism as they historically bet on the right cards (focus on trade and money, decentralization of business, spread across the world and globalization, ...) and really evolved into the race best suited to win the capitalist game. So while the rightists may be correct in the observation that Jews are winning the game, we of course cannot agree with their supposed "fix" -- we do not want to remove the slave masters and replace them with different ones, we want to get rid of capitalism, the unethical system itself which enables slavery in the first place.

{ There is a famous 1988 movie called They Live which, while being a funny aliens'n'stuff B movie, actually deeply analyzes and criticizes capitalism, and for its accurate predictions of the future we now live in it became a cult classic. It's been famously said that They Live is rather a documentary. I highly recommend giving it a watch. ~drummyfish }

Attributes Of Capitalism

The following is a list of just SOME attributes of capitalism -- note that not all of them are present in initial stages but capitalism will always converge towards them.

How It Works

The "old" capitalism, or perhaps its basic forms, that socialist writers have analyzed very well is characterized mainly by abuse of workers by capitalists who declare to "own" means of production such as factories, land and machines -- as e.g. Kropotkin has written in The Conquest of Bread, it is poverty that drives capitalism because only a poor man who just needs ANY salary for himself and his family will accept horrible working conditions and low pay, simply because he has no other choice -- a capitalist exploits this, "employs" (enslaves) the poor and then only pays them as much as to keep the barely alive and working for him, and he further has the audacity of calling himself an "altruist" who "feeds" people and "gives them a work"; a capitalist employs workers in his factory like he employs chicken in egg factories or pigs in slaughterhouses -- in modern days many may fall to the illusion that workers aren't poor anymore as they may posses smartphones and big screen TVs, but in essence a worker still lives salary to salary and is in desperate need of it; without a salary he will quickly end up starving in the street. Workers do labor that's in itself worth a lot but the capitalist only gives him a small salary, firstly to gain own profit and secondly to keep the worker poor because again, only a poor man will work for him. This is also why capitalists are against anything that would end poverty, such as universal basic income. If the workers owned the factory collectively and didn't have to cut the profit off their labor, they wouldn't have to work so many hours in such harsh conditions at all, it's only because there is a capitalist leech at the top that everyone has to slave himself to death so that the leech can get enormously rich.

Here a capitalist says to the worker: "I am not forcing you to slavery, if you don't like the working conditions, go elsewhere". Of course, this is a laughable insult -- the capitalist knows very well there is nowhere else to go; wherever you go work in capitalism, you get exploited -- you can only do as much as choose your slavemaster. A capitalist will then say: "start your own business then", which again is complete idiocy -- it's extremely hard to succeed in business, not everyone can do it, those who have established businesses won't let anyone on the market, and of course it's the immoral thing to do, the capitalist is just telling you to start doing what he's doing: abuse others. If you do start your business, he will be sure to attack you as competition and with his power he will very likely be able to stop your business. So this advice is similar to that of "go start your own country if you don't like this one" -- he might as well tell you to move to another planet.

The new, modern capitalism is yet worse as it takes full advantage of technology never before seen in history, which allows an extreme increase in the exploitation of both workers and consumers -- now there are cameras and computers watching each worker's individual production, there are "smart" devices spying on people and then forcing ads on them, there are loudspeakers and screens everywhere, full of propaganda and brainwashing, nowhere to escape. Now a single capitalist can watch over his factories all over the world through the Internet, allowing such people to get yet richer than we could ever imagine.

While the old capitalism was more of a steady slavery and the deterioration of society (life environment, morality, art, ...) by it was relatively slow (i.e. it seemed to be somewhat "working"), nowadays, in the new capitalism, the downfall of society accelerates immensely. In countries where capitalism is newly instated, e.g. after the fall of an old regime, it indeed seems to be "working" for a short time, however it will never last -- initially, when more or less everyone is at the same start line, when there are no highly evolved corporations with their advanced methods of oppression, small businesses grow and take their small shares of the market, there appears true innovation, businesses compete by true quality of products, people are relatively free and it all feels natural because it is, it's the system of the jungle, i.e. as has been said, capitalism is the failure to establish a controlled socioeconomic system rather than the presence of a purposefully designed one. Its benefits for the people are at this point only a side effect, people see it as good and continue to support it. However the system has other goals of its own, and that is development and constant growth that's meant to create a higher organism, just as smaller living cells formed us, multicellular organisms. The system will start being less and less beneficial to the people, who will only become cells in a higher organism to which they'll become slaves. A cell isn't supposed to be happy, it is supposed to sacrifice its life for the good of the higher organism.

{ This initial prosperous stage appeared e.g. in Czechoslovakia, where I lived, in the 90s, after the fall of the totalitarian regime. Everything was beautiful, sadly it didn't last longer than about 10 years. ~drummyfish }

Slowly "startups" evolve to medium sized businesses and a few will become the big corporations. These are the first higher entities that have an intelligence of their own, they are composed of humans and technology who together work solely for the corporation's further growth and profit. A corporation has a super human intelligence (combined intelligence of its workers) but has no human emotion or conscience (which is suppressed by the corporation's structure), it is basically the rogue AI we see in sci-fi horror movies. Corporation selects only the worst of humans for the management positions and has further mechanisms to eliminate any effects of human conscience and tendency for ethical behavior; for example it works on the principle of "I'm just doing my job": everyone is just doing a small part of what the whole company is doing so that no one feels responsible for the whole or sometimes doesn't even know what he's part of. If anyone protests, he's replaced with a new hire. Of course, many know they're doing something bad but they have no choice if they want to feed their families, and everyone is doing it.

Deterioration of society is fast now but people are kept in a false sense that "it's just a temporary thing", "it's this individual's fault (not the system's)" and that "it's slowly getting better", mainly with the help of 24/7 almighty media brainwashing. Due to heavy greenwashing, openwashing etc. most people are for example naively convinced that corporations are becoming more "environment friendly", "responsible", "open source" ("Microsoft isn't what it used to be", ...) etc., as if a corporation had something akin to emotion instead of a pure desire for profit, which is its only goal by definition. A corporation will repeat ads telling you it is paying black handicapped gays to plant trees but internally no one gives a shit about anything but making more money, a manager's job is just to increase profit, waste is increasing and dumped into oceans when no one is looking, bullshit is being invented to kickstart more bullshit business which leads to more need for energy wasting (unnecessary transportation, upkeep of factories and workplaces, invention of bullshit technology to solve artificial problems arising from artificial bullshit). A lie repeated 1000 times a day will beat even a truth that's evident to the naked eye, basic logic and common sense. Even when the sky is littered with ads, cities are burning and people are working 20 hours a day, a capitalist will keep saying "this is a good society", "we are just in a temporary crisis", "it is getting better" and "I care about the people", and people will take it as truth.

Corporations make calculated decisions to eliminate any competition, they devour or kill smaller businesses with unfair practices (see e.g. Microsoft's infamous EEE), more marketing and by other means, both legal and illegal. They develop advanced psychological methods and exert extreme pressure, such as brainwashing by ads, on the population to create an immensely powerful propaganda that bends any natural human thinking. With this corporations no longer need to satisfy the demand, they create the demand arbitrarily. They create artificial scarcity, manipulate the market, manipulate the people, manipulate laws (those who make laws are nowadays mostly businessmen who want to strengthen corporations whose shares they hold, and if you believe voters can somehow prevent such psychopaths getting this power, just take a look literally at any parliament of any country). At this point they've broken the system, competition no longer works as idealized by theoretical capitalists, corporations can now do practically anything they want.

This is an evolutionary system in which the fitness function is simply the ability to make capital. Entities involved in the market are simply chosen by natural selection to be the ones that best make profit, i.e. who are best at circumventing laws, brainwashing, hiding illegal activities etc. Ethical behavior is a disadvantage that leads to elimination; if a business decides to behave ethically, it is outrun by the one who doesn't have this weakness.

The unfair, unethical behavior of corporations is still supposed to be controlled by the state, however corporations become stronger and bigger than states, they can manipulate laws by lobbying, financially supporting preferred candidates, favoring them with their propaganda etc. States are the only force left that's supposed to protect people from this pure evil, but they are too weak; a single organization of relatively few people who are, quite importantly, often corporation shareholders themselves, can't compete against a plethora of the best warriors selected by the extremely efficient system of the free market. Furthermore voters, those who are supposed to choose their protectors, are just braindead zombies now who literally do what their cellphones show them on the display. By all this states slowly turn to serving corporations, becoming their tools, and then slowly dissolve (see how small a role the US government already plays). Capitalist brainwashing is so strong that it even makes people desire more torture -- see so called "anarcho" capitalism, which the stupidest of our population have already fallen for and which is basically about saying "let's get rid of anything that protects us against absolute capitalist apocalypse". "Anarcho" capitalism is the worst stage of capitalism where there is no state, no entity supposed to protect the people, there is only one rule and that is the unlimited rule of the strongest corporation which has at its hands the most advanced technology there ever was.

Here the strongest corporation takes over the world and starts becoming the higher organism of the whole Earth, capitalist singularity has been reached. The world corporation doesn't have to pretend anything at this point, it can simply hire an army, it can use physical force, chemical weapons, torture, unlimited surveillance, anything to achieve further seize of remaining bits of power and resources.

People will NOT protest or revolt at this point, they will accept anything that comes and even if they suffer everyday agony and the system is clearly and obviously set up for their maximum exploitation, they will do nothing -- in fact they will continue to support the system and make it stronger and they will see more slavery as more freedom; this tendency is already present in rightists today. You may ask why, you think that at some point people will have enough and will seize back their power. This won't happen, just as the billions of chickens and pigs daily exploited at factories won't ever revolt -- firstly because the system will have absolute control over people at this point, they will be 100% dependent on the system even if they hate it, they will have proprietary technology as part of their bodies (which they willingly accepted in the past in exchange for bigger comfort while ignoring our warnings about loss of freedom), they will be dependent on the drugs of the system (called "vaccines" or "medicine"), on air that has to be cleaned and is unbreathable anywhere one would want to escape, 100% of communication will be monitored to prevent any spark of revolution etc. Secondly the system will have rewritten history so that people won't see that life used to be better and bearable -- just as today we think we live in the best times of history due to the interpretation of history that was force fed us at schools and by other propaganda, in the future a human in everyday agony will think history was even worse, that there is no other option than for him to suffer every day and that it's a privilege he can even live that way.

We can only guess what will happen here, a collapse due to instability or total destruction of the environment is possible, which would at least save the civilization from the horrendous fate of being eternally tortured. If the system survives, humans will probably be more and more genetically engineered to be more submissive, further killing any hope of a possible change, surveillance chips will be implanted into everyone, reproduction will be controlled precisely and finally perhaps the system will be able, thanks to an advanced AI, to exist and work more efficiently without humans completely, so they will be eliminated. This is how mankind ends.

{ So here you have it -- it's all here for anyone to read, explained and predicted correctly and in a completely logical way, we even offer a way to prevent this and fix the system, but no one will do it because this will be buried and censored by search engines and the 0.0000000000001% who will find this by chance will dismiss it due to the amount of brainwashing that's already present today. It's pretty sad and depressive, but what more can we do? ~drummyfish }

Capitalist Propaganda And Fairy Tales

Capitalist brainwashing is pretty sophisticated -- unlike with centralized oppressive regimes, capitalism has a decentralized way of creating and spreading propaganda, in ways similar to for example self-replicating and self-modifying malware in the world of software. Creators and promoters of capitalist propaganda are mostly people who are unaware of doing so, they have been brainwashed and programmed by the system itself to behave that way, for example just by being exposed to hearing the capitalist fairy tales since they were born. Some examples of common capitalist propaganda you will probably encounter are the following:

So What To Replace Capitalism With?

See less retarded society.


capitalist_singularity

Capitalist Singularity

Capitalist singularity is a point in time at which capitalism becomes irreversible and the cancerous growth of society unstoppable due to corporations taking absolute control over society. It is when people lose any power to revolt against corporations as corporations become stronger than states and any other collective effort towards their control.

This is similar to the famous technological singularity, the difference being that society isn't conquered by a digital AI but rather by a superintelligent entity in the form of a corporation. While many people see the danger of superintelligent AIs, surprisingly not many have noticed that we've already seen the rise of such AIs -- corporations. A corporation is an entity much more intelligent than any single individual, with the single preprogrammed goal of profit. A corporation doesn't have any sense of morals as morals are an obstacle towards making profit. A corporation runs on humans but humans don't control it; there are mechanisms in place to discourage moral behavior of people inside corporations and anyone exhibiting such behavior is simply replaced.


capitalist_software

Capitalist Software

Capitalist software is software produced by late stage capitalism, practically 100% shitty modern bloat and malware hostile to its users, made with the sole goal of benefiting its creator (often a corporation). Capitalist software is not just proprietary corporate software, but a lot of times also "open source", indie software and even free software that's just infected by the toxic capitalist environment -- this infection may run deep, even into the basic design principles, such things as UI design, priorities, development practices and subtle software behavior, which have simply all been shaped by the capitalist pressure on abusing the user.

{ Seriously I don't have enough brain to understand how anyone can accept this shit. ~drummyfish }

Capitalist software largely mimics in technology what capitalist economy is doing in society -- for example it employs huge waste of resources (computing resources such as RAM and CPU cycles as an equivalent to natural resources) in favor of rapid growth (accumulation of "features"), it creates hugely complex, interdependent and fragile ever growing networks (tons of library and hardware dependencies as an equivalent of the import/export dependencies of countries) and employs consumerism (e.g. in the form of mandatory frequent updates). These effects of course bring all the negative implications along and lead to highly inefficient, fragile, bloated, unethical software.

Basically everyone will agree that corporate software such as Windows is to a high degree abusive to its users, be it by its spying, unjustified hardware demands, forced non customizability, price etc. A mistake a lot of people make is to think that sticking a free license to similar software will simply make it magically friendly to the user and that therefore most FOSS programs are ethical and respect their users. This is sadly not the case, a license is only the first necessary step towards freedom, but not a sufficient one -- other important steps have to follow.

A ridiculous example of capitalist software is the most consumerist type: games. AAA games are pure evil that no longer even try to be good, they just try to be addictive like drugs. Games on release aren't even supposed to work correctly, tons of bugs are the standard, something that's expected by default, customers aren't even meant to receive a finished product for their money. They aren't even meant to own the product or have any control over it (lend it to someone, install it on another computer, play it offline or play it when it gets retired). These games spy on people (via so called anti-cheat systems), are shamelessly meant to be consumed and thrown away, purposefully incompatible ("exclusives"), bloated, discriminatory against low-end computers and even targeting attacks on children ("lootboxes"). Game corporations attack and take down fan modifications and remakes and show all imaginable kinds of unethical behavior, such as trying to steal the rights for maps/mods created with the game's editor (Warcraft: Reforged).

But how can a FOSS program possibly be abusive? Let's mention a few examples:

The essential issue of capitalist software is in its goal: profit. This doesn't have to mean making money directly, profit can also mean e.g. gaining popularity and political power. This goal goes before and eventually against goals such as helping and respecting the users. A free license is a mere obstacle on the way towards this goal, an obstacle that may for a while slow down a corporation from abusing the users, but which will eventually be overcome just by the sheer power of the market environment, which works on the principles of Darwinian evolution: those who make the most profit, by whatever means, survive and thrive.

Therefore "fixing" capitalist software is only possible via redefinition of the basic goal to just developing selfless software that's good for the people (as opposed to making software for profit). This approach requires eliminating or just greatly limiting capitalism itself, at least from the area of technology. We need to find other ways than profit to motivate development of software and yes, other ways do exist (morality, social status, fun etc.).


cathedral

Cathedral

Welcome to the cathedral. Here we mourn the death of technology by the hand of capitalism.

{ Sometimes we are very depressed from what's going on in this world, how technology is raped and used by living beings against each other. Seeing on a daily basis the atrocities done to the art we love and the atrocities done by it -- it is like watching a living being die. Sometimes it can help to just know you are not alone. ~drummyfish }

           R. I. P.
        ~~~~~~~~~~~~~
 
          TECHNOLOGY

      long time ago - now

  Here lies technology who was
helping people tremendously until
its last breath. It was killed by
          capitalism.

cc0

CC0

CC0 is a waiver (similar to a license) of copyright, created by Creative Commons, that can be used to dedicate one's work to the public domain (kind of).

Unlike a license, a waiver such as this removes (at least effectively) the author's copyright; by using CC0 the author willingly gives up his own copyright so that the work will no longer be owned by anyone (while a license preserves the author's copyright while granting some rights to other people). It's therefore the most free and permissive option for releasing intellectual works. CC0 is designed in a pretty sophisticated way, it also waives "neighboring rights" (moral rights), and also contains a fallback license in case waiving copyright isn't possible in a certain country. For this CC0 is one of the best ways, if not the best, of truly and completely dedicating works to public domain world-wide (well, at least in terms of copyright). In this world of extremely fucked up intellectual property laws it is not enough to state "my work is public domain" -- you need to use something like CC0 to achieve legally valid public domain status.

CC0 is recommended by LRS for both programs and other art -- however for programs additional waivers of patents should be added as CC0 doesn't deal with patents. CC0 is endorsed by the FSF but not OSI (who rejected it because it explicitly states that trademarks and patents are NOT waived).
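
For example a program's source file may carry a header comment along these lines (only an illustrative wording):

/*
  This work is released under CC0 1.0, public domain
  (https://creativecommons.org/publicdomain/zero/1.0/).

  In addition the author waives all other intellectual
  property rights, including all patent and trademark
  rights, to the greatest extent allowed by law.
*/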

Things Under CC0

Here are some things and places with CC0 materials that you can use in your projects so that you can release them under CC0 as well. BEWARE: if you find something under CC0, do verify it's actually valid, normies often don't know what CC0 means and happily post derivative works of proprietary stuff under CC0.

TODO


censorship

Censorship

This page is not accessible in your country... NOT :)

Censorship means an intentional effort towards preventing exchange of certain kinds of information among other individuals, for example suppression of free speech, altering old works of art for political reasons, forced takedowns of copyrighted material from the Internet etc. Note that censorship therefore does NOT include some kinds of data or information filtering, for example censorship does not include filtering out noise such as spam on a forum or static from audio (as noise is non-information) or PERSONAL avoidance of certain information (e.g. using adblock or hiding someone's forum posts ONLY FOR ONESELF). Censorship often hides under euphemisms such as "moderation", "safe space", "filter", "protection", "delisting", "review" etc. Censorship is always wrong -- in a good society there is never the slightest reason to censor anything, therefore whenever censorship is deemed the best solution, something within the society is deeply fucked up. In current society censorship, along with propaganda, brainwashing and misinformation, is extremely prevalent and growing -- it's being pushed not only by governments and corporations but also by harmful terrorist groups such as LGBT and feminism who force media censorship (e.g. that of Wikipedia or search engines) and punishment of free speech (see political correctness and "hate speech").

Sometimes it is not 100% clear which action constitutes censorship: for example categorization such as moving a forum post from one thread to another (possibly less visible) thread may or may not be deemed censorship -- this depends on the intended result of such action; moving a post somewhere else doesn't remove it completely but can make it less visible. Whether something is censorship always depends on the answer to the question: "does the action prevent others from information sharing?".

Modern censorship is much more sophisticated; in the old days, e.g. those of USSR pseudocommunist regimes, it was simple: stuff was reviewed and it either got censored or it passed, governments even openly admitted to censorship and stated it was simply necessary for the advancement of society. People wanted to talk but the government didn't want to let them. Not so nowadays, censorship has advanced much in several ways:

  1. Censorship is no longer done just by the state, but by corporations, various social subgroups and even individuals as well, as so called self censorship, often automatically and subconsciously. In wanting to talk you are not just standing against one big bad guy who wants you silent, there are hundreds of sneaky bastards waiting to sue you, report you, ban you, cancel you, even physically terminate you if you touch anything controversial in one way or another.
  2. NO ONE ADMITS TO CENSORSHIP NOWADAYS, no matter how blatantly obvious their censorship is, exactly in the capitalist "deny EVERYTHING" spirit -- Wikipedia explicitly states "we are not censored" and then literally removes and blocks inclusion of legitimate information it deems "harmful". You point it out, they ban you. They will say "no, it's not censorship, it is MODERATION, PROTECTION, DELISTING, free speech has its limits, it is not a ban, it is deplatformization, blocking of hate speech is not censorship blablabla ..." -- they are inventing hundreds of new terms so that they don't have to use the word censorship.
  3. There is a lot of soft, undercover and hard to prove censorship -- no longer is something either censored or not censored, but it may be shadowbanned, hugely underranked in search, censored only to specific eyes, modified rather than deleted etc. For example Google censors thousands of websites; you WILL find those websites if Google sees you are looking specifically for them to test its censorship, but it won't ever show them to people who don't know about the site and are legitimately looking for the information it contains. Maybe they will show the site on the 100th page of the search results, which is equivalent to just blocking it completely, but they can say "haha we are not actually censoring it, gotcha". TV series and movies are silently edited retroactively in the cloud to no longer include scenes deemed politically incorrect, no one notices as no one owns physical copies anymore. In the endgame capitalists will just be constantly updating history, let's say they will just change the characters in Godfather to LGBTQ queer black women and since the movie will only be streamed from the cloud, without any old copies of the original existing, they will just say "the movie has always been like that, the author supported our politics". And so on.

There exist tools for bypassing censorship, e.g. proxies or encrypted and/or distributed, censorship-resistant networks such as Tor, Freenet, I2P or torrent file sharing. Watch out: using such tools may be illegal or at least make you look suspicious and be targeted harder by the surveillance.

Examples

TODO

See Also


chaos

Chaos

In mathematics chaos is a phenomenon that makes it extremely difficult to predict, even approximately, the result of some process even if we completely know how the process works and what state it starts in. In more technical terms chaos is a property of a nonlinear deterministic system in which even a very small change in input creates a great change in the output, i.e. the system is very sensitive to initial conditions. Chaos is a topic studied by the field called chaos theory and is important in all science. In computer science it is important for example for the generation of pseudorandom numbers or in cryptography. Every programmer should be familiar with the existence of chaotic behavior because in mathematics (programming) it emerges very often, it may pose a problem but, of course, it may be taken advantage of as well.

Perhaps the most important point is that a chaotic system is difficult to predict NOT because of randomness, lack of information about it or even its incomprehensible complexity (many chaotic systems are defined extremely simply), but because of its inherent structure that greatly amplifies any slight nudge to the system and gives any such nudge a great significance. This may be caused by things such as feedback loops and domino effects. Generally we describe this behavior as so called butterfly effect -- we liken this to the fact that a butterfly flapping its wings somewhere in a forest can trigger a sequence of events that may lead to causing a tornado in a distant city a few days later.

Examples of chaotic systems are the double pendulum, logistic map, weather (which is why it is so difficult to predict it), dice roll, the rule 30 cellular automaton, gravitational interaction of N bodies or Lorenz differential equations. Langton's ant sometimes behaves chaotically. Another example may be e.g. a billiard table with multiple balls: if we hit one of the balls with enough strength, it'll shoot and bounce off of walls and other balls, setting them into motion and so on until all balls come to a stop in a specific position. If we hit the ball with exactly the same strength but from an angle differing just by 1 degree, the final position would probably end up being completely different. Despite the system being deterministic (governed by exact and predictable laws of motion, neglecting things like quantum physics) a slight difference in input causes a great difference in output.

A simple example of a chaotic function is sin(1/x) for x near 0, where it oscillates so quickly that just a tiny shift along the x axis drastically changes the result. See what unpredictable results a variant of the function can give:

x       1000 * sin(10^9 / x)
4.001    455,...
4.002    818,...
4.003   -511,...
4.004   -974,...
4.005   -335,...
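
Such a table can be generated e.g. with the following C snippet (compile with -lm; exact digits may differ slightly between math library implementations, as the sine of such big arguments is itself precision-sensitive):

#include <stdio.h>
#include <math.h>

int main(void)
{
  for (int i = 1; i <= 5; ++i)
  {
    double x = 4.0 + i / 1000.0; // 4.001, 4.002, ...
    printf("%.3f %f\n",x,1000 * sin(1000000000.0 / x));
  }

  return 0;
}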

Logistic map is often given as the typical example of a chaotic system. It is the series defined as x[n + 1] = r * x[n] * (1 - x[n]), which for some constant r (interpreted as the speed of population increase) says how a population evolves from some starting value x[0]; for low x[n] the population will be increasing proportionally by the rate of r but once it reaches a higher value, it will start decreasing (as if by starvation), resulting in oscillation. Now if we start to be interested only in changing the value r and then seeing at what value the population stabilizes (for a big n), we make some interesting discoveries. This is best seen by plotting the stable values (let's say x[1000]) depending on r. For r approximately between 3.57 and 4 we start to see a chaotic behavior, with results greatly depending on the initial population value (x[0]).

The following is a fixed point C implementation of the above:

#include <stdio.h>
 
#define FP_UNIT 256
#define DOWNSCALE_X 4
#define DOWNSCALE_Y 25
#define LINE_LENGTH (FP_UNIT / DOWNSCALE_X)
#define GENERATIONS 1000

char stablePoints[LINE_LENGTH + 1];

int main(void)
{
  stablePoints[LINE_LENGTH] = 0; // string terminator

  for (int i = 0; i <= FP_UNIT * 4; i += DOWNSCALE_Y) // for different rs
  {
    for (int j = 0; j < LINE_LENGTH; ++j)
      stablePoints[j] = ' ';

    for (int j = 0; j < FP_UNIT; ++j) // for different starting population sizes
    {
      int population = j;

      for (int k = 0; k < GENERATIONS; ++k)
        population = (i * population * (FP_UNIT - population)) / (FP_UNIT * FP_UNIT);

      population /= DOWNSCALE_X;

      if (population >= 0 && population < LINE_LENGTH)
        stablePoints[population] = '*';
    }

    printf("%.3f| %s\n",i / ((float) FP_UNIT),stablePoints);
  }

  return 0;
} 

It outputs the following:

0.000| *                                                               
0.098| *                                                               
0.195| *                                                               
0.293| *                                                               
0.391| *                                                               
0.488| *                                                               
0.586| *                                                               
0.684| *                                                               
0.781| *                                                               
0.879| *                                                               
0.977| *                                                               
1.074| *****                                                           
1.172| **     ***                                                      
1.270| **          **                                                  
1.367| *               **                                              
1.465| *                   *                                           
1.562| *                     **                                        
1.660| *                        *                                      
1.758| *                          *                                    
1.855| *                            *                                  
1.953| *                              *                                
2.051| *                               *                               
2.148| *                                 *                             
2.246| *                                  *                            
2.344| *                                   *                           
2.441| *                                    *                          
2.539| *                                     *                         
2.637| *                                      *                        
2.734| *                                       *                       
2.832| *                                        *                      
2.930| *                                        **                     
3.027| *                                     *********                 
3.125| *                                  *       *     *              
3.223| *                               *           *      *            
3.320| *                             *                     **          
3.418| *                           **               *       **         
3.516| *                      ** * *   **           *      *****       
3.613| *                   **** *** *  * ** * *      *   ********      
3.711| *                **               **      ** *****  *     **    
3.809| *           *      **  * *        *   * *         *  *  *** *   
3.906| *     *        *     ***      *       *      *     *   * ***  * 

Vertical axis is the r parameter, i.e. the population growth speed. Horizontal axis shows stable population size after 1000 generations, starting with different initial population sizes. We can see that up until about r = 3 the stable population size always stabilizes at around the same size, which gradually increases with r. However then the line splits and after around r = 3.56 the stable population sizes are quite spread out and unpredictable, greatly depending on the initial population size. Pure CHAOS!


charity_sex

Charity Sex

TODO


cheating

Cheating

Cheating means circumventing or downright violating rules, usually while trying to keep this behavior secret. You can cheat on your partner, in games, in business etc., however despite cheating seeming like purely immoral behavior at first, it may be relatively harmless or even completely moral, e.g. in computer graphics we sometimes "cheat" our sense of sight and fake certain visual phenomena which leads to efficient rendering algorithms. In capitalism cheating is demonized and people are brainwashed to take part in cheater witch hunts.

The truth is that cheating is only an issue in a shitty society that's driven by competition. In such society there is a huge motivation for cheating (sometimes literally physical survival) as well as potentially disastrous consequences of it. Under the tyranny of capitalism we are led to worship heroes and high achievers and everyone gets pissed when we get fooled. Corporations go "OH NOES our multi billion dollar entertainment industry is going to go bankrupt if consoomers get annoyed by cheaters! People are gonna lose their bullshit jobs! Someone is going to get money he doesn't deserve! Our customers may get butthurt!!!" (as if corporations themselves weren't basically just stealing money and raping people lol). So they start a huge brainwashing propaganda campaign, a cheater witch hunt. States do the same, communities do the same, everyone wants to stone cheaters to death but at the same time the society pressures all of us to compete to death with others or else we'll starve. We reward winners and torture the losers, then bash people who try to win -- and no, many times there is no other choice than to cheat, the top of any competition is littered with cheaters, most just don't get caught, so in about 99% of cases the only way to the top is to cheat and try to not get caught, just to have a shot at winning against others. It is proven time after time, legit looking people in the top leagues of sports, business, science and other areas are constantly being revealed as cheaters, usually by pure accident (i.e. the number of actual cheaters is MANY times higher). Take a look e.g. at the Trackmania cheating scandal in which after someone invented a replay analysis tool he revealed that a great number of top level players were just cheaters, including possibly the GOAT of Trackmania Riolu (who just ragequit and never showed again lol). Of course famous cases like Lance Armstrong don't even have to be mentioned. Cheater detection systems are (and always will be) imperfect and try to minimize false positives, so only the cheaters who REPEATEDLY make MANY very OBVIOUS mistakes get caught, the smart cheaters stay and take the top places in the competitive system, just as surely as natural selection leads to the evolution of organisms that best adapt to the environment. Even if perfect cheat-detection systems existed, the problem would just shift from cheating to immoral unsportmanship, i.e. abuse of rules that's technically not cheating but effectively presents the same kind of problems. How to solve this enormously disgusting mess? We simply have to stop desperately holding to the system itself, we have to ditch it.

In a good society, such as LRS, cheating is not an issue at all, there's no motivation for it (people don't have to prove their worth by their skills, there is no money, people don't worship heroes, ...) and there are no negative consequences of cheating worse than someone ragequitting an online game -- which really isn't an issue of cheating anyway but simply a consequence of an unskilled player facing a skilled one (whether the pro's skill is natural or artificial doesn't play a role, the nub will ragequit anyway). In a good society cheating can become a mild annoyance at worst, and it can really be a positive thing, it can be fun -- seeing for example a skilled pro face and potentially even beat a cheater is a very interesting thing. If someone wants to win by cheating, why not let him? Valid answers to this can only be given in the context of a shit society. In a good society choosing to cheat in a game is as if someone chooses to fly to the top of a mountain by helicopter rather than climbing it -- the choice is everyone's to make.

The fact that cheating isn't really an issue is supported by the hilariously vastly different double standards applied e.g. by chess platforms in this matter, on one hand they state in their TOS they have absolutely 0% tolerance of any kind of cheating/assistance and will lifeban players for the slightest suspicion of cheating, yelling "WE HAVE TO FIGHT CHEATING", on the other hand they allow streamers to literally cheat on a daily basis on live stream where everyone can see it, of course because streamers bring them money -- ALL top chess streamers (chessbrah, Nakamura, ...), including the world champion Magnus Carlsen himself, have videos of themselves getting advice on moves from the chat or even from high level players present during the stream, Magnus Carlsen is filmed taking over his friend's low rated account and winning a game which is the same as if the friend literally just used an engine to win the game, and Magnus is also filmed getting advice from a top grandmaster on a critical move in a tournament that won him the game and granted him a FINANCIAL PRIZE. The world chess champion is literally filmed winning money by cheating and no one cares because it was done as part of a highly lucrative stream "in a fun/friendly mood". Chessbrah streams frequently consist of many people in the room just giving advice on moves to the one who is currently playing, of course they censor all comments that try to bring up the fact that this is 100% cheating directly violating the platform's TOS. People literally have no brains, they only freak out about cheating when they're told to by the industry, when cheating is good for business people are told to shut up because it's okay and indeed they just shut up and keep consuming.


chess

Chess

Chess (from Persian shah, king) is a very old two-player board game, perhaps the most famous and popular among all board games in history. It is a zero sum, complete information game with no element of randomness, that simulates a battle of two armies on an 8x8 board with different battle pieces, also called chessmen or just men. Chess is also called the King's Game, it has a world-wide competitive community and is considered an intellectual sport but it's also been a topic of research and programming (many chess engines, AIs and frontends are being actively developed). Chess is similar to games such as shogi ("Japanese chess"), xiangqi ("Chinese chess") and checkers. As the estimated number of chess games is bigger than a googol, it is unlikely to ever be solved; though the complexity of the game in sheer number of possibilities is astronomical, among its shogi, go and xiangqi cousins it is actually considered one of the "simplest" (the board is relatively small and the game tends to simplify as it goes on as there are no rules to get men back to the game etc.).

{ There is a nice black and white indie movie called Computer Chess about chess programmers of the 1980s, it's pretty good, very oldschool, starring real programmers and chess players, check it out. ~drummyfish }

Drummyfish has created a suckless/LRS chess library smallchesslib which includes a simple engine called smolchess (and also a small chess game in SAF with said library).

At LRS we consider chess to be one of the best games for the following reasons:

Many however see go as a yet more beautiful game: a more minimal, yet more difficult one, with a completely unique experience.

Chess as a game is not and cannot be copyrighted, but can chess games (moves played in a match) be copyrighted? Thankfully there is a pretty strong consensus and precedent saying this is not the case, even though capital worshippers try to play the intellectual property card from time to time (e.g. in 2016 tournament organizers tried to stop chess websites from broadcasting the match moves under "trade secret protection", unsuccessfully).

Chess In General

Chess evolved from ancient board games in India in about the 6th century. Nowadays the game is internationally governed by FIDE, which has taken on the role of an authority defining the official rules: FIDE rules are considered to be the standard chess rules. FIDE also organizes tournaments, promotes the game and keeps a list of registered players whose performance it rates with the so called Elo system -- based on the performance it also grants titles such as Grandmaster (GM, strongest), International Master (IM, second strongest) or Candidate Master (CM). A game of chess is so interesting in itself that chess is usually not played for money like many other games (poker, backgammon, ...).

The mastery of chess is often divided into two main areas (it is also common to divide strong players into these two categories depending on where their main strength lies):

Of course this is not the only possible division, another one may be for example offensive vs defensive play etc., but generally chess revolves around position and tactics.

A single game of chess is seen as consisting of three stages: opening (starting, theoretical "book" moves, developing men), middlegame (seen as the pure core of the game) and endgame (ending in which only relatively few men remain on the board). There is no clear border between these stages and they are sometimes defined differently, however each stage plays a bit differently and may require different skills and strategies; for example in the endgame king becomes an active man while in the opening and middlegame he tries to stay hidden and safe.

The study of chess openings is called opening theory or just theory. Playing the opening stage is special by being based on memorization of this theory, i.e. hundreds or even thousands of existing opening lines that have been studied and analyzed by computers, rather than by performing mental calculation (logical "thinking ahead" present in middlegame and endgame). Some see this as weakness of chess that makes players spend extreme energy on pure memorization. One of the best and most famous players, Bobby Fischer, was of this opinion and has created a chess variant with randomized starting position that prevents such memorization, so called chess 960.

Elo rating is a mathematical system of numerically rating the performance of players (it is used in many sports, not just chess). Given two players with Elo rating it is possible to compute the probability of the game's outcome (e.g. white has 70% chance of winning etc.). The FIDE set the parameters so that the rating is roughly this: < 1000: beginner, 1000-2000: intermediate, 2000-3000: master. More advanced systems have also been created, namely the Glicko system.
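For illustration, the basic Elo formulas are simple enough to fit in a few lines of C -- the following minimal sketch (function names are made up here) computes the expected score of player A against player B and the rating update after a game; K is the update speed factor, commonly somewhere between 10 and 40:

#include <stdio.h>
#include <math.h>

// expected score (1 = win, 0.5 = draw, 0 = loss) of player A against B
double eloExpectedScore(double ratingA, double ratingB)
{
  return 1.0 / (1.0 + pow(10.0, (ratingB - ratingA) / 400.0));
}

// new rating of a player given the expected and actual game scores
double eloNewRating(double rating, double expected, double actual, double k)
{
  return rating + k * (actual - expected);
}

int main(void)
{
  double e = eloExpectedScore(2000, 1800);

  printf("2000 vs 1800: expected score = %f\n", e); // about 0.76
  printf("rating after losing the game: %f\n", eloNewRating(2000, e, 0, 20));

  return 0;
}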

The rules of chess are quite simple (easy to learn, hard to master) and can be found anywhere on the Internet. In short, the game is played on an 8x8 board by two players: one with white men, one with black (LOL IT'S RACIST :D). Each man has a way of moving and capturing (eliminating) enemy men, for example bishops move diagonally while pawns move one square forward and take diagonally. The goal is to checkmate the opponent's king, i.e. make the king attacked by a man while giving him no way to escape this attack. There are also lesser known rules that noobs often miss and ignore, e.g. so called en passant or the 50 move rule that allows claiming a draw if there has been no pawn move or capture in the last 50 moves.

At the competitive level clock (so called time control) is used to give each player a limited time for making moves: with unlimited move time games would be painfully long and more a test of patience than skill. Clock can also nicely help balance unequal opponents by giving the stronger player less time to move. Based on the amount of time to move there exist several formats, most notably correspondence (slowest, days for a move), classical (slow, hours per game), rapid (faster, tens of minutes per game), blitz (fast, a few seconds per move) and bullet (fastest, mere seconds per move).

Currently the best player in the world is pretty clearly Magnus Carlsen from Norway with Elo rating 2800+.

During covid chess has experienced a small boom among normies and YouTube chess channels have gained considerable popularity. This gave rise to memes such as the bong cloud opening popularized by a top player and streamer Hikaru Nakamura; the bong cloud is an intentionally shitty opening that's supposed to taunt the opponent (it's been even played in serious tournaments lol).

White is generally seen as having a slight advantage over black (just like in real life lol) because he always has the first move. This doesn't play such a big role in beginner and intermediate games but starts to become apparent in master games. How big the advantage is remains a matter of ongoing debate, most people are of the opinion there exists a slight advantage, some people think chess is a win for white with perfect play while others believe chess is a draw with perfect play. Probably only a very tiny minority of people think white doesn't have any advantage.

On perfect play: as stated, chess is unlikely to be ever solved so it is unknown if chess is a theoretical forced draw or forced win for white (or even win for black), however many simplified endgames and some simpler chess variants have already been solved. Even if chess was ever solved, it is important to realize one thing: perfect play may be unsuitable for humans, so a solution might have no significant effect on the game played by humans. Imagine the following: we have a chess position in which we are deciding between move A and move B. We know that playing A leads to a very good position in which white has great advantage and easy play (many obvious good moves), however if black plays perfectly he can secure a draw here. We also know that if we play B and then play perfectly for the next 100 moves, we will win with mathematical certainty, but if we make just one incorrect move during those 100 moves, we will get to a decisively losing position. While a computer will play move B here because it is sure it can play perfectly, for a human it is probably better to play A because a human is very likely to make mistakes (even a master). For this reason humans may willingly choose to play mathematically worse moves -- it is because a slightly worse move may lead to a safer and more comfortable play for a human.

Chess And Computers

{ This is an absolutely amazing video about weird chess algorithms :) ~drummyfish }

Chess is a big topic in computer science and programming, computers not only help people play chess, train their skills, analyze positions and perform research of games, but they also allow mathematical analysis of chess and provide a platform for things such as artificial intelligence.

Chess software is usually separated to libraries, chess engines and frontends. Chess engine is typically a CLI program capable of playing chess but also doing other things such as evaluating arbitrary position, hinting best moves, saving and loading games etc. -- commonly the engine has some kind of custom CLI interface (flags, interactive commands it understands, ...) plus a support of some standardized text communication protocol, most notably XBoard (older one, more KISS) and UCI (newer, more bloated). There is also typically support for standardized formats such as FEN (way of encoding a chess position as a text string), PGN (way of encoding games as text strings) etc. Frontends on the other hand are usually GUI programs (in this case also called boards) that help people interact with the underlying engine, however there may also be similar non-GUI programs of this type, e.g. those that automatically run tournaments of multiple engines.
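For example the FEN string of the starting position looks like this (men listed by ranks from the 8th rank down, uppercase being white and digits counting empty squares, then side to move, castling rights, en passant square, halfmove clock and move number):

rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1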

Computers have already surpassed the best humans in their playing strength (we can't exactly compute an engine's Elo as it depends on hardware used, but generally the strongest would rate high above 3000 FIDE). As of 2023 the strongest chess engine is widely agreed to be the FOSS engine Stockfish, with other strong engines being e.g. Leela Chess Zero (also FOSS), AlphaZero (proprietary by Google) or Komodo Dragon (proprietary). GNU Chess is a pretty strong free software engine by GNU. There are world championships for chess engines such as the Top Chess Engine Championship or World Computer Chess Championship. CCRL is a list of chess engines along with their Elo ratings deduced from tournaments they run. Despite the immense strength of modern engines, there are still some specific artificial situations in which a human beats the computer (shown e.g. in this video); this probably won't last long though.

The first chess computer that beat the world champion (at the time Garry Kasparov) was famously Deep Blue in 1997. Alan Turing himself wrote a chess playing algorithm but at his time there were no computers to run it, so he executed it by hand -- nowadays the algorithm has been implemented on computers (there are bots playing this algorithm e.g. on lichess).

For online chess there exist many servers such as https://chess.com or https://chess24.com, but for us the most important is https://lichess.org which is gratis and uses FOSS (it also allows users to run bots under special accounts which is an amazing way of testing engines against people and other engines). These servers rate players with Elo/Glicko, allow them to play with each other or against computer, solve puzzles, analyze games, play chess variants, explore opening databases etc.

Playing strength is not the only possible measure of chess engine quality, of course -- for example there are people who try to make the smallest chess programs (see countercomplex and golfing). As of 2022 the leading programmer of smallest chess programs seems to be Óscar Toledo G. (https://nanochess.org/chess.html). Unfortunately his programs are proprietary, even though their source code is public. The programs include Toledo Atomchess (392 x86 instructions), Toledo Nanochess (world's smallest C chess program, 1257 non-blank C characters) and Toledo Javascript chess (world's smallest Javascript chess program). He won the IOCCC. Another small chess program is micro-Max by H. G. Muller (https://home.hccnet.nl/h.g.muller/max-src2.html, 1433 C characters, Toledo claims it is weaker than his program). Other engines try to be strong while imitating human play (making human moves, even mistakes), most notably Maia which trains several neural networks that play like different rated human players.

{ Nanochess is actually pretty strong, in my testing it easily beat smallchesslib Q_Q ~drummyfish }

Programming Chess

NOTE: our smallchesslib/smolchess engine is very simple, educational and can hopefully serve you as a nice study tool to start with :)

There is also a great online wiki focused on programming chess engines: https://www.chessprogramming.org.

Programming chess is a fun and enriching experience and is therefore recommended as a good exercise. There is nothing more satisfying than writing a custom chess engine and then watching it play on its own.

The core of chess programming is writing the AI. Everything else, i.e. implementing the rules, communication protocols etc., is pretty straightforward (but still a good programming exercise). Nevertheless, as the chess programming wiki stresses, one has to pay great attention to eliminating as many bugs as possible; really, the importance of writing automatic tests can't be stressed enough as debugging the AI will be hard enough and can become unmanageable with small bugs creeping in. Thought also has to go into choosing the right data structures so as to allow nice optimizations, for example board representation plays an important role (two main approaches are an 8x8 two dimensional array holding each square's piece vs keeping a list of pieces, each one recording its position).

The AI itself works traditionally on the following principle: firstly we implement so called static evaluation function -- a function that takes a chess position and outputs its evaluation number which says how good the position is for white vs black (positive number favoring white, negative black, zero meaning equal, units usually being in pawns, i.e. for example -3.5 means black has an advantage equivalent to having extra 3 and a half pawns; to avoid fractions we sometimes use centipawns, i.e. rather -350). This function considers a number of factors such as total material of both players, pawn structure, king safety, men mobility and so on. Traditionally this function has been hand-written, nowadays it is being replaced by a learned neural network (NNUE) which showed to give superior results (e.g. Stockfish still offers both options); for starters you probably want to write a simple evaluation function manually.
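For a taste, below is a minimal sketch of such a hand-written evaluation function in C that counts only material (in centipawns, positive favoring white); the board representation assumed here (64 chars, uppercase for white men, lowercase for black, '.' for empty, as in the diagram in the Rules section below) is just made up for the example:

// minimal material-only evaluation in centipawns, positive = white is better
int evaluate(const char board[64])
{
  int total = 0;

  for (int i = 0; i < 64; ++i)
  {
    int value = 0;

    switch (board[i])
    {
      case 'P': case 'p': value = 100; break; // pawn
      case 'N': case 'n': value = 300; break; // knight
      case 'B': case 'b': value = 325; break; // bishop
      case 'R': case 'r': value = 500; break; // rook
      case 'Q': case 'q': value = 900; break; // queen
      default: break;                         // empty square or king
    }

    total += (board[i] >= 'a' && board[i] <= 'z') ? -value : value;
  }

  return total;
}

A real evaluation function would of course add further terms for pawn structure, king safety, mobility and so on.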

Note: if you could make a perfect evaluation function that would completely accurately state given position's true evaluation (considering all possible combinations of moves until the end of game), you'd basically be done right there as your AI could just always make a move that would lead to position which your evaluation function rated best, which would lead to perfect play. Though neural networks got a lot closer to this ideal than we once were, as far as we can foresee ANY evaluation function will always be just an approximation, an estimation, heuristic, many times far away from perfect evaluation, so we cannot stop at this. We have to program yet something more.

So secondly we need to implement a so called search algorithm -- typically some modification of minimax algorithm, e.g. with alpha-beta pruning -- that recursively searches the game tree and looks for a move that will lead to the best result in the future, i.e. to position for which the evaluation function gives the best value. This basic principle, especially the search part, can get very complex as there are many possible weaknesses and optimizations. Note now that this search kind of improves on the basic static evaluation function by making it dynamic and so increases its accuracy greatly (of course for the price of CPU time spent on searching).
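A minimal sketch of such a search in C follows, in the so called negamax formulation of minimax with alpha-beta pruning; State, moveCount, makeMove and undoMove are hypothetical functions your engine would have to supply, and evaluate must here return the score from the point of view of the side to move:

// negamax search with alpha-beta pruning (a sketch, hypothetical helpers)
int search(State *state, int depth, int alpha, int beta)
{
  if (depth == 0)
    return evaluate(state); // static evaluation at the depth limit

  int moves = moveCount(state); // NOTE: a real engine must also handle
                                // moves == 0 here (checkmate or stalemate)

  for (int i = 0; i < moves; ++i)
  {
    makeMove(state, i);

    // negamax trick: what's good for the opponent is bad for us, so we
    // negate the returned score and swap and negate the alpha-beta window
    int score = -search(state, depth - 1, -beta, -alpha);

    undoMove(state, i);

    if (score >= beta)
      return beta; // beta cutoff: the opponent would never allow this line

    if (score > alpha)
      alpha = score; // new best score found so far
  }

  return alpha;
}

The root call would look like search(&state, depth, -1000000, 1000000), additionally remembering which of the root moves scored best.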

Exhaustively searching the tree to great depths is not possible even with most powerful hardware due to astronomical numbers of possible move combinations, so the engine has to limit the depth quite greatly and use various hacks, approximations, heuristics etc. Normally it will search all moves to a small depth (e.g. 2 or 3 half moves or plies) and then extend the search for interesting moves such as exchanges or checks. Maybe the greatest danger of searching algorithms is so called horizon effect which has to be addressed somehow (e.g. by detecting quiet positions, so called quiescence). If not addressed, the horizon effect will make an engine misevaluate certain moves by stopping the evaluation at certain depth even if the played out situation would continue and lead to a vastly different result (imagine e.g. a queen taking a pawn which is guarded by another pawn; if the engine stops evaluating after the pawn take, it will think it's a won pawn, when in fact it's a lost queen). There are also many techniques for reducing the number of searched tree nodes and speeding up the search, for example pruning methods such as alpha-beta (which subsequently works best with correctly ordering moves to search), or transposition tables (remembering already evaluated positions so that they don't have to be evaluated again when encountered by a different path in the tree).
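The usual cure for the horizon effect, sketched below with the same hypothetical helpers as above (plus captureCount/makeCapture/undoCapture, analogous but restricted to capture moves), is quiescence search: at the depth limit we don't just return the static evaluation but first keep searching captures until the position becomes quiet:

// quiescence search sketch: search only captures until the position is quiet
int quiesce(State *state, int alpha, int beta)
{
  int standPat = evaluate(state); // score if we simply stopped capturing here

  if (standPat >= beta)
    return beta;

  if (standPat > alpha)
    alpha = standPat;

  int captures = captureCount(state);

  for (int i = 0; i < captures; ++i)
  {
    makeCapture(state, i);
    int score = -quiesce(state, -beta, -alpha);
    undoCapture(state, i);

    if (score >= beta)
      return beta;

    if (score > alpha)
      alpha = score;
  }

  return alpha;
}

The depth == 0 branch of the search function above would then call quiesce instead of evaluate.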

Alternative approaches: most engines work as described above (search plus evaluation function) with some minor or bigger modifications. The simplest possible stupid AI can just make random moves, which will of course be an extremely weak opponent -- one might perhaps try to just program a few simple rules to make it a bit less stupid and possibly a simple training opponent for complete beginners: the AI may for example pick a few "good looking" candidate moves that are "usually OK" (pushing a pawn, taking a higher value piece, castling, ...) and aren't a complete insanity, then pick one at random only from those (this randomness can further be improved and gradually controlled by scoring the moves somehow and adding a more or less random value from some range to each score, then picking the moves with highest score). One could also try to just program in a few generic rules such as: checkmate if you can, otherwise take an unprotected piece, otherwise protect your own unprotected piece etc. -- this could produce some beginner level bot. Another idea might be a "Chinese room" bot that doesn't really understand chess but has a huge database of games (which it may even be fetching from some Internet database) and then just looks up what moves good players make in positions that arise on the board, however a database of all positions will never exist, so in case the position is not found there has to be some fallback (e.g. play a random move, or somehow find the "most similar position" and use that, ...). As another approach one may try to use some non neural network machine learning, for example genetic programming, to train the evaluation function, which will then be used in the tree search. Another idea that's being tried (e.g. in the Maia engine) is pure neural net AI (or another form of machine learning) which doesn't use any tree search -- not using search at all has long been thought to be impossible as analyzing a chess position completely statically without any "looking ahead" is extremely difficult, however new neural networks have shown to be extremely good at this kind of thing and pure NN AIs can now play on a master level (a human grandmaster playing ultra bullet is also just a no-calculation, pure pattern recognition play). Next, Monte Carlo tree search (MCTS) is an alternative way of searching the game tree which may even work without any evaluation function: in it one makes many random playouts (complete games until the end making only random moves) for each checked move and based on the number of wins/losses/draws in those playouts statistically a value is assigned to the move -- the idea is that a move that most often leads to a win is likely the best. Another Monte Carlo approach may just make random playouts, stop at random depth and then use a normal static evaluation function (horizon effect is a danger but hopefully its significance should get minimized in the averaging). However MCTS is pretty tricky to do well. MCTS is used e.g. in Komodo Dragon, the engine that's currently among the best. Another approach may lie in somehow using several methods and heuristics to vote on which move would be best.
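To make the playout idea concrete, here is a toy C fragment (hypothetical helpers again, like above; gameOver and gameResult are assumed to detect the end of game and score it for the root player, e.g. 1 for a win, 0.5 for a draw, 0 for a loss) that rates each candidate move simply by the total score of random playouts:

#include <stdlib.h> // for rand()

// toy Monte Carlo move choice: play random games after each candidate move
int bestMoveMonteCarlo(State *state, int playouts)
{
  int moves = moveCount(state), bestMove = 0;
  double bestScore = -1;

  for (int i = 0; i < moves; ++i)
  {
    double score = 0;

    for (int j = 0; j < playouts; ++j)
    {
      State copy = *state; // play the random game on a copy of the state

      makeMove(&copy, i);

      while (!gameOver(&copy))
        makeMove(&copy, rand() % moveCount(&copy)); // random playout

      score += gameResult(&copy); // accumulate the playout's result
    }

    if (score > bestScore)
    {
      bestScore = score;
      bestMove = i;
    }
  }

  return bestMove;
}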

Many other aspects come into the AI design such as opening books (databases of best opening moves), endgame tablebases (precomputed databases of winning moves in simple endgames), clock management, pondering (thinking on opponent's move), learning from played games etc. For details see the above linked chess programming wiki.

Notable Chess Engines/Computers/Entities

TODO: table

Stats And Records

Chess stats are pretty interesting.

{ Some chess world records are here: https://timkr.home.xs4all.nl/records/records.htm. ~drummyfish }

Number of possible games is not known exactly, Shannon estimated it at 10^120 (lower bound, known as Shannon number). Number of possible games by plies played is 20 after 1, 400 after 2, 8902 after 3, 197281 after 4, 4865609 after 5, and 2015099950053364471960 after 15.

Similarly the number of possibly reachable positions (positions for which a so called proof game exists) is not known exactly, it is estimated to be at least 10^40 and at most 10^50. Numbers of possible positions by plies are 20 after 1, 400 after 2, 5362 after 3, 72078 after 4, 822518 after 5, and 726155461002 after 11.

Shortest possible checkmate is by black on ply number 4 (so called fool's mate). As of 2022 the longest known forced checkmate is in 549 moves -- it has been discovered when computing the Lomonosov Tablebases.

Average game of chess lasts 40 moves. Average branching factor (number of possible moves at a time) is around 33. Maximum number of possible moves in a position seems to be 218 (FEN: R6R/3Q4/1Q4Q1/4Q3/2Q4Q/Q4Q2/pp1Q4/kBNN1KB1 w - - 0 1).

White wins about 38% of games, black wins about 34%, the remaining 28% are draws (38.7%, 31.1%, 30.3% respectively on Computer Chess Rating Lists).

What is the longest possible game? It depends on the exact rules and details we set, for example if a 50 move rule applies, a player MAY claim a draw but also doesn't have to -- but if neither player ever claims a draw, a game can be played infinitely -- so we have to address details such as this. Nevertheless the longest possible chess game upon certain rules has been computed by Tom7 at 17697 half moves in a paper for SIGBOVIK 2020. Chess programming wiki states 11798 half moves as the maximum length of a chess game which considers a 50 move rule (1966 publication).

The longest game played in practice is considered to be the one between Nikolic and Arsovic from 1989, a draw with 269 moves lasting over 20 hours. As for the shortest games, there have been ones with zero moves played; the shortest serious decisive game has occurred multiple times and goes like this: 1.d4 Nf6 2.Bg5 c6 3.e3 Qa5+ (white resigned).

Best player ever: a 2017 paper called Who is the Master? analyzed 20 of the top players of history based on how good their moves were compared to Stockfish, the strongest engine. The resulting top 10 is (from best): Carlsen, Kramnik, Fischer, Kasparov, Anand, Khalifman, Smyslov, Petrosian, Karpov, Kasimdzhanov. It also confirmed that the quality of chess play at top level has been greatly increasing.

What's the most typical game? We can try to construct such a game from a game database by always picking the most common move in given position. Using the lichess database at the time of writing, we get the following incomplete game (the remainder of the game is split between four games, 2 won by white, 1 by black, 1 drawn):

1. e4 e5 2. Nf3 Nc6 3. Bc4 Bc5 4. c3 Nf6 5. d4 exd4
6. cxd4 Bb4+ 7. Nc3 Nxe4 8. O-O Bxc3 9. d5 Bf6 10. Re1 Ne7
11. Rxe4 d6 12. Bg5 Bxg5 13. Nxg5 h6 14. Qe2 hxg5
15. Re1 Be6 16. dxe6 f6 17. Re3 c6 18. Rh3 Rxh3
19. gxh3 g6 20. Qf3 Qa5 21. Rd1 Qf5 22. Qb3 O-O-O
23. Qa3 Qc5 24. Qb3 d5 25. Bf1

You can try to derive your own stats, there are huge free game databases such as the Lichess CC0 database of billions of games from their server.

{ TODO: Derive stats about the best move, i.e. for example "best move is usually by queen by three squares" or something like that. Could this actually help the play somehow? Maybe could be used for move ordering in alpha-beta. ~drummyfish }

Rules

The exact rules of chess and their scope may depend on situation, this is just a sum up of rules generally used nowadays.

The start setup of a chessboard is following (lowercase letters are for black men, uppercase for white men, on a board with colored squares A1 is black):

        _______________
    /8 |r n b q k b n r|
 r | 7 |p p p p p p p p|
 a | 6 |. . . . . . . .|
 n | 5 |. . . . . . . .|
 k | 4 |. . . . . . . .|
 s | 3 |. . . . . . . .|
   | 2 |P P P P P P P P|
    \1 |R N B Q K B N R|
        """""""""""""""
        A B C D E F G H
        \_____________/
             files

Players take turns in making moves, white always starts. A move consists of moving one (or in special cases two) of own men from one square to another, possibly capturing (removing from the board) one opponent's man -- except for the special en passant move, capturing always happens by moving one man to the square occupied by the opposite color man (which gets removed). Of course no man can move to a square occupied by another man of the same color. A move can NOT be skipped. A player wins by giving a checkmate to the opponent (making his king unable to escape attack) or if the opponent resigns. If a player is to move but has no valid moves, the game is a draw, so called stalemate. If neither player has enough men to give a checkmate, the game is a draw, so called dead position. There are additional situations in which the game can be drawn (threefold repetition of position, 50 move rule). Players can also agree to a draw. A player may also be declared a loser if he cheated, if he lost on time in a game with clock etc.

The individual men and their movement rules are (no man can move through other men, except for the knight who jumps over them):

man    | symbol | ~value | movement                                                      | comment
pawn   | P      | 1      | 1F, may also 2F from start, captures 1F1L or 1F1R, en passant | promotes on last row
knight | N      | 3      | L-shape (2U1L, 2U1R, 2R1U, 2R1D, 2D1R, 2D1L, 2L1U, 2L1D)      | jumps over other men
bishop | B      | 3.25   | any distance diagonally                                       | stays on same color sq.
rook   | R      | 5      | any distance orthogonally (U, R, D or L)                      | can reach all sq.
queen  | Q      | 9      | like both bishop and rook                                     | strongest piece
king   | K      | inf    | any of 8 neighboring squares                                  |

Check: If the player's king is attacked, i.e. it is immediately possible for an enemy man to capture the king, the player is said to be in check. A player in check has to make such a move as to not be in check after that move.

A player cannot make a move that would leave him in check!

Castling: If a player hasn't castled yet and his king hasn't been moved yet and his kingside (queenside) rook hasn't been moved yet and there are no men between the king and the kingside (queenside) and the king isn't and wouldn't be in check on his square or any square he will pass through or land on during castling, short (long) castling can be performed. In short (long) castling the king moves two squares towards the kingside (queenside) rook and the rook jumps over the king to the square immediately on the other side of the king.

Promotion: If a pawn reaches the 1st or 8th rank, it is promoted, i.e. it has to be switched for either queen, rook, bishop or knight of the same color.

Checkmate: If a player is in check but cannot make any move to get out of it, he is checkmated and lost.

En passant: If a pawn moves 2 squares forward (from the start position), in the immediate next move the opponent can take it with a pawn in the same way as if it only moved 1 square forward (the only case in which a man captures another man by landing on an empty square).

Threefold repetition is a rule allowing a player to claim a draw if the same position (men positions, player's turn, castling rights, en passant state) occurs three times (not necessarily consecutively). The 50 move rule allows a player to claim a draw if no pawn has moved and no man has been captured in last 50 moves (both players making their move counts as a single move here).

Variants

Besides similar games such as shogi there are many variants of chess, i.e. slight modifications of rules, foremost worth mentioning is for example chess 960. The following is a list of some variants:

Playing Tips

Some general tips and rules of thumb, mostly for beginners:

How To Disrespect Your Opponent And Other Lulz In Chess

see also unsportmanship

WORK IN PROGRESS, pls send me more tips :)

LRS Chess

Chess is only mildly bloated but what if we try to unbloat it completely? Here we propose the LRS version of chess. The rule changes against normal chess are:

See Also


chinese

Chinese

Chinese is one of the most bloated natural human languages, spoken in China.

Any text in Chinese basically looks like this:

#########################
#########################
#########################
#########################

cloud

"Cloud Computing"

Cloud is just someone else's computer.

Cloud computing, more accurately known as clown computing, means giving up an autonomous computer by storing one's data as well as running one's programs on someone else's (often a corporation's) computer, known as the cloud, through the Internet, becoming wholly dependent on someone else to whom one gives all the power. While the general idea of server computers and remote terminals is not bad in itself and may be utilized in very good ways, the term cloud computing stands for abusing this idea e.g. by capitalists or states to take away autonomous computers from the people as well as to restrict freedoms of people in other ways, for example by pushing DRM, making it impossible to truly own a copy of software or other data, to run computations privately, isolated from the Internet, or to run non-approved, user-respecting software. Moreover clown computing as applied nowadays is mostly a very bad engineering approach that wastes bandwidth, introduces lag, requires complex and expensive infrastructure etc.

Despite all this "cloud" is the mainstream nowadays, it is the way of computing among normies, even despite regular leaks and losses of their personal data etc., simply because they're constantly being pushed to it by the big tech (Apple, Google, Micro$ost, ...) -- many times they don't even have a choice, they are simply supposed to SHUT UP AND CONSUME. And of course they wouldn't even have an idea about what's going on in the first place, all that matters to a normie is "comfort", "everyone does it", "I just need my TikTok" etc. Zoomers probably aren't even aware of the cloud, they simply have phones with apps that show their photos if Apple approves of it, they don't even care how shit works anymore.

In the future non-cloud computers will most likely become illegal. This will be justified by autonomous computers being "dangerous", only needed by terrorists, pirates and pedophiles. An autonomous computer will be seen as a gun, the right to own it will be greatly limited.


c

C

{ We have a C tutorial! ~drummyfish }

C is an old low level structured statically typed imperative compiled programming language, the language that's currently mostly used by less retarded software. Though by very strict standards it would be considered bloated, compared to any mainstream modern language it is very bullshitless and KISS, so it is also the go-to language of the suckless community as well as most true experts, for example the Linux and OpenBSD developers, because of its good, relatively simple design, uncontested performance, wide support, great number of compilers, level of control and a greatly established and tested status. C is perhaps the most important language in history, it influenced, to smaller or greater degree, basically all of the widely used languages today such as C++, Java, JavaScript etc., however it is not a thing of the past -- in the area of low level programming C is still the number one unsurpassed language. C is by no means perfect but it is currently probably the best choice of a programming language.

{ Look up The Ten Commandments for C Programmers by Henry Spencer. ~drummyfish }

It is usually not considered an easy language to learn because of its low level nature: it requires good understanding of how a computer actually works and doesn't prevent the programmer from shooting himself in the foot. Programmer is given full control (and therefore responsibility). There are things considered "tricky" which one must be aware of, such as undefined behavior of certain operators and raw pointers. This is what can discourage a lot of modern "coding monkeys" from choosing C, but it's also what inevitably allows such great performance -- undefined behavior allows the compiler to choose the most efficient implementation. On the other hand, C as a language is pretty simple without modern bullshit concepts such as OOP, it is not so much hard to learn as hard to master, as any other true art.

C is said to be a "portable assembly" because of its low level nature, great performance etc. -- though C is structured (has control structures such as branches and loops) and can be used in a relatively high level manner, it is also possible to write assembly-like code that operates directly with bytes in memory through pointers without many safety mechanisms, so C is often used for writing things like hardware drivers. On the other hand some restrain from likening C to assembly because C compilers still perform many transformations of the code and what you write is not necessarily always what you get.

Mainstream consensus acknowledges that C is among the best languages for writing low level code and code that requires performance, such as operating systems, drivers or games. Even scientific libraries with normie-language interfaces -- e.g. various machine learning Python libraries -- usually have the performance critical core written in C. Normies will tell you that for things outside this scope C is not a good language, with which we disagree -- we recommend using C for basically everything that's supposed to last, i.e. if you want to write a good website, you should write it in C etc.

History and Context

C was developed in 1972 at Bell Labs alongside the Unix operating system by Dennis Ritchie and Brian Kernighan, as a successor to the B language (a portable language with recursion) written by Dennis Ritchie and Ken Thompson, which was in turn inspired by the ALGOL language (code blocks, lexical scope, ...).

In 1973 Unix was rewritten in C. In 1978 Kernighan and Ritchie published a book called The C Programming Language, known as K&R, which became something akin to the C specification. In 1989, the ANSI C standard, also known as C89, was released by the American ANSI. The same standard was also adopted a year later by the international ISO, so C90 refers to the same language. In 1999 ISO issued a new standard that's known as C99.

TODO

Standards

C is not a single language, there have been a few standards over the years since its inception in 1970s. The notable standards and versions are: K&R C (1978, the informal "standard" given by the famous book), C89/C90 (the first official standard by ANSI, adopted by ISO), C99, C11, C17 and the upcoming C23.

LRS should use C99 or C89 as the newer versions are considered bloat and don't have such great support in compilers, making them less portable and therefore less free.

The standards of C99 and older are considered pretty future-proof and using them will help your program be future-proof as well. This is to a high degree due to C having been established and tested better than any other language; it is one of the oldest languages and a majority of the most essential software is written in C, C compiler is one of the very first things a new hardware platform needs to implement, so C compilers will always be around, at least for historical reasons. C has also been very well designed in a relatively minimal fashion, before the advent of modern feature-creep and bullshit such as OOP which cripples almost all "modern" languages.

Compilers

Standard Library

Besides the pure C language the C standard specifies a set of libraries that have to come with a standard-compliant C implementation -- the so called standard library. This includes e.g. the stdio library for performing standard input/output (reading/writing to/from screen/files) or the math library for mathematical functions. It is usually relatively okay to use these libraries as they are required by the standard to exist so the dependency they create is not as dangerous, however many C implementations aren't completely compliant with the standard and may come without the standard library. So for the sake of portability it is best if you can avoid using the standard library.

The standard library (libc) is a subject of live debate because while its interface and behavior are given by the C standard, its implementation is a matter of each compiler; since the standard library is so commonly used, we should take great care in assuring it's extremely well written. As you probably guessed, the popular implementations (glibc et al) are bloat. Better alternatives thankfully exist, such as musl, uClibc or dietlibc.

Bad Things About C

Nothing is perfect, not even C; it was one of the first relatively higher level languages and even though it has shown itself to have been designed extremely well, some things didn't age great, or were simply bad from the start. We still prefer this language as usually the best choice, but it's good to be aware of its downsides or smaller issues, if only for the sake of one day designing a better language. Keep in mind all here are just suggestions, they may of course be subject to counter arguments and further discussion. So, let's go:

Basics

This is a quick overview, for a more in depth tutorial see C tutorial.

A simple program in C that writes "welcome to C" looks like this:

#include <stdio.h> // standard I/O library

int main(void)
{
  // this is the main program
    
  puts("welcome to C");

  return 0; // end with success
}

You can simply paste this code into a file which you name e.g. program.c, then you can compile the program from command line like this:

gcc -o program program.c

Then if you run the program from command line (./program on Unix like systems) you should see the message.

Cheatsheet

It's pretty important you learn C, so here's a little cheat sheet for you.

data types (just some): int (integer, at least 16 bits), unsigned int (non-negative integer), char (single character/byte), float and double (floating point numbers), pointers (e.g. int *) and arrays (e.g. int myArray[16]).

branching aka if-then-else:

if (CONDITION)
{
  // do something here
}
else // optional
{
  // do something else here
}

for loop (repeat given number of times):

for (int i = 0; i < MAX; ++i)
{
  // do something here, you can use i
}

while loop (repeat while CONDITION holds):

while (CONDITION)
{
  // do something here
}

do while loop (same as while but CONDITION at the end):

do
{
  // do something here
} while (CONDITION);

function definition:

RETURN_TYPE myFunction (TYPE1 param1, TYPE2 param2, ...)
{ // return type can be void
  // do something here
}

See Also


coc

Code of Conduct

See also http://techrights.org/2019/04/23/code-of-coercion/.

Code of conduct (COC), also code of coercion, is a shitty invention of SJW fascists that's put up in projects (e.g. software) and which declares how developers of a specific project must behave socially (typically NOT just within the context of the development but also outside of it), generally pushing toxic woke concepts such as forced inclusivity, exclusivity of people with unapproved political opinions or use of politically correct language (newspeak). Sometimes a toxic COC hides under a different name such as social contract or mission statement, though not necessarily. COC is typically placed in the project repository as a CODE_OF_CONDUCT file. In practice COCs are used to establish dictatorship and allow things such as kicking people out of development because of their political opinions expressed anywhere, inside or outside the project, and to push political opinions through software projects.

LRS must never include any COC, with possible exceptions of anti-COC (such as NO COC) or parody style COCs, not because we dislike genuine inclusivity, but because we believe COCs are bullshit and mostly harmful as they support bullying, censorship and exclusion of people.

Anyway it's best to avoid any kind of COC file in the repository, it just takes up space and doesn't serve anything. We may simply ignore this shitty concept completely. You may argue why we don't ignore e.g. copyright in the same way and just not use any licenses? The situation with copyright is different: it exists by default, without a license file the code is proprietary and our neighbors don't have the legal safety to execute basic freedoms, they may be bullied by the state -- for this we are forced to include a license file to get rid of copyright. With COC there simply aren't any such implicit issues to be solved (because COCs are simply inventing their own issues), so we just don't try to solve non-issues.


coding

Coding

Coding nowadays means low quality attempt at programming, usually practiced by soydevs and barely qualified coding monkeys.

Traditionally it means encoding and decoding of information as in e.g. video coding -- this is the only non-gay meaning of the word.


collapse

Collapse

Collapse of our civilization is a concerning scenario in which basic structures of society relatively rapidly fall apart and cause unusually large, possibly world-wide horrors such as chaos, wars, famine and loss of advanced technology. It is something that will very likely happen very soon due to uncontrolled growth and societal decline caused by capitalism: we, the LRS, are especially focusing on a very probable technological collapse (caused by badly designed technology as well as its wrong application and extreme overuse causing dangerous dependencies), but of course clues pointing to collapse are coming from many directions (ecological, economical, political, natural disasters such as a coronal mass ejection etc.). Some have said that a society can deal with one crisis, but if multiple crises hit at once this hit may be fatal; however the dependence of current society on computer technology is so great that its collapse could be enough to deliver a fatal blow alone. Recently (around 2015) there has even appeared a specific term collapsology, referring to the study of the potential collapse.

There is a reddit community for discussing the collapse at https://reddit.net/r/collapse. WikiWikiWeb has a related discussion under ExtinctionOfHumanity.

Collapse of civilizations has been a repeated theme throughout history, it is nothing new or exceptional, see e.g. Maya empire collapse, Bronze age collapse, the fall of Rome etc. It usually comes when a civilization reaches high complexity and becomes "spoiled", morally corrupt and socially divided, which may also be "helped" by technological advancement (e.g. the Bronze age collapse is speculated to have been partially caused by the new technology of iron which has broken the old established structures) -- basically just what we are seeing today. Economic interdependence is especially dangerous, and since we are currently living under extreme form of capitalism, we are extremely subjected to this threat: everyone is hyperspecialized and so practically no one is self sufficient, people don't know how to make food, build homes, make tools, factories are unable to produce anything without dozens of other companies providing material, technology and services for them; everything depends on highly complex and extremely fragile production chains. Besides dependence on economy, an equal or even greater danger may be our absolute dependence on computer technology: nothing works without computers and Internet, even that which could and should: factories, traffic, governments, hospitals, even many basic tools ("smart" ones, "tools as a service", ...). It is extremely likely we'll sooner or later sustain a blow that will paralyze the Internet and computers, be it a natural disaster such as coronal mass ejection, an economic disaster (supply chains collapsing, ...), political disaster (war, cyber attacks, ...), unintentional or intentional "accident" ("oops, I just turned off all computers in the world that are running Windows" --Microsoft employee), simple unsustainability of maintenance of the exponentially growing complexity of computers or anything similar (AI taking over the internet? :]). Thinking about it deeper, it seems like a miracle we are still here.

In the technological world a lot of people are concerned with the collapse, notably the Collapse OS project, an operating system meant to run on simple hardware after the technological supply chain collapses and renders development of modern computers impossible. They believe the collapse will happen before 2030. The chip shortage, financial, climatic and energetic crises and the beginning of war in Europe as of the early 2020s are among the first warnings showing how fragile the system really is.

Ted Kaczynski (a famous primitivist mathematician who turned to murder to warn about the decline of society due to complex technology) has seen the collapse as a possible option. Internet bloggers/vloggers such as Luke Smith and no phone man advocate (and practice) simple, independent off-grid living, possibly to be prepared for such an event. Even proprietary normies like Jonathan Blow warn of a coming disaster (in his talk Preventing the Collapse of Civilization). Viznut is another programmer warning about the collapse.

The details of the collapse cannot of course be predicted exactly -- it may come in a relatively quick, violent form (e.g. in case of a disaster causing a blackout) or as a more agonizing slow death. The CollapseOS site talks about two stages of the slow collapse: the first one after the collapse of the supply chain, i.e. when the production of modern computers halts, and the second (decades after) when the last modern computer stops working. It most likely won't happen overnight -- that's a very extreme case. A typical collapse may take decades during which all aspects of society see a rapid decline. Of course, a collapse doesn't mean extinction of humans either, just deaths of many and great losses of what has been achieved culturally and technologically.

{ I've read a book called Blackout by Marc Elsberg whose story revolves around a fictional large collapse of power supply in Europe. A book called The World Without Us explores what the world would look like if humans suddenly disappeared. Also the podcast called Fall of Civilizations by Paul Cooper is awesome. ~drummyfish }

Live Documenting The Current Collapse Of Our Civilization

This section will record the current in-progress collapse of our civilization as seen by me, drummyfish. It may serve later generations and historians, as I know any information left behind is important for avoiding repeating the mistakes, though I also know you will not learn and will repeat the mistakes anyway. A man can dream I guess. If you can take away just one thing though, please be it this: FOR THE LOVE OF GOD, DO NOT EVER ALLOW COMPETITION TO BE THE BASE OF SOCIETY AGAIN.

Late 2022 Report

It seems like the collapse may have already begun. After the worldwide Covid pandemic the Russia-Ukraine war has begun with talks of nuclear war already going on. A great economic crisis has begun, possibly as a result of the pandemic and the war, inflation is skyrocketing and breaking all records, especially gas and energy prices are growing to extremes and as a result basically prices of everything go up as well. Russia isolated itself, a new cold war has begun. Many big banks have gone bankrupt. War immigrants from Ukraine are flooding into Europe and European fascists/nationalists seem to be losing their patience about it. People in European first world countries are now actually concerned about how not to freeze during the winter, this talk is all over TV and radio. The climate disaster has also started to show, e.g. in the Czech Republic there was the greatest forest fire in its history as well as an extremely hot summer, even tornados that destroyed some villages (tornados in this part of the world are basically unheard of), winters have almost no snow unlike some two decades ago. Everything is shitty, food costs more and is of much lower quality, as is basically everything else; newly bought technology cannot be expected to last longer than a few months. Society is spoiled to an unimaginable level, extreme hostility, competition and aggressive commerce is everywhere, kids are addicted to cellphones and toxic social media, mental health of the population rapidly deteriorates. Art such as movies and music is of extremely low quality, people hate every single new movie or video game that comes out. A neofascist party has won elections in Italy, in the Czech Republic all socialist parties were eliminated from the parliament: only capitalists rule now -- all social securities are being cancelled, people are getting poorer and poorer and forced to work more and to much higher ages. Ads are everywhere and amount to psychological torture. The situation now definitely seems extremely bad.

Late 2023 Report

Yep, the collapse is happening. All previously mentioned issues just deepen, though I stopped watching the news and just avoid the negative info to enjoy the last few years I have on this Earth without much stress. Russian-Ukraine war is still happily ongoing (despite all the predictions that Russia will soon run out of resources lol) AND there is a brand new war in Israel, new immigrants are gonna flood Europe, then USA is probably gonna invade the weakened countries or something. AI is currently breaking the Internet, Google became absolutely unusable which became noticeable even by normies now, it is just flooded by AI articles, news are an endless pile of AI generated nonsense. Technology is yet much worse than before, NOTHING now works without latest consumerist hardware, subscriptions and accounts, I started just buying old paper books and am thinking about abandoning computers altogether. USA culture is here like cancer, TV is littered SOLELY with political propaganda (SUPPORT MUH UKRAINE TAKE COVID VAX SUPPORT UR COUNTRYS ENCONOMMMMMMMMIE FUGHTTTTT RAACEEEEEESSSIIIIIIIIIIIIIISSSSAMMMM) taking turns with capitalist propaganda (ads, literally just people screaming BUY BUY BUY THIS FUCKING SHIT RIGHT NOW, BUY BUY BUYIT BUY IT BUUUUUUUUUY CONSOOOOOOOOOOOME THIIIIS CONSOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOMMMMMMMMMMMMMME). Watching it for 5 minutes literally makes you kill yourself. Hmm what else. NOTHING WORKS lol, you buy something, it is already broken, you pay for repair, they take money and return it unrepaired LMAO :D I am not kidding, it is literally the norm now, my bosses business is sinking due to all machines just breaking. Absolutely unqualified people now do all the jobs lol, there are literally teachers who CANNOT READ OR WRITE correctly, they just use Google to check how everything's spelled (I am NOT kidding, I have this from first hand sources). "Programmers" literally can't program, they just use AI. People are generally braindead, controlled by religions such as economy worship, productivity cult, women fascism (it has been officially declared now that women are physically stronger than man AND also smarter LMAO, every movie is obliged by law to include a scene where this is confirmed). I dunno man, this can't last much longer than a few years.

See Also


collision_detection

Collision Detection

Collision detection is an essential problem e.g. of simulating physics of mechanical bodies in physics engines (but also elsewhere), it tries to detect whether (and also how) geometric shapes overlap. Here we'll be talking about the collision detection in physics engines, but the problem appears in other contexts too (e.g. frustum culling in computer graphics). Collision detection potentially leads to so called collision resolution, a different stage that tries to deal with the detected collision (separate the bodies, update their velocities, make them "bounce off"). Physics engines are mostly divided into 2D and 3D ones so we also normally either talk about 2D or 3D collision detection (3D being, of course, a bit more complex).

There are two main types of collision detection: discrete, i.e. checking for collisions only at single points in time (we move the bodies, then test their overlaps), which is simpler but may miss collisions of fast moving bodies (so called tunneling), and continuous, i.e. considering the whole continuous movement of the bodies between time steps (e.g. by testing the volume a moving body sweeps), which is more precise and gives us the exact times of collisions, but is also more complex and expensive.

Collision detection is non-trivial because we need to detect not only the presence of the collision but also its parameters, which are typically the exact point of collision, collision depth and collision normal -- these are needed for subsequently resolving the collision (typically the bodies will be shifted along the normal by the collision depth to become separated and impulses will be applied at the collision point to update their velocities). We also need to detect general cases, i.e. collisions of whole volumes (imagine e.g. a tiny cuboid inside an arbitrarily rotated bigger cone). This is very hard and/or expensive for some complex shapes such as general 3D triangle meshes (which is why we approximate them with simpler shapes). We also want the detection algorithm to be at least reasonably fast -- for this reason collision detection mostly happens in two phases: the broad phase, which quickly (e.g. using bounding volumes or space partitioning structures) discards pairs of bodies that certainly cannot collide, and the narrow phase, which precisely computes the collisions of the remaining pairs.

In many cases it is also important to correctly detect the order of collisions -- it may well happen a body collides not with one but with multiple bodies at the time of collision detection and the computed behavior may vary widely depending on the order in which we consider them. Imagine that body A is colliding with body B and body C at the same time; in real life A may have first collided with B and be deflected so that it would have never hit C, or the other way around, or it might have collided with both. In continuous collision detection we know the order as we also have exact time coordinate of each collision (even though the detection itself is still computed at discrete time steps), i.e. we know which one happened first. With discrete collisions we may use heuristics such as the direction in which the bodies are moving, but this may fail in certain cases (consider e.g. collisions due to rotations).

On shapes: the general rule is that mathematically simpler shapes are better for collision detection. Spheres (or circles in 2D) are the best, they are stupidly simple -- a collision of two spheres is simply decided by their distance (i.e. whether the distance of their center points is less than the sum of the radii of the spheres), which also determines the collision depth, and the collision normal is always aligned with the vector pointing from one sphere center to the other. So if you can, use spheres -- it is even worth using multiple spheres to approximate more complex shapes if possible. Capsules ("extruded spheres"), infinite planes, half-planes, infinite cylinders (distance from a line) and axis-aligned boxes are also pretty simple. Cylinders and cuboids with arbitrary rotation are a bit harder. Triangle meshes (the shape most commonly used for real-time 3D models) are very difficult but may be approximated e.g. by a convex hull which is manageable (a convex hull is an intersection of a number of half-spaces) -- if we really want to precisely collide full 3D meshes, we may split each one into several convex hulls (but we need to write the non-trivial splitting algorithm of course). Also note that you need to write a detection algorithm for any possible pair of shape types you want to support, so for N supported shapes you'll need N * (N + 1) / 2 detection algorithms.
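
To demonstrate how simple spheres are, here is a minimal C sketch of a 3D sphere-sphere test that also computes the collision depth and normal (the function name and interface are made up just for this example):

#include <math.h>

// returns 1 if the spheres collide, fills collision depth and normal
int spheresCollide(
  float x1, float y1, float z1, float r1, // center and radius of sphere 1
  float x2, float y2, float z2, float r2, // center and radius of sphere 2
  float *depth, float *nx, float *ny, float *nz)
{
  float dx = x2 - x1, dy = y2 - y1, dz = z2 - z1;
  float d = sqrtf(dx * dx + dy * dy + dz * dz); // distance of the centers

  if (d >= r1 + r2) // further apart than the sum of radii => no collision
    return 0;

  *depth = r1 + r2 - d; // how deep the spheres overlap

  if (d > 0) // collision normal: unit vector from center 1 to center 2
  {
    *nx = dx / d;
    *ny = dy / d;
    *nz = dz / d;
  }
  else // centers exactly coincide, just pick some arbitrary normal
  {
    *nx = 1;
    *ny = 0;
    *nz = 0;
  }

  return 1;
}

In practice we would rather compare squared distances first so that the square root only ever gets computed when the bodies actually collide.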

{ In theory we may in some cases also think about using iterative/numerical methods to find collisions, i.e. starting at some point between the bodies and somehow stepping towards their intersection until we're close enough. Another idea I had was to use signed distance functions for representing static environments, I kind of implemented it in tinyphysicsengine. ~drummyfish }

TODO: some actual algorithms


collision

Collision

Collision, sometimes also conflict, happens when two or more things want to occupy the same spot. This situation usually needs to be addressed somehow; then we talk about collision resolution. In programming there are different types of collisions, for example:


color

Color

Color (also colour, from Latin celare, "to cover") is the perceived visual quality of light that's associated with its wavelength/frequency (or a mixture of several); for example red, blue and yellow are colors. Electromagnetic waves with wavelength from about 380 to 750 nm (about 400 to 790 THz) form the visible spectrum, i.e. waves our eyes can see -- combining such waves with different intensities and letting them fall on the retina of our eyes gives rise to the perception of color in our brain. There is a hugely deep color theory concerned with the concept of color (its definition, description, reproduction, psychological effect etc.). Needless to say colors are extremely important in anything related to visual information such as art, computer graphics, astrophysics, various visualizations or just everyday perception of our world. The term color is sometimes used in contrast with systems that are extremely limited in the number of colors they can handle, which may be called monochromatic, 1bit (distinguishing only two colors), black&white or grayscale. Color can be seen to be in the same relation to sight as pitch is to hearing.

How many colors are there? The number of colors humans can distinguish is of course individual (color blindness makes people see fewer colors but there are also conditions that make one see more colors), but various sources state we are able to distinguish millions or even over 10 million different colors on average. In computer technology we talk about color depth, the number of bits we use to represent color -- the more bits, the more colors we can represent. 24 bits are nowadays mostly used to record color (8 bits for each of the red, green and blue components, so called true color), which allows for 16777216 distinct colors, though even something like 16 bits (65536 colors) is mostly enough for many use cases. Some advanced systems however support many more colors than true color, especially extremely bright and dim ones -- see HDR.
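
For illustration, here is a small C sketch of how a 24bit true color value is commonly packed into a single integer (the 0xRRGGBB byte order shown here is common but not universal, and the helper names are made up):

// pack 8bit red, green and blue components into one 24bit 0xRRGGBB value
unsigned int colorPack(unsigned char r, unsigned char g, unsigned char b)
{
  return (((unsigned int) r) << 16) | (((unsigned int) g) << 8) | b;
}

// extract the components back out
unsigned char colorRed(unsigned int color)   { return (color >> 16) & 0xff; }
unsigned char colorGreen(unsigned int color) { return (color >> 8) & 0xff;  }
unsigned char colorBlue(unsigned int color)  { return color & 0xff;         }

E.g. colorPack(255,0,0) gives 0xff0000, pure red.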

What gives physical objects their color? Most everyday objects get their color from reflecting only specific parts of the white light (usually sunlight), while absorbing the rest of the spectrum, i.e. for example a white object reflects all incoming light, a black one absorbs all incoming light (that's why black things get hot in sunlight), a red one reflects the red light and absorbs the rest etc. This is determined by the qualities of the object's surface, such as the structure of its atoms or its microscopic geometry.

TODO

What Is Color?

This is actually a non-trivial question, or rather there exist many varying definitions of color; furthermore it is a matter of subjective experience, perception of colors may differ between people. When asking what color really is, consider the following:


combinatorics

Combinatorics

Combinatorics is an area of math that's basically concerned with counting possibilities. As such it is very related to probability theory (as probability is typically defined in terms of ratios of possible outcomes). It explores things such as permutations and combinations, i.e. questions such as how many ways there are to order N objects or how many ways there are to choose k objects from a set of N objects.

The two basic quantities we define in combinatorics are permutations and combinations.

Permutation (in a simple form) of a set of objects (let's say A, B and C) is one possible ordering of such a set (i.e. ABC, ACB, BAC etc.). I.e. here by permutation of a number n, which we'll write as P(n), we mean the number of possible orderings of a set of size n. So for example P(1) = 1 because there is only one way to order a set containing one item. Similarly P(3) = 6 because there are six ways to order a set of three objects (ABC, ACB, BAC, BCA, CAB, CBA). P(n) is computed very simply: it is the factorial of n, i.e. P(n) = n!.

Combination (without repetition) of a set of objects says in how many ways we can select given number of objects from that set (e.g. if there are 4 shirts in a drawer and we want to choose 2, how many possibilities are there?). I.e. given a set of certain size a combination tells us the number of possible subsets of certain size. I.e. there are two parameters of a combination, one is the size of the set, n, and the other is the number of items (the size of the subset) we want to select from that set, k. This is written as nCk, C(n,k) or

 / n \
|     |
 \ k /

A combination is computed as C(n,k) = n! / (k! * (n - k)!). E.g. having a drawer with 4 shirts (A, B, C and D) and wanting to select 2 gives us C(4,2) = 4! / (2! * (4 - 2)!) = 6 possibilities (AB, AC, AD, BC, BD, CD).

Furthermore we can define combinations with repetitions in which we allow ourselves to select the same item from the set more than once (note that the selection order still doesn't matter). I.e. while combinations without repetition give us the number of possible subsets, a combination WITH repetitions gives us the number of possible multisubsets of a given set. A combination with repetitions is computed as Cr(n,k) = C(n + k - 1,k). E.g. having a drawer with 4 shirts and wanting to select 2 WITH the possibility to choose one shirt multiple times gives us Cr(4,2) = C(5,2) = 5! / (2! * (5 - 2)!) = 10 possibilities (AA, AB, AC, AD, BB, BC, BD, CC, CD, DD).

Furthermore if we take combinations and say that order matters, we get generalized permutations that also take two parameters, n and k, and there are two kinds: without and with repetitions. I.e. permutations without repetitions tell us in how many ways we can choose k items from n items when ORDER MATTERS, and are computed as P(n,k) = n!/(n - k)! (e.g. P(4,2) = 4!/(4 - 2)! = 12, AB, AC, AD, BA, BC, BD, CA, CB, CD, DA, DB, DC). Permutations with repetitions tell us the same thing but we are allowed to select the same thing multiple times; they are computed as Pr(n,k) = n^k (e.g. Pr(4,2) = 4^2 = 16, AA, AB, AC, AD, BA, BB, BC, BD, CA, CB, CC, CD, DA, DB, DC, DD).

To sum up:

quantity order matters? repetition allowed? formula
permutation (simple) yes - P(n) = n!
permutation without rep. yes no P(n,k) = n!/(n - k)!
permutation with rep. yes yes Pr(n,k) = n^k
combination without rep. no no C(n,k) = n! / (k! * (n - k)!)
combination with rep. no yes Cr(n,k) = C(n + k - 1,k)

Here is an example of applying all the measures to a three item set ABC (note that selecting nothing from a set counts as 1 possibility, NOT 0):

quantity possibilities (for set ABC) count
P(3) ABC ACB BAC BCA CAB CBA 3! = 6
P(3,0) 3!/(3 - 0)! = 1
P(3,1) A B C 3!/(3 - 1)! = 3
P(3,2) AB AC BA BC CA CB 3!/(3 - 2)! = 6
P(3,3) ABC ACB BAC BCA CAB CBA 3!/(3 - 3)! = 6
Pr(3,0) 3^0 = 1
Pr(3,1) A B C 3^1 = 3
Pr(3,2) AA AB AC BA BB BC CA CB CC 3^2 = 9
Pr(3,3) AAA AAB AAC ABA ABB ABC ACA ACB ACC ... 3^3 = 27
C(3,0) 3!/(0! * (3 - 0)!) = 1
C(3,1) A B C 3!/(1! * (3 - 1)!) = 3
C(3,2) AB AC BC 3!/(2! * (3 - 2)!) = 3
C(3,3) ABC 3!/(3! * (3 - 3)!) = 1
Cr(3,0) C(3 + 0 - 1,0) = 1
Cr(3,1) A B C C(3 + 1 - 1,1) = 3
Cr(3,2) AA AB AC BB BC CC C(3 + 2 - 1,2) = 6
Cr(3,3) AAA AAB AAC ABB ABC ACC BBB BBC BCC CCC C(3 + 3 - 1,3) = 10
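
The formulas are easy to put into code; the following C program (just a sketch, the short function names are made up here and the factorials overflow for larger values of n) computes all the quantities from the table above:

#include <stdio.h>

unsigned long factorial(unsigned int n)
{
  unsigned long result = 1;

  while (n > 1)
    result *= n--;

  return result;
}

// simple permutations: n!
unsigned long P(unsigned int n) { return factorial(n); }

// permutations without repetition: n! / (n - k)!
unsigned long Pk(unsigned int n, unsigned int k)
{
  return factorial(n) / factorial(n - k);
}

// permutations with repetition: n^k
unsigned long Pr(unsigned int n, unsigned int k)
{
  unsigned long result = 1;

  while (k--)
    result *= n;

  return result;
}

// combinations without repetition: n! / (k! * (n - k)!)
unsigned long C(unsigned int n, unsigned int k)
{
  return factorial(n) / (factorial(k) * factorial(n - k));
}

// combinations with repetition: C(n + k - 1,k)
unsigned long Cr(unsigned int n, unsigned int k)
{
  return C(n + k - 1,k);
}

int main(void)
{
  // prints 6 6 9 3 6, agreeing with the table above
  printf("%lu %lu %lu %lu %lu\n",P(3),Pk(3,2),Pr(3,2),C(3,2),Cr(3,2));
  return 0;
}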

comment

Comment

Comment is part of computer code that doesn't affect how the code is interpreted by the computer and is intended to hold information for humans that read the code (even though comments can sometimes contain additional information for computers such as metadata and autodocumentation information). There are comments in basically all programming languages, they usually start with //, #, /* and similar symbols, sometimes parts of code that don't fit the language syntax are ignored and as such can be used for comments.
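
For example comments in C look like this:

// this is a line comment, everything up to the end of the line is ignored

/* this is a block comment,
   it can span multiple lines */

int main(void)
{
  int apples = 2 + 3; // line comments may also follow code

  return 0; /* and so may block comments */
}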

Even though you should write nice, self documenting code, you should comment your source code as well. General tips on commenting:


communism

Communism

Communism (from communis -- common, shared) is a very wide term which most generally stands for the idea that sharing and equality should be the basic values and principles of a society; as such it is a leftist idea which falls under socialism (i.e. basically focusing on people at large). There are very many branches, theories, political ideologies and schools of thought somewhat based on communism, for example Marxism, Leninism, anarcho communism, primitive communism, Christian communism, Buddhist communism etc. -- of course, some of these are good while others are evil and only abuse the word communism as a kind of brand (as also happens e.g. with anarchism). Sadly after the disastrous failure of the violent pseudocommunist revolutions of the 20th century, most people came to equate the word communism with oppressive militant regimes, however we have to stress that communism is NOT equal to USSR, Marxism-Leninism, Stalinism or any other form of pseudocommunism; on the contrary, such regimes were rather hierarchical, violent and pseudocommunist, often downright fascist. We ourselves embrace true communism and base our LRS and less retarded society on ideas of unconditional sharing. Yes, large communist societies have existed and worked, for example the Inca empire worked without money and provided FREE food, clothes, houses, health care, education and other products of collective work to everyone, according to his needs. Many other communities also work on more or less communist principles, see e.g. the Jewish kibbutz, Sikhist langar, free software, or even just most families for that matter. The color red is usually associated with communism and the "hammer and sickle" (U+262D) is taken as its symbol, though that's mostly associated with the evil communist regimes and so its usage should be carefully considered.

Common ideas usually associated with communism are (please keep in mind that this may differ depending on the specific flavor of communism):

TODO

See Also


competition

Competition

Competition is a situation of conflict in which several entities try to overpower or otherwise win over each other. It is the opposite of collaboration. Competition is connected to pursuing self interest.

Competition is the absolute root cause of most evil in society. Society must never be based on competition. Unfortunately our society has decided to do the exact opposite with capitalism, the glorification of competition -- this will very likely lead to the destruction of our society, possibly even to the destruction of all life.

Competition is to society what a drug is to an individual: competition makes a situation become better quickly and start achieving technological "progress" but for the price of things going downwards from then on, competition quickly degenerates and kills other values in society such as altruism and morality; society that decides to make unnaturally fast "progress" and base itself on competition is equivalent to someone deciding to take steroids to grow muscles quickly -- corporations that arise in technologically advanced society take over the world just like muscle cancer that grows from taking steroids. A little bit of competition can be helpful in small doses just as painkillers can on occasion help lower suffering of an individual, but one has to be extremely careful to not take too many of them... even smoking a joint from time to time can have a positive effect, however with capitalism our society has become someone who has started to take heroin and only live for that drug alone, take as much of it as he can. Invention of bullshit jobs just to keep competition running, extreme growing hostility of people, productivity cults, overworking, wage slavery, extreme waste that's destroying our environment, all of these are signs our society is dying from overdose, living from day to day, trying to get a few bucks for the next dose of its drug.

Is all competition bad? Competition is not bad as a concept, it may for example be used in genetic programming to evolve good computer programs. People also have a NEED for at least a bit of competition as this need was necessary to survive in the past -- this need has to be satisfied, so we create artificial, mostly harmless competition e.g. with games and sports. This kind of competition is not so bad as long as we are aware of the dangers of overapplying it. What IS bad is making competition the basis of a society, in a good society people must never compete for basic needs such as food, shelter or health care. Furthermore after sufficient technological progress, competition is no longer just a bad basis for society, it becomes a fatal one because society gains means for complete annihilation of all life such as nuclear weapons or factories poisoning our environment that in the heat of competition will sooner or later destroy the society. I.e. in a technologically advanced society it is necessary to give up competition so as to prevent own destruction.

Why is competition so prevalent if it is so bad? Because it is natural and it has been with us since life first arose. It is extremely hard to let go of such a basic instinct but it has to be done not only because competition has become obsolete and is now only artificially sustaining suffering without bringing in any benefits (we, humans, have basically already won the evolution), but because, as has been said, sustaining competition is now fatal.

How to achieve letting go of competition in society? The only way is a voluntary choice achieved through our intellect, i.e. through education. Competition is something we naturally want to do, but we can rationally decide not to do it once we see and understand it is bad -- such behavior is already occurring, for example if we know someone is infected with a sexually transmitted disease, we rationally overcome the strong natural instinct to have sex with him.


compression

Compression

Compression means encoding data (such as images or texts) in a different way so that the data takes less space (memory) while keeping all the important information, or, in plain terms, it usually means "making files smaller". Compression is pretty important so that we can utilize memory or bandwidth well -- without it our hard drives would be able to store just a handful of videos, internet would be slow as hell due to the gigantic amount of transferred data and our RAM wouldn't suffice for things we normally do. There are many algorithms for compressing various kinds of data, differing by their complexity, performance, efficiency of compression etc. The reverse process to compression (getting the original data back from the compressed data) is called decompression. The ratio of the compressed data size to the original data size is called compression ratio (the lower, the better). The science of data compression is truly huge and complicated AF, here we'll just mention some very basics. Also watch out: compression algorithms are often a patent mine field.

{ I've now written a tiny LRS compression library/utility called shitpress, check it out at https://codeberg.org/drummyfish/shitpress. It's fewer than 200 LOC, so simple it can nicely serve educational purposes. The principle is simple, kind of a dictionary method, where the dictionary is simply the latest output 64 characters; if we find a long word that occurred recently, we simply reference it with mere 2 bytes. It works relatively well for most data! ~drummyfish }

{ There is a cool compression competition known as Hutter Prize that offers 500000 euros to anyone who can break the current record for compressing Wikipedia. Currently the record is at compressing 1GB down to 115MB. See http://prize.hutter1.net for more. ~drummyfish }

{ LMAO retard patents are being granted on impossible compression algorithms, see e.g. http://gailly.net/05533051.html. See also Sloot Digital Coding System, a miraculous compression algorithm that "could store a whole movie in 8 KB" lol. ~drummyfish }

Let's keep in mind compression is not applied just to files on hard drives, it can also be used e.g. in RAM to utilize it more efficiently.

Why don't we compress everything? Firstly because compressed data is slow to work with, it requires significant CPU time to compress and decompress data, it's a kind of space-time tradeoff (we gain more storage space for the cost of CPU time). Secondly compressed data is more prone to corruption because redundant information (which can help restoring corrupted data) is removed from it -- in fact we sometimes purposefully do the opposite of compression and make our data bigger to protect it from corruption (see e.g. error correcting codes, RAID etc.). And last but not least, many kinds of data can hardly be compressed or are so small that compressing them is not even worth it.

The basic division of compression methods is to lossless compression (the original data can be reconstructed exactly, we only remove redundancy) and lossy compression (we additionally throw away some less important information so as to compress more, meaning the original data can no longer be restored perfectly).

Furthermore we may divide compression e.g. to offline (compresses a whole file, may take long) and streaming (compressing a stream of input data on-the-go and in real-time), by the type of input data (binary, text, audio, ...), basic principle (RLE, dictionary, "AI", ...) etc.

The following is an example of how well different types of compression work for an image (screenshot of main page of Wikimedia Commons, 1280x800):

{ Though the website screenshot contained also real life photos, it still contained a lot of constant color areas which can be compressed very well, hence quite good compression ratios here. A general photo won't be compressed as much. ~drummyfish }

compression ~size (KB) ratio
none 3000 1
general lossless (lz4) 396 0.132
image lossless (PNG) 300 0.1
image lossy (JPG), nearly indistinguishable quality 164 0.054
image lossy (JPG), ugly but readable 56 0.018

Mathematically there cannot exist a lossless compression algorithm that would always reduce the size of any input data -- if it existed, we could just repeatedly apply it and compress ANY data to zero bytes. And not only that -- every lossless compression will inevitably enlarge some input files. This is also mathematically given -- we can see compression as simply mapping input binary sequences to output (compressed) binary sequences, while such a mapping has to be one-to-one (injective); it can be simply shown that if we make any such mapping that reduces the size of some input (maps a longer sequence to a shorter one, i.e. compresses it), we will also have to map some short code to a longer one. However we can make it so that our compression algorithm enlarges a file at most by 1 bit: we can say that the first bit in the compressed data says whether the following data is compressed or not; if our algorithm fails to reduce the size of the input, it simply sets the bit to say so and leaves the original file uncompressed (in practice many algorithms don't do this though as they try to work as streaming filters, without random access to data, which would be needed here).

Dude, how does compression really work tho? The basic principle of lossless compression is removing redundancy (correlations in the data), i.e. that which is explicitly stored in the original data but doesn't really have to be there because it can be reasoned out from the remaining data. This is why completely random noise can't be compressed -- there is no correlated data in it, nothing to reason out from other parts of the data. However human language for example contains many redundancies. Imagine we are trying to compress English text and have a word such as "computer" on the input -- we can really just shorten it to "computr" and it's still pretty clear the word is meant to be "computer" as there is no other similar English word (we also see that a compression algorithm is always specific to the type of data we expect on the input -- we have to know what kind of input data we can expect). Another way to remove redundancy is to e.g. convert a string such as "HELLOHELLOHELLOHELLOHELLO" to "5xHELLO". Lossy compression on the other hand tries to decide what information is of low importance and can be dropped -- for example a lossy compression of text might discard information about case (upper vs lower case) to be able to store each character with fewer bits; an all caps text is still readable, though less comfortably.

OK, but how much can we really compress? Well, as stated above, there can never be anything such as a universal uber compression algorithm that just makes any input file super small -- everything really depends on the nature of the data we are trying to compress. The more we know about the nature of the input data, the more we can compress, so a general compression program will compress only a little, while an image-specialized compression program will compress better (but will only work with images). As an extreme example, consider that in theory we can make e.g. an algorithm that compresses one specific 100GB video to 1 bit (we just define that a bit "1" decompresses to this specific video), but it will only work for that one single video, not for video in general -- i.e. we made an extremely specialized compression and got an extremely good compression ratio, however due to such extreme specialization we can almost never use it. As said, we just cannot compress completely random data at all (as we don't know anything about the nature of such data). On the other hand data with a lot of redundancy, such as video, can be compressed A LOT. Similarly video compression algorithms used in practice work only for videos that appear in the real world which exhibit certain patterns, such as two consecutive frames being very similar -- if we try to compress e.g. static (white noise), video codecs just shit themselves trying to compress it (look up e.g. videos of confetti and see how blocky they get). All in all, some compression benchmarks can be found e.g. at https://web.archive.org/web/20110203152015/http://www.maximumcompression.com/index.html -- the following are mentioned types of data and their best measured compression ratios: English text 0.12, image (lossy) 0.76, executable 0.24.

Methods

The following is an overview of some most common compression techniques.

Lossless

RLE (run length encoding) is a simple method that stores repeated sequences just as one element of the sequence and number of repetitions, i.e. for example "abcabcabc" as "3abc".

Entropy coding is another common technique which counts the frequencies (probabilities) of symbols on the input and then assigns the shortest codes to the most frequent symbols, leaving longer codes to the less frequent. The most common such codings are Huffman coding and Arithmetic coding.

Dictionary (substitutional) methods try to construct a dictionary of relatively long symbols appearing in the input and then only store short references to these symbols. The format may for example choose to first store the dictionary and then the actual data with pointers to this dictionary, or it may just store the data in which pointers are stored to previously appearing sequences.

Predictor compression is based on making a predictor that tries to guess following data from previous values (which can be done e.g. in case of pictures, sound or text) and then only storing the difference against such a predicted result. If the predictor is good, we may only store the small amount of the errors it makes.
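
To illustrate the principle, here is a tiny C sketch (function names made up) of the simplest possible predictor, one that always predicts the next byte to be the same as the previous one, which gives us simple delta encoding -- for slowly changing data (e.g. audio samples) the output consists mostly of small values such as 0 and 1 which subsequent entropy coding compresses very well:

// encode: store each byte as the difference against the previous byte
void deltaEncode(const unsigned char *in, unsigned char *out, unsigned int len)
{
  unsigned char previous = 0;

  for (unsigned int i = 0; i < len; ++i)
  {
    out[i] = in[i] - previous; // difference against the prediction
    previous = in[i];
  }
}

// decode: sum the differences back up to restore the original bytes
void deltaDecode(const unsigned char *in, unsigned char *out, unsigned int len)
{
  unsigned char previous = 0;

  for (unsigned int i = 0; i < len; ++i)
  {
    previous += in[i]; // unsigned char wraps modulo 256, so this is lossless
    out[i] = previous;
  }
}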

A famous family of dictionary compression algorithms are the Lempel-Ziv (LZ) algorithms -- these two guys first proposed LZ77 (1977, sliding window) and later LZ78 (1978, explicitly stored dictionary). These were a basis for improved/remixed algorithms, most notably LZW (1984, Welch). Additionally these algorithms are used and combined in other algorithms and formats, most notably gif and DEFLATE (used e.g. in gzip and png).

An approach similar to the predictor may be trying to find some general mathematical model of the data and then just find and store the right parameters of the model. This may for example mean vectorizing a bitmap image, i.e. finding geometrical shapes in an image composed of pixels and then only storing the parameters of the shapes -- of course this may not be 100% accurate, but again if we want to preserve the data accurately, we may additionally also store the small amount of errors the model makes. Similar approach is used in vocoders used in cellphones that try to mathematically model human speech (however here the compression is lossy), or in fractal compression of images. A nice feature we gain here is the ability to actually "increase the resolution" (or rather generate detail) of the original data -- once we fit a model onto our data, we may use it to tell us values that are not actually present in the data (i.e. we get a fancy interpolation/extrapolation).

Another property of data to exploit may be its sparsity -- if for example we have a huge image that's prevalently white, we may say white is the implicit color and we only somehow store the pixels of other colors.

Some more wild techniques may include genetic programming that tries to evolve a small program that reproduces the input data, or using "AI" in whatever way to compress the data (in fact compression is an essential part of many neural networks as it forces the network to "understand", make sense of the data -- many neural networks therefore internally compress and decompress the data so as to filter out the unimportant information).

Note that many of these methods may be combined or applied repeatedly as long as we are getting smaller results.

Furthermore also take a look at procedural generation, a related technique that allows embedding a practically infinite amount of content with only a quite small amount of code.

Lossy

In lossy compression we generally try to limit information that is not very important and/or to which we aren't very sensitive, typically by dropping precision by quantization, i.e. basically lowering the number of bits we use to store the "not so important" information -- in some cases we may just drop some information altogether (decrease precision to zero). Furthermore we finally also apply lossless compression to make the result even smaller.
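
For example a crude quantization of 8bit values (256 levels) down to 4 bits (16 levels) might look like this in C (a sketch):

// quantize: keep only the 4 highest bits, i.e. 16 levels remain
unsigned char quantize(unsigned char value)
{
  return value >> 4;
}

// dequantize: map the 4bit value back onto the full 0..255 range
unsigned char dequantize(unsigned char value)
{
  return value * 17; // 17 == 255 / 15, so 0 maps to 0 and 15 to 255
}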

For images we usually exploit the fact that human sight is less sensitive to certain visual information, such as specific frequencies, colors, brightness etc. Common methods used here are:

In video compression we may reuse the ideas from image compression and further employ exploiting temporal redundancy, i.e. the fact that consecutive video frames look similar, so we may only encode some kind of delta (change) against the previous (or even next) frame. The most common way is to fully record only one key frame in some time span (so called I-frame, further compressed with image compression methods), then divide it to small blocks and estimate the movement of those blocks so that they approximately make up the following frames -- we then record only the motion vectors of the blocks. This is why videos look "blocky". In the past interlacing was also used -- only half of each frame was recorded, every other row was dropped; when playing, the frame was interlaced with the previous frame.

In audio we usually outright remove frequencies that humans can't hear (usually said to be above 20 kHz), for this we again convert the audio from the time domain to the frequency domain (using e.g. Fourier transform). Furthermore it is very inefficient to store sample values directly -- we rather use so called differential PCM, a lossless compression that e.g. stores each sample as a difference against the previous sample (which is usually small and doesn't use up many bits). This can be improved by a predictor, which tries to predict the next values from previous values and then we only save the difference against this prediction. Joint stereo coding exploits the fact that human hearing is not so sensitive to the direction of the sound and so e.g. instead of recording both left and right stereo channels in full quality rather records the sum of both and a ratio between them (which can get away with fewer bits). Psychoacoustics studies how humans perceive sound, for example so called masking: certain frequencies may for example mask nearby (both in frequency and time) frequencies (make them unhearable for humans) so we can drop them. See also vocoders.

TODO: LZW, DEFLATE etc.

Compression Programs/Utils/Standards

Here is a list of some common compression programs/utilities/standards/formats/etc:

util/format extensions free? media lossless? notes
bzip2 .bz2 yes general yes Burrows–Wheeler alg.
flac .flac yes audio yes super free lossless audio format
gif .gif now yes image/anim. no limited color palette, patents expired
gzexe - yes executable bin. yes makes self-extracting executable
gzip .gz yes general yes by GNU, DEFLATE, LZ77, mostly used by Unices
jpeg .jpg, .jpeg yes? raster image no common lossy format, under patent fire
lz4 .lz4 yes general yes high compression/decompression speed, LZ77
mp3 .mp3 now yes audio no popular audio format, patents expired
png .png yes raster image yes popular lossless image format, transparency
rar .rar NO general yes popular among normies, PROPRIETARY
vorbis .ogg yes audio no was a free alternative to mp3, used with ogg
zip .zip yes? general yes along with encryption may be patented
7-zip .7z yes general yes more complex archiver

Code Example

Let's write a simple lossless compression utility in C. It will work on binary files and we will use the simplest RLE method, i.e. our program will just shorten continuous sequences of repeating bytes to a short sequence saying "repeat this byte N times". Note that this is very primitive (a small improvement might be actually done by looking for sequences of longer words, not just single bytes), but it somewhat works for many files and demonstrates the basics.

The compression will work like this: we choose one byte value to serve as a special marker (in the code below it's 0xF3, chosen in hope that it won't appear in typical input too often); a sequence of more than 3 repeating bytes will be written out as 3 bytes: the marker value, the number of repetitions minus 4 and the repeated byte value (runs longer than about 200 bytes are simply split into multiple such records). Shorter sequences are copied to the output unchanged. If the marker value itself appears in the input, we encode it as 2 bytes: the marker value followed by 0xFF (note that the repetition count byte can never take the value 0xFF, so there is no ambiguity).

Decompression is then quite simple -- we simply output what we read, unless we read the marker value; in such case we look whether the following value is 0xFF (then we output the marker value), else we know we have to repeat the next character this many times plus 4.

For example given input bytes

0x11 0x00 0x00 0xAA 0xBB 0xBB 0xBB 0xBB 0xBB 0xBB 0x10 0xF3 0x00
                    \___________________________/      \__/
                       long repeating sequence        marker!

Our algorithm will output a compressed sequence

0x11 0x00 0x00 0xAA 0xF3 0x02 0xBB 0x10 0xF3 0xFF 0x00
                    \____________/      \_______/
                    compressed seq.   encoded marker

Notice that, as stated above in the article, there inevitably exists a "danger" of actually enlarging some files. This can happen if the file contains no sequences that we can compress and at the same time there appear the marker values which actually get expanded (from 1 byte to 2).

The nice property of our algorithm is that both compression and decompression can be streaming, i.e. both can be done in a single pass as a filter, without having to load the file into memory or randomly access bytes in files. Also the memory complexity of this algorithm is constant (RAM usage will be the same for any size of the file) and time complexity is linear (i.e. the algorithm is "very fast").

Here is the actual code of this utility (it reads from stdin and outputs to stdout, a flag -x is used to set decompression mode, otherwise it is compressing):

#include <stdio.h>

#define SPECIAL_VAL 0xf3 // random value, hopefully not very common

void compress(void)
{
  unsigned char prevChar = 0;
  unsigned int  seqLen = 0;
  unsigned char end = 0;

  while (!end)
  {
    int c = getchar();

    if (c == EOF)
      end = 1;

    if (c != prevChar || c == SPECIAL_VAL || end || seqLen > 200)
    { // dump the sequence
      if (seqLen > 3)
        printf("%c%c%c",SPECIAL_VAL,seqLen - 4,prevChar);
      else
        for (int i = 0; i < seqLen; ++i)
          putchar(prevChar);

      seqLen = 0;
    }

    prevChar = c;
    seqLen++;

    if (c == SPECIAL_VAL)
    {
      // this is how we encode the special value appearing in the input
      putchar(SPECIAL_VAL);
      putchar(0xff);
      seqLen = 0;
    }
  }
}

void decompress(void)
{

  while (1)
  {
    int c = getchar();

    if (c == EOF)
      break;

    if (c == SPECIAL_VAL)
    {
      unsigned int seqLen = getchar();

      if (seqLen == 0xff)
        putchar(SPECIAL_VAL);
      else
      {
        c = getchar();

        for (int i = 0; i < seqLen + 4; ++i)
          putchar(c);
      }
    }
    else
      putchar(c);
  }
}

int main(int argc, char **argv)
{
  if (argc > 1 && argv[1][0] == '-' && argv[1][1] == 'x' && argv[1][2] == 0)
    decompress();
  else
    compress();

  return 0;
}

How well does this perform? If we try to let the utility compress its own source code, we get to 1242 bytes from the original 1344, which is not so great -- the compression ratio is only about 0.92 here. We can see why: the only repeating bytes in the source code are the space characters used for indentation -- this is the only thing our primitive algorithm manages to compress. However if we let the program compress its own binary version, we get much better results (at least on the computer this was tested on): the original binary has 16768 bytes while the compressed one has 5084 bytes, which is an EXCELLENT compression ratio of about 0.3! Yay :-)

See Also


compsci

Computer Science

Computer science, abbreviated as "compsci", is (surprise-surprise) a science studying computers. The term is pretty wide, a lot of it covers very formal and theoretical areas that neighbor and overlap with mathematics, such as formal languages, cryptography and machine learning, but also more practical/applied and "softer" disciplines such as software engineering, programming, hardware, computer networks or even user interface design. This science deals with such things as algorithms, data structures, artificial intelligence and information theory. The field has become quite popular and rapidly growing after the coming of the 21st century computer/Internet revolution and it has also become quite spoiled and abused by its sudden lucrativeness.

Overview

Notable fields of computer science include:

Computer science also figures in interdisciplinary endeavors such as bioinformatics and robotics.

In the industry there have arisen fields of art and study that probably shouldn't be included in computer science itself, but are very close to it. These may include e.g. web design (well, let's include it for the sake of completeness), game design, system administration etc.


computer

Computer

The word computer can be defined in many ways and can also take many different meanings; a somewhat common definition may be this: computer is a machine that automatically performs mathematical computations. We can also see it as a machine for processing information or, very generally, as any tool that helps computation, in which case one's fingers or even a mathematical formula itself can be considered a computer. Here we are of course mostly concerned with electronic digital computers.

We can divide computers based on many attributes, e.g.:

Computers are theoretically studied by computer science. The kind of computer we normally talk about consists of two main parts: hardware, the physical parts of the machine, and software, the programs and data the hardware runs and processes.

The power of computers is limited, Alan Turing mathematically proved that there exist problems that can never be completely solved by any algorithm, i.e. there are problems a computer (including our brain) will never be able to solve (even if a solution exists). This is related to the fact that the power of mathematics itself is limited in a similar way (see Gödel's theorems). Turing also invented the theoretical model of a computer called the Turing machine. Besides the mentioned theoretical limitation, many solvable problems may take too long to compute, at least with computers we currently know (see computational complexity and P vs NP).

Typical Computer

Computers we normally talk about in daily conversations are electronic digital mostly personal computers such as desktops and laptops, possibly also cell phones, tablets etc.

Such a computer consists of some kind of case (chassis), internal hardware plus peripheral devices that serve for input and output -- these are for example a keyboard and mouse (input devices), a monitor (output device) or harddisk (input/output device). The internals of the computer normally include the motherboard (the main board interconnecting everything), the CPU (processor), RAM (working memory), persistent storage (hard disk or SSD), a power supply and possibly expansion cards such as a dedicated graphics card (GPU) or sound card.

Notable Computers

Here is a list of notable computers.

{ Some nice list of ancient computers is here: https://xnumber.com/xnumber/frame_malbum.htm. ~drummyfish }

name year specs (max, approx) comment
brain -500M 86+ billion neurons biological computer, developed by nature
abacus -2500 one of the simplest digital counting tools
Antikythera mechanism -125 ~30 gears, largest with 223 teeth 1st known analog comp., by Greeks (mechanical)
slide rule 1620 simple tool for multiplication and division
Schickard's calculating clock 1623 17 wheels 1st known calculator, could multiply, add and sub.
Arithmometer 1820 6 digit numbers 1st commercial calculator (add, sub., mult.)
Difference Engine 1822 8 digit numbers, 24 axles, 96 wheels mech. digital comp. of polynomials, by Babbage
Analytical Engine design 1837 ~16K RAM, 40 digit numbers 1st general purpose comp, not realized, by Babbage
nomogram 1884 graphical/geometrical tools aiding computation
Z3 1941 176B RAM, CPU 10Hz 22bit 2600 relays 1st fully programmable digital computer (relay based)
ENIAC 1945 ~85B RAM, ~5KHz CPU, 18000 vacuum tubes 1st general purpose electronic computer
PDP 11 1970 4M RAM, CPU 1.25MHz 16bit legendary minicomputer
Apple II 1977 64K RAM, 1MHz CPU 8bit popular TV-attached home computer by Apple
Atari 800 1979 8K RAM, CPU 1.7MHz 8bit popular TV-attached home computer by Atari
VIC 20 1980 32K RAM, 1MHz CPU 8bit, 20K ROM successful TV-connected home computer by Commodore
IBM PC 1981 256K RAM, CPU 4.7MHz 16bit, BASIC, DOS 1st personal computer as we know it now, modular
Commodore 64 1982 64K RAM, 20K ROM, CPU 1MHz 8bit very popular TV-connected home computer
ZX Spectrum 1982 128K RAM, CPU 3.5MHz 8bit successful UK TV-connected home comp. by Sinclair
NES/Famicom 1983 2K RAM, 2K VRAM, CPU 1.7MHz 8bit, PPU TV-connected Nintendo game console
Macintosh 1984 128K RAM, CPU 7MHz 32bit, floppy, 512x342 very popular personal computer by Apple
Amiga 1985 256K RAM, 256K ROM, CPU 7MHz 16bit, AmigaOS personal computer by Commodore, ahead of its time
SNES 1990 128K RAM, 64K VRAM, CPU 21MHz 16bit game console, NES successor
PlayStation 1994 2M RAM, 1M VRAM, CPU 33MHz 32bit, CD-ROM popular TV-connected game console by Sony
TI-80 1995 7K RAM, CPU 980KHz, 48x64 1bit screen famous programmable graphing calculator
Deep Blue 1995 30 128MHz CPUs, ~11 GFLOPS 1st computer to defeat world chess champion
Nintendo 64 1996 8M RAM, CPU 93MHz 64bit, 64M ROM cartr. famous TV-connected game console
GameBoy Color 1998 32K RAM, 16K VRAM, CPU 2MHz 8bit, 160x144 handheld gaming console by Nintendo
GameBoy Advance 2001 ~256K RAM, 96K VRAM, CPU 16MHz 32bit ARM, 240x160 successor to GBC
Xbox 2001 64M RAM, CPU 733MHz Pentium III TV-connected game console by Micro$oft
Nintendo DS 2004 4M RAM, 256K ROM, CPU ARM 67MHz, touchscreen famous handheld game console by Nintendo
iPhone (aka spyphone) 2007 128M RAM, CPU ARM 620MHz, GPU, cam., Wifi, 480x320 1st of the harmful Apple "smartphones"
ThinkPad X200 2008 8G RAM, CPU 2.6GHz, Wifi legendary laptop, great constr., freedom friendly
ThinkPad T400 2008 8G RAM, CPU 2.8GHz, Wifi legendary laptop, great constr., freedom friendly
Raspberry Pi 3 2016 1G RAM, CPU 1.4GHz ARM, Wifi very popular tiny inexpensive SBC
Arduboy 2016 2.5K RAM, CPU 16MHz AVR 8bit, 1b display tiny Arduino open console
Pokitto 2017 36K RAM, 256K ROM, CPU 72MHz ARM indie educational open console
Raspberry Pi 4 2019 8G RAM, CPU 1.5GHz ARM, Wifi tiny inexpensive SBC, usable as desktop
Deep Thought fictional computer from Hitchhiker's Guide ...
HAL 9000 fictional AI computer (2001: A Space Odyssey)
PD computer planned LRS computer
Turing machine important theoretical computer by Alan Turing

TODO: mnt reform 2, pinephone, NeXT, ti-89, quantum?, wii?


comun

Comun

Comun is a beautiful, greatly minimalist programming language made by drummyfish, based on his ideals of good, selfless technology known as less retarded software (LRS), of which it is now considered the official programming language. In the future it should gradually replace C as the preferred LRS language. The language has been inspired mainly by Forth, but also C, brainfuck and other ones. Though already usable, it is still in quite early stages of development and work in progress; currently there is a suckless implementation of comun in C and a number of supplemental materials such as a specification, tutorial and some example programs. The project repository is currently at https://codeberg.org/drummyfish/comun. The aim now is to make a self hosted implementation, i.e. write comun in comun. Though very young, comun is probably already the best programming language ever conceived.

The language is intended to be the foundation of a completely new, non-capitalist computer technology built from the ground up, which should culminate in the creation of the LRS much desired public domain computer. This technology is derived from the model of an ideal society and as such will aim for completely different goals (such as helping all living beings as much as possible without enslaving them) and values; this makes comun astronomically different in philosophy and design from the shitty, toxic capitalist joke languages such as C++ and Rust which pursue fascism, enslavement of humans to the productivity cult etc.

A quick sum up is following: comun is minimalist, low level with minimum abstraction, portable, imperative and stack-based, using reverse Polish notation. It can be both compiled and interpreted. There are only primitive integer data types (native integer size by default with possibility to specify exact width where necessary, signed/unsigned interpretation is left to the programmer) and optional pointers that can be used as variables, for managing multiple stacks, creating arrays etc. Its specification can fit on a sheet of paper and is completely public domain under CC0 (as is its current implementation). It has no standard library. There are no English keywords; commands are rather very short (mostly one to three characters) math-like symbols. Source code only allows ASCII symbols (no unicode). There is an optional preprocessor that uses comun itself (i.e. it doesn't use any extra language). Functions and recursion are supported. Many features of the language are optional and never burden the programmer if he doesn't use them. Simplified versions of the language (minicomun and microcomun) are also specified.

Examples

Here is a very short showcase of comun code, demonstrating some common functions:

max: <' ? >< . ^ .      # takes maximum of two values

max3: max max .         # takes maximum of three values

# recursive factorial
factR:
  ?'
    $0 -- factR *
  ;
    ^ 1
  .
.

# iterative factorial
factI:
  $0 --

  @'
    >< $1 * ><
    --
  .
  ^
.

The following is a quine in comun:

0 46 32 34 S 34 32 58 83 S --> S: "0 46 32 34 S 34 32 58 83 S --> " .

The following code translates brainfuck to comun:

0 "$>0 " -->

@@
  <? ?
    <- 

    $0 "+" = $1 "-" = | ?
      $0 -> $0 -> " " ->
    .

    $0 "<" = $1 ">" = | ?
      "$" -> $0 -> "0" -> " " ->
    .

    $0 "." = ?
      0 "->' " -->
    .

    $0 "," = ?
      0 "$<0 <- " -->
    .

    $0 91 = ? # left bracket
      0 "@' " -->
    .

    $0 93 = ? # right bracket
      0 ". " -->
    .

    ^
  ;
    !@
  .
.

TODO: more, code examples, compare the above with C, ...


consumerism

Consumerism

CONSUME YOU FUCKING IDIOTIC BITCH YOU DON'T EVEN HAVE THE LATEST AI RAYTRACING ENCRYPTION GPUUUUUUUU 1080K WIRELESS GAYMING MONITOR WITH BLOCKCHAIN BUY IT BUYYYYYY IT YOU IDIOT. --capitalism

Consumerism (also consoomerism) is a built-in "feature" of capitalism that says EVERYTHING HAS TO BE CONSUMED on a regular short-term basis so as to keep CONSTANT P.R.O.G.R.E.S.S.^TM(c)/PERSONAL DEVELOPMENT GROWTHP/PRODUCTIVITY^tm^tm^tm^tm, even things that in theory could last decades to generations such as houses, cars, computers, software, even just INFORMATION etc. Yes, we could make nice durable machines that wouldn't break and would serve a man for generations, we could write a finished operating system that would work and be useful, but that wouldn't be good for the seller if he only sold the thing once in a hundred years, would it? ALERT ALERT: BAD FOR CAPITAL. He wants to sell the thing and then PROFIT from it every day as he lies on the beach being fucked by 10 billion whore lolitas, so the thing has to break regularly and just demand to be replaced once in a year or so (see planned obsolescence) -- haha, actually you know what would be best? WHAT IF :D WHAT IF WE RAPE THE CUSTOMER EVERY DAI, LMAOOOOOO What IF THERE ARE NO PRODUCTS BUT PRODUCT ARE ACTUALLY JUST SERVICES :DDDDD LMAO THEN NO ONE CAN OWN ANYTHING, YOUR CAR AND YOUR TOOTHBRUSH IS JUST A SUBSCRIPTION LOOOOL, it just stops running if you stop paying. Why do this? Because in capitalism economy MUSTN'T STOP ULTRA EXPONENTIALLY EXPLODING EVERY TRILLISECOND and EVERYONE MUST HAVE 10000 BILLION JOBS ELSE THE POOR WORKER LOSES THE MEANING OF LIFE like the neanderthals who lacked the good capitalist overlords that assure the basic human need of ultraexponential personal growth and all killed themselves, also the same with stupid lazy animals. So capitalism has to constantly GROOOOOOOOOOOOOOOOOOOOOOOOOOOWWWWWW FOR ANY COST JUST GROW GROW GROW GROW GROW GROOOOOOOOOOOOOOOOOWWWWWWWWWWWW -- is it good? No, but it's called PWOGWEESSS so people LOVE IT, people will shit themselves and suffocate their mouths with their shit just to hear the word PWOGWEEEEES (or alternatively UPDATE or ANTIPEDOPHILE PROTECTION), if a politician says PWEGWESS enough times in his speech the crowd will just start sucking his dick on the stage and he will win the elections by 130% majority. LMAOOOOO WHAT IF we make updates a kind of consumerist product, LOL WHAT IF one group of people build houses one day and the other group destroys them the other day and so on so on, it's INFINIITEEEEEEEEEEEEEEEEEEEE JOBS LOL :D I should fucking get into politics.

TODO


copyleft

Copyleft

Copyleft (also share-alike) is a concept of sharing something on the condition that others will share it under the same terms; this is practically always used by a subset of free (as in freedom) software and culture to legally ensure this software/art and its modifications will always remain free. This kind of hacks copyright to de-facto remove copyright by its own power.

Copyleft has been by its mechanisms likened to a virus because once it is applied to certain software, it "infects" it and will force its conditions on any descendants of that software, i.e. it will spread itself (in this case the word virus does not bear a negative connotation, at least to some, they see it as a good virus).

For free/open-source software the alternative to copyleft is so called permissive licensing which (same as with copyleft) grants all the necessary freedom rights, but does NOT require modified versions to grant these rights as well. This allows free software to be forked and developed into proprietary software, which is what copyleft proponents criticize. However, both copyleft and permissive licensing are free as in freedom.

In the FOSS world there is a huge battle between the copyleft camp and permissive camp (LRS advocates permissive licenses with a preference for 100% public domain).

Issues With Copyleft

In the great debate of copyleft vs permissive free licenses we, as technological anarchists, stand on the permissive side. Here are some reasons for why we reject copyleft:


copyright

Copyright

Copyright (better called copyrestriction or copywrong) is one of many types of so called "intellectual property" (IP), a legal concept that allows "ownership", i.e. restriction, censorship and artificial monopoly on certain kinds of information, for example prohibition of sharing or viewing useful information or improving art works. Copyright specifically allows the copyright holder (not necessarily the author) a monopoly (practically absolute power) over art creations such as images, songs or texts, which also includes source code of computer programs. Copyright is a capitalist mechanism for creating artificial scarcity, enabling censorship and elimination of the public domain (a pool of freely shared works that anyone can use and benefit from). Copyright is not to be confused with trademarks, patents and other kinds of "intellectual property", which are similarly harmful but legally different. Copyright is symbolized by C in a circle or in brackets: (C), which is often accompanied by the phrase "all rights reserved".

When someone creates something that can even remotely be considered artistic expression (even such things as e.g. a mere collection of already existing things), he automatically gains copyright on it, without having to register it, pay any tax, announce it or let it be known anywhere in any way. He then practically has a full control over the work and can successfully sue anyone who basically just touches the work in any way (even unknowingly and unintentionally). Therefore any work (such as computer code) without a free license attached is implicitly fully "owned" by its creator (so called "all rights reserved") and can't be used by anyone without permission. It is said that copyright can't apply to ideas (ideas are covered by patents), only to expressions of ideas, however that's bullshit, the line isn't clear and is arbitrarily drawn by judges; for example regarding stories in books it's been established that the story itself can be copyrighted, not just its expression (e.g. you can't rewrite the Harry Potter story in different words and start selling it).

As if copyright wasn't bad enough of a cancer, there usually exist extra oppressive copyright-like restrictions called related rights or neighboring rights such as "moral rights", "personal rights" etc. Such "rights" differ a lot by country and can be used to restrict and censor even copyright-free works. This is the kind of stuff that makes you want to commit suicide. Waivers such as CC0 try to waive copyright as well as neighboring rights (to what extent neighboring rights can be waived is debatable though).

The current extreme form of copyright (as well as other types of IP such as software patents) has been highly criticized by many people, even those whom it's supposed to "protect" (small game creators, musicians etc.). Strong copyright laws basically benefit mainly corporations and "trolls" to the detriment of everyone else. It smothers creativity and efficiency by prohibiting people from reusing, remixing and improving already existing works -- something that's crucial for art, science, education and generally just making any kind of progress. Most people are probably for some form of copyright but still oppose the current extreme form which is pretty crazy: copyright applies to everything without any registration or notice and lasts usually 70 years (!!!) AFTER the author has died (!!!) and is already rotting in the ground. This is 100 years in some countries. In some countries it is not even possible to waive copyright to one's own creations -- just think about what kind of twisted society we are living in when it PROHIBITS people from making a selfless donation of their own creations to others. Some people, including us, are against the very idea of copyright (those may either use waivers such as CC0 or unlicense or protest by not using any licenses and simply ignoring copyright which however will actually discourage other people from reusing their works). Though copyright was originally intended to ensure artists can make a living with their works, it has now become the tool of states and corporations for universal censorship, control, bullying, surveillance, creating scarcity and bullshit jobs; states can use copyright to for example take down old politically inconvenient books shared on the Internet even if such takedowns do absolutely not serve protection of anyone's living but purely political interests.

Prominent critics of copyright include Lawrence Lessig (who established free culture and Creative Commons as a response), Nina Paley and Richard Stallman. There are many movements and groups opposing copyright or its current form, most notably e.g. the free culture movement, free software movement, Creative Commons etc.

The book Free Culture by Lessig talks, among other things, about how copyright started and how it's been shaped by corporations into becoming their tool for monopolizing art. The concept of copyright appeared after the invention of the printing press. The so called Statute of Anne of 1710 allowed the authors of books to control their copying for 14 years and only after registration. The term could be prolonged by another 14 years if the author survived. The laws started to get more and more strict as control of information became more valued and eventually the term grew to life of the author plus 70 years, without any need for registration or deposit of a copy of the work. Furthermore with new technologies, the scope of copyright has also extended: if copyright originally only limited copying of books, in the Internet age it started to cover basically any use, as any manipulation of digital data in the computer age requires making local copies. Additionally the copyright laws kept being passed despite being unconstitutional, as the US constitution says that the copyright term has to be finite -- the corporations have found a way around this and simply regularly increased the copyright's term, trying to make it de-facto infinite (technically not infinite but ever increasing). Their reason, of course, was firstly to forever keep ownership of their own art but also, maybe more importantly, to kill the public domain, i.e. prevent old works from entering the public domain where they would become completely free, unrestricted works for all people, competing with their proprietary art (who would pay for movies if there were thousands of movies available for free?). Nowadays, with corporations such as YouTube and Facebook de-facto controlling most of information sharing among common people, the situation worsens further: they can simply make their own laws that don't need to be passed by the government but are simply implemented on the platform they control. This way they are already killing e.g. the right to fair use, they can simply remove any content on the basis of "copyright violation", even if such content would normally NOT violate copyright because it would fall under fair use. This would normally have to be decided by court, but a corporation here itself takes the role of the court. So in terms of copyright, corporations now have a greater say than governments, and of course they'll use this power against the people (e.g. to implement censorship and surveillance).

Copyright rules differ greatly by country, most notably the US measures copyright length from the publication of the work rather than from when the author died. It is possible for a work to be copyrighted in one country and not copyrighted in another. It is sometimes also very difficult to say whether a work is copyrighted because the rules have been greatly changing (e.g. a notice used to be required for some time), sometimes even retroactively copyrighting public domain works, and there also exists no official database of copyrighted works (you can't safely look up whether your creation is too similar to someone else's). All in all, copyright is a huge mess, which is why we choose free licenses and even public domain waivers.

Copyleft (also share-alike) is a concept standing against copyright, a kind of anti-copyright, invented by Richard Stallman in the context of free software. It's a license that grants people the rights to the author's work on the condition that they share its further modifications under the same terms, which basically hacks copyright to effectively spread free works like a "virus".

Copyright does not (or at least should not) apply to facts (including mathematical formulas) (even though the formulation of them may be copyrighted), ideas (though these may be covered by patents) and single words or short phrases (these may however still be trademarked) and similarly trivial works. As such copyright can't e.g. be applied to game mechanics of a computer game (it's an idea). It is also basically proven that copyright doesn't cover computer languages (Oracle vs Google). Also even though many try to claim so, copyright does NOT arise for the effort needed to create the work -- so called "sweat of the brow" -- some say that when it took a great effort to create something, the author should get a copyright on it, however this is NOT and must NOT be the case (otherwise it would be possible to copyright mere ideas, simple mathematical formulas, rules of games etc.). Depending on time and location there also exist various peculiar exceptions such as the freedom of panorama for photographs or uncopyrightable utilitarian design (e.g. no one can own the shape of a generic car). But it's never good to rely on these peculiarities as they are specific to time/location, they are often highly subjective, fuzzy and debatable and may even be retroactively changed by law. This constitutes huge legal bloat and oftentimes legal unsafety. Do not stay in the gray area, try to stay safely far away from the fuzzy copyright line.

A work which is not covered by copyright (and any other IP) -- which is nowadays pretty rare due to the extent and duration of copyright -- is in the public domain.

Free software (and free art etc.) is not automatically public domain, it is mostly still copyrighted, i.e. "owned" by someone, but the owner has given some key rights to everyone with a free software license and by doing so minimized or even eliminated the negative effects of full copyright. The owner may still keep the rights e.g. to being properly credited in all copies of the software, which he may enforce in court. Similarly software that is in the public domain is not automatically free software -- this holds only if source code for this software is available (so that the rights to studying and modifying can be exercised).

See Also


corporation

Corporation

Corporation is basically a huge company that doesn't have a single owner but is rather managed by many shareholders. Corporations are one of the most powerful, dangerous and unethical entities that ever came into existence -- their power is growing, sometimes even beyond the power of states and their sole goal is to make as much profit as possible without any sense of morality. Existence of corporations is enabled by capitalism.

The most basic fact to know about corporations is that 100% of everything a corporation ever does is done 100% solely for maximizing its own benefit at any cost, with no other reason, with 0 morality and without any consideration of consequences. If a corporation could make 1 cent by raping 1000000000 children and get away with it, it would do so immediately without any hesitation and any regret. This is very important to keep in mind. Now try to not get depressed at the realization that corporations are those to whom we gave power and who are in almost absolute control of the world.

Corporation is not a human, it has zero sense of morality and no emotion. The most basic error committed by retards is to reply to this argument with "but corporations are run by humans". This is an extremely dangerous argument because somehow 99.999999999999999999% of people believe this could be true and accept it as a comforting argument so that they can continue their daily lives and do absolutely nothing about the disastrous state of society. The argument is of course completely false for a number of reasons: firstly corporations exclusively hire psychopaths for manager roles -- any corporation that doesn't do this will be eliminated by natural selection of the market environment because it will be weaker in a fight against other corporations, and its place will be taken by the next aspiring corporation waiting in line. Secondly corporations are highly sophisticated machines that have strong mechanisms preventing any ethical behavior -- for example division of labor in the "just doing my job"/"everyone does it" style allows many people to collaborate on something extremely harmful and unethical without any single one feeling responsibility for the whole, or sometimes without people even knowing what they are really collaborating on. This is taken to perfection by corporations not even having a single responsible owner -- there is a group of shareholders, none of whom has sole responsibility, and there is the CEO who is just a tool and puppet with tied hands who is merely supposed to implement the collective bidding of the shareholders. Of course, most just don't care, and most don't even have a choice. Similar principles allowed for example the Holocaust to happen. Anyone who has ever worked anywhere knows that managers always pressure workers just to make money, not to behave more ethically -- of course, such a manager would be fired on the spot -- and indeed, workers who try to behave ethically are replaced by those who make more money, just as companies that try to behave ethically in the market are replaced by those that rather make money, i.e. corporations. This is nothing surprising, the definition of capitalism implies the existence of a system with Darwinian evolution that selects entities best at making money at any cost, and that is exactly what we are getting. To expect any other outcome in capitalism would be trying to deny mathematics itself.

A corporation is made to exploit people just as a gun is made to kill people. When a corporation commits a crime, it is not punished like a human would be, the corporation is left to exist and continue doing what it has been doing -- a supposed "punishment" for a corporation that has been caught red handed committing a crime is usually just replacing whoever is ruled to be "responsible", for example the CEO, which is of course ridiculous, the guy is just replaced with someone else who will do exactly the same. This is like trying to fix the lethal nature of a weapon by putting all the blame on a screw in the weapon, then replacing the screw with another one and expecting the weapon to no longer serve killing people.

There is probably nothing we can do to stop corporations from taking over the world and eventually eliminating humans, we have probably passed the capitalist singularity.

TODO


cos

Cosine

TODO


countercomplex

Countercomplex

"True progress is about deepness and compression instead of maximization and accumulation." -Viznut

Countercomplex is the blog (running since 2008) of Viznut, a Finnish hacker and demoscener, criticizing technological complexity/bloat and promoting minimalism as the basis of truly good technology. It is accessible at http://countercomplex.blogspot.com/.


c_pitfalls

C Pitfalls

C is a powerful language that offers almost absolute control and maximum performance, which necessarily comes with responsibility and the danger of shooting oneself in the foot. Without knowledge of the pitfalls you may well find yourself falling into one of them.

Unless specified otherwise, this article supposes the C99 standard of the C language.

Generally: be sure to check your programs with tools such as valgrind, splint, cppcheck, UBSan or ASan, and turn on compiler auto checks (-Wall, -Wextra, -pedantic, ...), it's quick, simple and reveals many bugs!

Undefined/Unspecified Behavior

Undefined (completely unpredictable), unspecified (safe but potentially differing) and implementation-defined (consistent within an implementation but potentially differing between them) behavior is a kind of unpredictable and sometimes non-intuitive, tricky behavior of certain operations that may differ between compilers, platforms or even runs because it is not exactly described by the language specification; this is mostly done on purpose so as to give implementations the freedom to do whatever is most efficient on a given platform. One has to be very careful about not letting such behavior break the program on platforms different from the one the program is developed on. Note that tools such as cppcheck can help find undefined behavior in code. Description of some such behavior follows.

There are tools for detecting undefined behavior, see e.g. clang's UBSan (https://clang.llvm.org/docs/UndefinedBehaviorSanitizer.html).

Data type sizes including int and char may not be the same on each platform. Even though we almost take it for granted that char is 8 bits wide, in theory it can be different (even though sizeof(char) is always 1). Int (and unsigned int) type width should reflect the architecture's native integer type, so nowadays it's mostly 32 or 64 bits. To deal with these differences we can use the standard library limits.h and stdint.h headers.
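
For example the following small sketch (assuming a hosted platform with stdio available) prints some properties of the platform's types and uses an exact-width type from stdint.h:

#include <stdio.h>
#include <limits.h>
#include <stdint.h>

int main(void)
{
  printf("char has %d bits, int holds at most %d.\n",CHAR_BIT,INT_MAX);

  int32_t x = 123456; // guaranteed to be exactly 32 bits on any platform

  printf("%ld\n",(long) x); // cast to long for portable printing

  return 0;
}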

No specific endianness or even encoding of numbers is specified. Nowadays little endian and two's complement are what you'll encounter on most platforms, but e.g. PowerPC uses big endian ordering.
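
If byte order matters (e.g. when reading or writing binary files), it can be detected at runtime, for instance with this well known trick (a small sketch):

#include <stdio.h>

int main(void)
{
  unsigned int x = 1;
  unsigned char *firstByte = (unsigned char *) &x; // inspect the byte at the lowest address

  puts(*firstByte == 1 ? "little endian" : "big endian");

  return 0;
}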

Order of evaluation of operands and function arguments is not specified. I.e. in an expression or function call it is not defined which operands or arguments will be evaluated first, the order may be completely random and the order may differ even when evaluating the same expression at another time. This is demonstrated by the following code:

#include <stdio.h>

int x = 0;

int a(void)
{
  x += 1;
  return x;
}

int main(void)
{
  printf("%d %d\n",x,a()); // may print 0 1 or 1 1
  return 0;
}

Overflow behavior of signed type operations is not specified. Sometimes we suppose that e.g. addition of two signed integers that are past the data type's limit will produce two's complement overflow (wrap around), but in fact this operation's behavior is undefined, C99 doesn't say what representation should be used for numbers. For portability, predictability and preventing bugs it is safer to use unsigned types (but safety may come at the cost of performance, i.e. you prevent the compiler from performing some optimizations based on undefined behavior).
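
A minimal sketch of the difference: the unsigned addition below is guaranteed to wrap around, the signed one is undefined behavior:

#include <stdio.h>
#include <limits.h>

int main(void)
{
  unsigned int a = UINT_MAX;

  a = a + 1;        // defined: wraps around to 0
  printf("%u\n",a); // prints 0

  int b = INT_MAX;

  b = b + 1;        // UNDEFINED BEHAVIOR, anything can happen
  printf("%d\n",b); // may print INT_MIN, garbage or anything else

  return 0;
}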

Bit shifts by the type's width or more are undefined, as are bit shifts by negative values. So e.g. x >> 32 is undefined if the data type of x is 32 bits wide. (Note that operands narrower than int are first promoted to int, so shifting an 8 bit char by 8 is actually performed on an int and is therefore defined.)

Char data type signedness is not defined. The signedness can be explicitly "forced" by specifying signed char or unsigned char.
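
For example here the value of c depends on the platform while u is always the same:

char c = 200;          // may end up negative (e.g. -56) if char is signed
unsigned char u = 200; // always 200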

Floating point results are not precisely specified, no representation (such as IEEE 754) is required and small differences in float operations may appear between different machines or e.g. different compiler optimization settings -- this may lead to nondeterminism.

Memory Unsafety

Besides being extra careful about writing memory safe code, one needs to also know that some functions of the standard library are memory unsafe. This concerns mainly string functions such as strcpy or strlen which do not check string boundaries (i.e. they rely on being passed strings that are properly zero terminated and fit into their buffers, otherwise they can touch memory far beyond them); safer alternatives are available, they have an n added in the name (strncpy, strnlen, ...) and allow specifying a length limit.
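
For example (a sketch; note the gotcha of strncpy: if the limit is reached it does NOT write the terminating zero, so we add it ourselves):

#include <stdio.h>
#include <string.h>

int main(void)
{
  char buffer[8];
  const char *input = "too long to fit in the buffer";

  // strcpy(buffer,input); // would write beyond the end of buffer!

  strncpy(buffer,input,sizeof(buffer) - 1); // copies at most 7 characters
  buffer[sizeof(buffer) - 1] = 0;           // make sure the string is terminated

  puts(buffer); // prints "too lon"

  return 0;
}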

Different Behavior Between C And C++ (And Different C Standards)

C is not a subset of C++, i.e. not every C program is a C++ program (for a simple example imagine a C program in which we use the word class as an identifier: it is a valid C program but not a C++ program). Furthermore a C program that is at the same time also a C++ program may behave differently when compiled as C vs C++, i.e. there may be a semantic difference. Of course, all of this may also apply between different standards of C, not just between C and C++.
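
For instance the following line compiles as C but not as C++:

int class = 1; // valid C; in C++ "class" is a reserved keyword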

For portability's sake it is good to try to write C code that will also compile as C++ (and behave the same). For this we should know some basic differences in behavior between C and C++.

One difference lies for example in pointers to string literals. While in C it is possible to have non-const pointers such as

char *s = "abc";

C++ requires any such pointer to be const, i.e.:

const char *s = "abc";

TODO: more examples

Compiler Optimizations

C compilers perform automatic optimizations and other transformations of the code, especially when you tell them to optimize aggressively (-O3), which is a standard practice to make programs run faster. However this makes compilers perform a lot of magic and may lead to unexpected and unintuitive undesired behavior such as bugs or even "unoptimization of code". { I've seen code I've written get bigger when I set the -Os flag (optimize for smaller size). ~drummyfish }

Aggressive optimization may firstly make tiny bugs in your code manifest in very weird ways: a line of code somewhere that triggers some tricky undefined behavior may cause your program to crash in some completely different place. Compilers exploit undefined behavior for all kinds of big brain reasoning, and when they see code that MAY lead to undefined behavior, a lot of chained reasoning can produce very weird compiled results. Remember that undefined behavior, such as overflow when adding signed integers, doesn't mean the result of the operation is undefined, it means that ANYTHING CAN HAPPEN: the program may just start printing nonsensical stuff on its own or your computer may explode. So it may happen that the line with undefined behavior will behave as you expect but somewhere later on the program will just shit itself. For these reasons if you encounter a very weird bug, try to disable optimizations and see if it goes away -- if it does, you may be dealing with this kind of stuff. Also check your program with tools like cppcheck.

Automatic optimizations may also be dangerous when writing multithreaded or very low level code (e.g. a driver) in which the compiler may have wrong assumptions about the code such as that nothing outside your program can change your program's memory. Consider e.g. the following code:

while (x)
  puts("X is set!");

Normally the compiler could optimize this to:

if (x)
  while (1)
    puts("X is set!");

In typical code this behaves the same and is faster. However if the variable x is part of shared memory and can be changed by an outside process during the execution of the loop, this optimization can no longer be done as it results in different behavior. This can be prevented with the volatile keyword which tells the compiler to not perform such optimizations.
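
The declaration of such a shared variable may then look like this:

volatile int x; // volatile: may change from the outside, compiler won't optimize accesses away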

Of course this applies to other languages as well, but C is especially known for having a lot of undefined behavior, so be careful.

Other

Watch out for operator precedence! Bracket expressions if unsure, or just to increase readability for others.

Also watch out for this one: != is not =! :) I.e. if (x != 4) and if (x =! 4) are two different things, the former means not equal and is usually what you want, the latter is actually two operations, = and !, i.e. x is assigned the value of !4 (which is 0); the tricky thing is it also compiles and may even work as expected in some cases but fail in others, leading to a very nasty bug.


cpp

C++

C++ is an object-obsessed joke language based on C to which it adds only capitalist features and bloat, most notably object obsession. Most good programmers such as Richard Stallman and Linus Torvalds agree that C++ is hilariously messy and also tragic in that it actually succeeded in becoming mainstream. The language creator Bjarne Stroustrup himself infamously admitted the language sucks but laughs at its critics because it became successful anyway -- indeed, in a retarded society only shit can succeed. As someone once said, "C++ is not an increment, it is excrement".


cracker

Cracker

Crackers are either "bad hackers" that break into computer systems or the good people who with the power of hacking remove artificial barriers to obtaining and sharing information; for example they help remove DRM from games or leak data from secret databases. This is normally illegal which makes the effort even more admirable.

Cracker is also food.

Cracker is also the equivalent of nigger-word for the white people.


cracking

Cracking

See cracker.


creative_commons

Creative Commons

Creative Commons (CC) is the foremost non-profit organization promoting free culture, i.e. basically the relaxation of "intellectual property" (such as copyright) in art. One of the most important contributions of the organization are the widely used Creative Commons licenses which artists may use to make their works more legally free and even put them in the public domain.

TODO

Licenses

Creative commons licenses/waivers form a spectrum spanning from complete freedom (CC0, public domain, no conditions on use) to complete fascism (prohibiting basically everything except for non-commercial sharing). This means that NOT all Creative Commons licenses are free cultural licenses -- this is acknowledged by Creative Commons and has been intended. Keep in mind that as a good human you mustn't ever use licenses with NC (non-commercial use only) or ND (no derivatives allowed) clauses, these make your work non-free and therefore unusable.

Here is a comparison of the Creative Commons licenses/waivers, from most free (best) to least free (worst):

name abbreviation free culture use share remix copyleft attribution non-commercial comment
Creative Commons Zero CC0 yes :) yes :) yes :) yes :) no :) no need :) no :) public domain, copyright waiver, no restrictions, most freedom, best, sadly doesn't waive patents and trademarks
Creative Commons Attribution CC BY yes :) yes :) yes :) yes :) no :) forced :( no :) no restrictions except for requiring attribution to authors
Creative Commons Sharealike CC SA yes :) yes :) yes :) yes :) yes :/ no need :) no :) retired, secret license, no longer recommended by CC, pure copyleft/sharealike without forced attribution
Creative Commons Attribution Sharealike CC BY-SA yes :) yes :) yes :) yes :) yes :/ forced :( no :) requires attribution to authors and copyleft (sharing under same terms)
Creative Commons Attribution NonCommercial CC BY-NC NO! :((( yes but yes but yes but yes :/ forced :( yes :( proprietary fascist license prohibiting commercial use, DO NOT USE
Creative Commons Attribution NoDerivs CC BY-ND NO! :((( yes but yes but NO! :( yes :/ forced :( no but proprietary fascist license prohibiting modifications, DO NOT USE
Creative Commons Attribution NonCommercial NoDerivs CC BY-NC-ND NO! :((( yes but yes but NO! :( yes :/ forced :( yes :( proprietary fascist license prohibiting commercial use and even modifications, DO NOT USE
none (all rights reserved) NO! :((( NO! :( NO! :( NO! :( FUCK YOU FUCK YOU FUCK YOU proprietary fascist option, prohibits everything, DO NOT USE

See Also


crime_against_economy

Crime Against Economy

Crime against economy refers to any bullshit "crime" invented by capitalism that is deemed to "hurt economy", the new God of society. In the current dystopian society where money has replaced God, worshiping economy is the new religion; to satisfy economy human and animal lives are sacrificed just as such sacrifices used to be made to please the gods of ancient times.

Examples of crimes against economy include:


crow_funding

Crow Funding

Crow funding is when a crow pays for your program.

You probably misspelled crowd funding.


crypto

Cryptocurrency

Cryptocurrency, or just crypto, is a digital, virtual (non-physical) currency used on the Internet which uses cryptographic methods (electronic signatures etc.) to implement a decentralized system in which there is no authority to control the currency (unlike e.g. with traditional currencies that are controlled by the state or systems of digital payments controlled by the banks that run these systems). Cryptocurrencies use so called blockchain as an underlying technology and are practically always implemented as free and open-source software. Examples of cryptocurrencies are Bitcoin, Monero or Dogecoin.

The word crypto in cryptocurrency doesn't imply that the currency provides or protects privacy -- it rather refers to the cryptographic algorithms used to make the currency work -- even though thanks to the decentralization, anonymity and openness cryptocurrencies actually are mostly privacy friendly (up to the point of being considered the currency of criminals).

LRS sees cryptocurrencies not only as unnecessary bullshit, but downright as an unethical technology because money itself is unethical, plus the currencies based on proof of work waste not only human effort but also an enormous amount of electricity and computing power that could be spent in a better way. Keep in mind that cryptocurrencies are part of cryptofascism; they're a way of digitizing harmful concepts existing in society. Crypto is just an immensely expensive game in which people try to fuck each other over money that has been stolen from the people.

History

TODO

How It Works

Cryptocurrency is built on top of so called blockchain -- a kind of structure that holds records of transactions (exchanges of money or "coins", as they're called in the crypto world). Blockchain is a data structure serving as the database of the system. As its name suggests, it consists of blocks. Each block contains various data, most important of which are the performed transactions (e.g. "A sent 1 coin to B"), and each block points to the previous one (forming a linked list). As new transactions are made, new blocks are created and appended at the end of the blockchain.
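
Just for illustration, a single block might be imagined as something like the following C structs (a made up minimal sketch, real cryptocurrencies use more complex data):

typedef struct
{
  char from[32];       // sender
  char to[32];         // receiver
  unsigned int amount; // number of coins sent
} Transaction;

typedef struct
{
  unsigned char previousBlockHash[32]; // link to the previous block
  unsigned int transactionCount;
  Transaction transactions[64];        // records like "A sent 1 coin to B"
} Block;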

But where is the blockchain stored? It is not on a single computer; many computers participating in the system have their own copy of the blockchain and they share it together (similarly to how people share files via torrents).

But how do we know which one is the "official" blockchain? Can't people just start forging information in the blockchain and then distribute the fake blockchains? Isn't there a chaos if there are so many copies? Well yes, it would be messy -- that's why we need a consensus of the participants on which blockchain is the real one. And there are a few algorithms to ensure the consensus. Basically people can't just spam add new blocks, a new block to be added needs to be validated via some process (which depends on the specific algorithm) in order to be accepted by others. Two main algorithms for this are proof of work (to add a block one must prove having spent a lot of computational work, used e.g. by Bitcoin) and proof of stake (validators put up coins they own as a guarantee, used by some newer cryptocurrencies).

Can't people just forge transactions, e.g. by sending out a record that says someone else sent them money? This can be easily prevented by digitally signing the transactions, i.e. if there is e.g. a transaction "A sends 1 coin to B", it has to be signed by A to confirm that A really intended to send the money. But can't someone just copy-paste someone else's already signed transactions and try to perform them multiple times? This can also be prevented by e.g. numbering the transactions, i.e. recording something like "A sent 1 coin to B as his 1st transaction".

But where are one's coins actually stored? They're not explicitly stored anywhere; the amount of coins any participant has is deduced from the list of transactions, i.e. if it is known someone joined the network with 0 coins and there is a record of someone else sending him 1 coin, it is clear he now has 1 coin. For end users there are so called wallets which to them appear to store their coins, but a wallet is in fact just the set of cryptographic keys needed to perform transactions.

But why is blockchain even needed? Can't we just have a list of signed transactions without any blocks? Well, blockchain is designed to ensure coherency and the above mentioned consensus.


c_sharp

C Sharp

C# is supposed to be a "programming language" but it's just some capitalist shit by Micro$oft that's supposed to give it some kind of monopoly. Really it's not even worth writing about. It's like Java but worse. I'm tired, DO NOT USE THIS PSEUDOSHIT. Learn C.


c_tutorial

C Tutorial

{ Still a work in progress, but 99% complete. ~drummyfish }

This is a relatively quick C tutorial.

You should probably know at least the completely basic ideas of programming before reading this (what's a programming language, source code, command line etc.). If you're as far as already knowing another language, this should be pretty easy to understand.

About C And Programming

C is

If you come from a language like Python or JavaScript, you may be shocked that C doesn't come with its own package manager, debugger or build system, it doesn't have modules, generics, garbage collection, OOP, hashmaps, dynamic lists, type inference and similar "modern" features. When you truly get into C, you'll find it's a good thing.

Programming in C works like this:

  1. You write a C source code into a file.
  2. You compile the file with a C compiler such as gcc (which is just a program that turns source code into a runnable program). This gives you the executable program.
  3. You run the program, test it, see how it works and potentially get back to modifying the source code (step 1).

So, for writing the source code you'll need a text editor; any plain text editor will do but you should use some that can highlight C syntax -- this helps very much when programming and is practically a necessity. Ideal editor is vim but it's a bit difficult to learn so you can use something as simple as Gedit or Geany. We do NOT recommend using huge programming IDEs such as "VS Code" and whatnot. You definitely can NOT use an advanced document editor that works with rich text such as LibreOffice or that shit from Micro$oft, this won't work because it's not plain text.

Next you'll need a C compiler, the program that will turn your source code into a runnable program. We'll use the most commonly used one called gcc (you can try different ones such as clang or tcc if you want). If you're on a Unix-like system such as GNU/Linux (which you probably should), gcc is probably already installed. Open up a terminal and write gcc to see if it's installed -- if not, then install it (e.g. with sudo apt install build-essential if you're on a Debian-based system).

If you're extremely lazy, there are online web C compilers that work in a web browser (find them with a search engine). You can use these for quick experiments but note there are some limitations (e.g. not being able to work with files), and you should definitely know how to compile programs yourself.

Last thing: there are multiple standards of C. Here we will be covering C99, but this likely doesn't have to bother you at this point.

First Program

Let's quickly try to compile a tiny program to test everything and see how everything works in practice.

Open your text editor and paste this code:

/* simple C program! */

#include <stdio.h> // include IO library

int main(void)
{
  puts("It works.");
  
  return 0;
}

Save this file and name it program.c. Then open a terminal emulator (or an equivalent command line interface), locate yourself into the directory where you saved the file (e.g. cd somedirectory) and compile the program with the following command:

gcc -o program program.c

The program should compile and the executable program should appear in the directory. You can run it with

./program

And you should see

It works.

written in the command line.

Now let's see what the source code means: the first line is a comment -- text between /* and */ (or text after //) is ignored by the compiler and serves only the humans reading the code. #include <stdio.h> loads the standard input/output library so that we can use its functions, here puts (put string), which prints out a text string. int main(void) starts the definition of the main function, the place where every program begins its execution; the commands between its { and } brackets get executed one by one. Finally return 0 ends the program, handing back the value 0 ("no error").

Also notice how the source code is formatted, e.g. the indentation of code within the { and } brackets. White characters (spaces, new lines, tabs) are ignored by the compiler so we could theoretically write our program on a single line, but that would be unreadable. We use indentation, spaces and empty lines to format the code to be well readable.

To sum up let's see a general structure of a typical C program. You can just copy paste this for any new program and then just start writing commands in the main function.

#include <stdio.h> // include the I/O library
// more libraries can be included here

int main(void)
{
  // write commands here
  
  return 0; // always the last command
}

Variables, Arithmetic, Data Types

Programming is a lot like mathematics, we compute equations and transform numerical values into other values. You probably know in mathematics we use variables such as x or y to denote numerical values that can change (hence variables). In programming we also use variables -- here variable is a place in memory which has a name (and in this place there will be stored a value that can change over time).

We can create variables named x, y, myVariable or score and then store specific values (for now let's only consider numbers) into them. We can read from and write to these variables at any time. These variables physically reside in RAM, but we don't really care where exactly (at which address) they are located -- this is e.g. similar to houses, in common talk we normally say things like John's house or the pet store instead of house with address 3225.

Variable names can't start with a digit (and they can't be any of the keywords reserved by C). By convention they also shouldn't be all uppercase or start with uppercase (these are normally used for other things). Normally we name variables like this: myVariable or my_variable (pick one style, don't mix them).

In C as in other languages each variable has a certain data type; that is each variable has associated information about what kind of data is stored in it. This can be e.g. a whole number, fraction, a text character, text string etc. Data types are a more complex topic that will be discussed later, for now we'll start with the most basic one, the integer type, in C called int. An int variable can store whole numbers in the range of at least -32768 to 32767 (but usually much more).

Let's see an example.

#include <stdio.h>

int main(void)
{
  int myVariable;
  
  myVariable = 5;
  
  printf("%d\n",myVariable);
  
  myVariable = 8;
  
  printf("%d\n",myVariable);
}

After compiling and running of the program you should see:

5
8

Last thing to learn is arithmetic operators. They're just normal math operators such as +, - and /. You can use these along with brackets (( and )) to create expressions. Expressions can contain variables and can themselves be used in many places where variables can be used (but not everywhere, e.g. on the left side of variable assignment, that would make no sense). E.g.:

#include <stdio.h>

int main(void)
{
  int heightCm = 175;
  int weightKg = 75;
  int bmi = (weightKg * 10000) / (heightCm * heightCm);

  printf("%d\n",bmi);
}

calculates and prints your BMI (body mass index).

Let's quickly mention how you can read and write values in C so that you can begin to experiment with your own small programs. You don't have to understand the following syntax as of yet, it will be explained later, for now simply copy-paste the commands: printf("%d\n",x); writes out the value of the variable x (plus a newline), and scanf("%d",&x); reads a number from the keyboard into the variable x.

Branches And Loops (If, While, For)

When creating algorithms, it's not enough to just write linear sequences of commands. Two things (called control structures) are very important to have in addition: branches (executing commands only if some condition holds) and loops (repeating commands).

Let's start with branches. In C the command for a branch is if. E.g.:

if (x > 10)
  puts("X is greater than 10.");

The syntax is given, we start with if, then brackets (( and )) follow inside which there is a condition, then a command or a block of multiple commands (inside { and }) follow. If the condition in brackets holds, the command (or block of commands) gets executed, otherwise it is skipped.

Optionally there may be an else branch which gets executed only if the condition does NOT hold. It is denoted with the else keyword which is again followed by a command or a block of multiple commands. Branching may also be nested, i.e. branches may be inside other branches. For example:

if (x > 10)
  puts("X is greater than 10.");
else
{
  puts("X is not greater than 10.");

  if (x < 5)
    puts("And it is also smaller than 5.");
}

So if x is equal to e.g. 3, the output will be:

X is not greater than 10.
And it is also smaller than 5.

About conditions in C: a condition is just an expression (variables/functions along with arithmetic operators). The expression is evaluated (computed) and the number that is obtained is interpreted as true or false like this: in C 0 (zero) means false, 1 (and everything else) means true. Even comparison operators like < and > are technically arithmetic, they compare numbers and yield either 1 or 0. Some operators commonly used in conditions are == (equals), != (does not equal), < (is smaller than), > (is greater than), <= (is smaller or equal), >= (is greater or equal), && (logical AND), || (logical OR) and ! (logical NOT).

E.g. an if statement starting as if (x == 5 || x == 10) will be true if x is either 5 or 10.

Next we have loops. There are multiple kinds of loops even though in theory it is enough to only have one kind of loop (there are multiple types out of convenience). The loops in C are while, do ... while (a variant of while) and for.

The while loop is used when we want to repeat something without knowing in advance how many times we'll repeat it (e.g. searching a word in text). It starts with the while keyword, is followed by brackets with a condition inside (same as with branches) and finally a command or a block of commands to be looped. For instance:

while (x > y) // as long as x is greater than y
{
  printf("%d %d\n",x,y); // prints x and y  

  x = x - 1; // decrease x by 1
  y = y * 2; // double y
}

puts("The loop ended.");

If x and y were to be equal to 100 and 20 (respectively) before the loop is encountered, the output would be:

100 20
99 40
98 80
The loop ended.

The for loop is executed a fixed number of times, i.e. we use it when we know in advance how many times we want to repeat our commands. The syntax is a bit more complicated: it starts with the keyword for, then brackets (( and )) follow and then the command or a block of commands to be looped. The inside of the brackets consists of an initialization, condition and action separated by semicolons (;) -- don't worry, it is enough to just remember the structure. A for loop may look like this:

puts("Counting until 5...");

for (int i = 0; i < 5; ++i)
  printf("%d\n",i); // prints i

int i = 0 creates a new temporary variable named i (name normally used by convention) which is used as a counter, i.e. this variable starts at 0 and increases with each iteration (cycle), and it can be used inside the loop body (the repeated commands). i < 5 says the loop continues to repeat as long as i is smaller than 5 and ++i says that i is to be increased by 1 after each iteration (++i is basically just a shorthand for i = i + 1). The above code outputs:

Counting until 5...
0
1
2
3
4

IMPORTANT NOTE: in programming we count from 0, not from 1 (this is convenient e.g. in regards to pointers). So if we count to 5, we get 0, 1, 2, 3, 4. This is why i starts with value 0 and the end condition is i < 5 (not i <= 5).

Generally if we want to repeat the for loop N times, the format is for (int i = 0; i < N; ++i).

Any loop can be exited at any time with a special command called break. This is often used with a so called infinite loop, a while loop that has 1 as its condition; recall that 1 means true, i.e. the loop condition always holds and the loop never ends. break allows us to place conditions in the middle of the loop and in multiple places. E.g.:

while (1) // infinite loop
{
  x = x - 1;
  
  if (x == 0)
    break; // this exits the loop!
    
  y = y / x;
}

The code above places a condition in the middle of an infinite loop to prevent division by zero in y = y / x.

Again, loops can be nested (we may have loops inside loops) and also loops can contain branches and vice versa.

Simple Game: Guess A Number

With what we've learned so far we can already make a simple game: guess a number. The computer thinks of a random number in the range 0 to 9 and the user has to guess it. The source code follows.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
  srand(time(NULL)); // random seed (from the current time)
  
  while (1) // infinite loop
  {
    int randomNumber = rand() % 10;
      
    puts("I think a number. What is it?");
    
    int guess;
    
    scanf("%d",&guess); // read the guess
    
    getchar(); // consume the newline left in the input by scanf

    if (guess == randomNumber)
      puts("You guessed it!");
    else
      printf("Wrong. The number was %d.\n",randomNumber);
      
    puts("Play on? [y/n]");
    
    char answer;

    scanf("%c",&answer); // read the answer
    
    if (answer == 'n')
      break;
  }

  puts("Bye.");
  
  return 0; // return success, always here
}

Functions (Subprograms)

Functions are extremely important, no program besides the most primitive ones can be made without them (well, in theory any program can be created without functions, but in practice such programs would be extremely complicated and unreadable).

Function is a subprogram (in other languages functions are also called procedures or subroutines), i.e. it is code that solves some smaller subproblem that you can repeatedly invoke, for instance you may have a function for computing a square root, for encrypting data or for playing a sound from speakers. We have already met functions such as puts, printf or rand.

Functions are similar to but NOT the same as mathematical functions. Mathematical function (simply put) takes a number as input and outputs another number computed from the input number, and this output number depends only on the input number and nothing else. C functions can do this too but they can also do additional things such as modify variables in other parts of the program or make the computer do something (such as play a sound or display something on the screen) -- these are called side effects; things done besides computing an output number from an input number. For distinction mathematical functions are called pure functions and functions with side effects are called non-pure.

Why are functions so important? Firstly they help us divide a big problem into small subproblems and make the code better organized and readable, but mainly they help us respect the DRY (Don't Repeat Yourself) principle -- this is extremely important in programming. Imagine you need to solve a quadratic equation in several parts of your program; you do NOT want to solve it in each place separately, you want to make a function that solves a quadratic equation and then only invoke (call) that function anywhere you need to solve your quadratic equation. This firstly saves space (source code will be shorter and the compiled program will be smaller), but it also makes your program manageable and eliminates bugs -- imagine you find a better (e.g. faster) way to solve quadratic equations; without functions you'd have to go through the whole code and change the algorithm in each place separately, which is impractical and increases the chance of making errors. With functions you only change the code in one place (in the function) and in any place where your code invokes (calls) this function the new, better and updated version of the function will be used.

Besides writing programs that can be directly executed programmers write libraries -- collections of functions that can be used in other projects. We have already seen libraries such as stdio, the standard input/output library, a standard (official, bundled with every C compiler) library for input/output (reading and printing values); stdio contains functions such as puts which is used to print out text strings. Examples of other libraries are the standard math library containing functions for e.g. computing sine, or SDL, a 3rd party multimedia library for such things as drawing to the screen, playing sounds and handling keyboard and mouse input.

Let's see a simple example of a function that writes out a temperature in degrees of Celsius as well as in Kelvin:

#include <stdio.h>

void writeTemperature(int celsius)
{
  int kelvin = celsius + 273;
  printf("%d C (%d K)\n",celsius,kelvin);
}

int main(void)
{
  writeTemperature(-50);
  writeTemperature(0);
  writeTemperature(100);

  return 0;
}

The output is

-50 C (223 K)
0 C (273 K)
100 C (373 K)

Now imagine we decide we also want our temperatures in Fahrenheit. We can simply edit the code in writeTemperature function and the program will automatically be writing temperatures in the new way.

Let's see how to create and invoke functions. Creating a function in code is done between inclusion of libraries and the main function, and we formally call this defining a function. The function definition format is following:

RETURN_TYPE FUNCTION_NAME(FUNCTION_PARAMETERS)
{
  FUNCTION_BODY
}

Let's see another function:

#include <stdio.h>

int power(int x, int n)
{
  int result = 1;
  
  for (int i = 0; i < n; ++i) // repeat n times
    result = result * x;
    
  return result;
}

int main(void)
{
  for (int i = 0; i < 5; ++i)
  {
    int powerOfTwo = power(2,i);
    printf("%d\n",powerOfTwo);
  }

  return 0;
}

The output is:

1
2
4
8
16

The function power takes two parameters: x and n, and returns x raised to the nth power. Note that unlike the first function we saw here the return type is int because this function does return a value. Notice the command return -- it is a special command that causes the function to terminate and return a specific value. In functions that return a value (their return type is not void) there has to be a return command. In functions that return nothing there may or may not be one, and if there is, it has no value after it (return;).

Let's focus on how we invoke the function -- in programming we say we call the function. The function call in our code is power(2,i). If a function returns a value (return type is not void), its function call can be used in any expression, i.e. almost anywhere where we can use a variable or a numerical value -- just imagine the function computes a return value and this value is substituted to the place where we call the function. For example we can imagine the expression power(3,1) + power(3,0) as simply 3 + 1.

If a function returns nothing (return type is void), it can't be used in expressions, it is used "by itself"; e.g. playBeep();. (Functions that do return a value can also be used like this -- their return value is in this case simply ignored.)

We call a function by writing its name (power), then adding brackets (( and )) and inside them we put arguments -- specific values that will substitute the corresponding parameters inside the function (here x will take the value 2 and n will take the current value of i). If the function takes no parameters (the function parameter list is void), we simply put nothing inside the brackets (e.g. playBeep();).

Here comes the nice thing: we can nest function calls. For example we can write x = power(3,power(2,1)); which will result in assigning the variable x the value of 9. Functions can also call other functions (even themselves, see recursion), but only those that have been defined before them in the source code (this can be fixed with so called forward declarations).

Notice that the main function we always have in our programs is also a function definition. The definition of this function is required for runnable programs, its name has to be main and it has to return int (an error code where 0 means no error). It can also take parameters but more on that later.

This is the most basic knowledge to have about C functions. Let's see one more example with some peculiarities that aren't so important now, but will be later.

#include <stdio.h>

void writeFactors(int x) // writes prime factors of x
{
  printf("factors of %d:\n",x);
  
  while (x > 1) // keep dividing x by its factors
  {
    for (int i = 2; i <= x; ++i) // search for a factor
      if (x % i == 0) // i divides x without remainder?
      {
        printf("  %d\n",i); // i is a factor, write it
        x = x / i; // divide x by i
        break; // exit the for loop
      }
  }
}

int readNumber(void)
{
  int number;
  
  puts("Please enter a number to factor (0 to quit).");
  scanf("%d",&number);
  
  return number;
}

int main(void)
{
  while (1) // infinite loop
  {
    int number = readNumber(); // <- function call

    if (number == 0) // 0 means quit
      break;
      
    writeFactors(number); // <- function call
  }
    
  return 0;
}

We have defined two functions: writeFactors and readNumber. writeFactors returns no value but it has side effects (it prints text to the command line). readNumber takes no parameters but returns a value; it prompts the user to enter a value and returns the read value.

Notice that inside writeFactors we modify its parameter x inside the function body -- this is okay, it won't affect the argument that was passed to this function (the number variable inside the main function won't change after this function call). x can be seen as a local variable of the function, i.e. a variable that's created inside this function and can only be used inside it -- when writeFactors is called inside main, a new local variable x is created inside writeFactors and the value of number is copied to it.

Another local variable is number -- it is a local variable both in main and in readNumber. Even though the names are the same, these are two different variables, each one is local to its respective function (modifying number inside readNumber won't affect number inside main and vice versa).

And a last thing: keep in mind that not every command you write in a C program is a function call. E.g. control structures (if, while, ...) and special commands (return, break, ...) are not function calls.

More Details (Globals, Switch, Float, Forward Decls, ...)

We've skipped a lot of details and small tricks for simplicity. Let's go over some of them. Many of the following things are so called syntactic sugar: convenient syntax shorthands for common operations.

Multiple variables can be defined and assigned like this:

int x = 1, y = 2, z;

The meaning should be clear, but let's mention that z doesn't generally have a defined value here -- it will have a value but you don't know what it is (this may differ between different computers and platforms). See undefined behavior.

The following is a shorthand for using operators:

x += 1;      // same as: x = x + 1;
x -= 10;     // same as: x = x - 10;
x *= x + 1;  // same as: x = x * (x + 1);
x++;         // same as: x = x + 1;
x--;         // same as: x = x - 1;
// etc.

The last two constructs are called incrementing and decrementing. This just means adding/subtracting 1.

In C there is a pretty unique operator called the ternary operator (ternary for having three operands). It can be used in expressions just as any other operators such as + or -. Its format is:

CONDITION ? VALUE1 : VALUE2

It evaluates the CONDITION and if it's true (non-0), this whole expression will have the value of VALUE1, otherwise its value will be VALUE2. It allows for not using so many ifs. For example instead of

if (x >= 10)
  x -= 10;
else
  x = 10;

we can write

x = x >= 10 ? x - 10 : 10;

Global variables: we can create variables even outside function bodies. Recall that variables inside functions are called local; variables outside functions are called global -- they can basically be accessed from anywhere and can sometimes be useful. For example:

#include <stdio.h>
#include <stdlib.h> // for rand()

int money = 0; // total money, global variable

void printMoney(void)
{
  printf("I currently have $%d.\n",money);
}

void playLottery(void)
{
  puts("I'm playing lottery.");
  
  money -= 10; // price of lottery ticket
    
  if (rand() % 5 == 0) // 1 in 5 chance
  {
    money += 100;
    puts("I've won!");
  }
  else
    puts("I've lost!");

  printMoney();
}

void work(void)
{
  puts("I'm going to work :(");
  
  money += 200; // salary

  printMoney();
}

int main()
{
  work();
  playLottery();
  work();
  playLottery();
  
  return 0;
}

In C programs you may encounter a switch statement -- it is a control structure similar to the if branch which however can have more than two branches. It looks like this:

  switch (x)
  {
    case 0: puts("X is zero. Don't divide by it."); break;
    case 69: puts("X is 69, haha."); break;
    case 42: puts("X is 42, the answer to everything."); break;
    default: printf("I don't know anything about X."); break;
  }

Switch can only compare exact values, it can't e.g. check if a value is greater than something. Each branch starts with the keyword case, then the match value follows, then there is a colon (:) and the branch commands follow. IMPORTANT: there has to be the break; statement at the end of each case branch (we won't go into details). A special branch is the one starting with the word default that is executed if no case label was matched.

Let's also mention some additional data types we can use in programs: char (a single text character such as 'A', internally stored as a small number) and float and double (numbers with a fractional part such as 3.14, double having higher precision than float).

Here is a short example with the new data types:

#include <stdio.h>

int main(void)
{
  char c;
  float f;
  
  puts("Enter character.");
  c = getchar(); // read character
  
  puts("Enter float.");
  scanf("%f",&f);
  
  printf("Your character is :%c.\n",c);
  printf("Your float is %lf\n",f);
 
  float fSquared = f * f;
  int wholePart = f; // this can be done
  
  printf("It's square is %lf.\n",fSquared);
  printf("It's whole part is %d.\n",wholePart);
  
  return 0;
}

Notice mainly how we can assign a float value into a variable of int type (int wholePart = f;). This can be done even the other way around and with many other types. C performs automatic type conversions (casting), but of course, some information may be lost in this process (e.g. the fractional part).
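
We can also make the conversion explicit (which documents the intent and can silence compiler warnings) with the cast operator:

int wholePart = (int) f;             // explicit cast, fractional part is dropped
float backAgain = (float) wholePart; // and back again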

In the section about functions we said a function can only call a function that has been defined before it in the source code -- this is because the compiler reads the file from start to finish and if you call a function that hasn't been defined yet, it simply doesn't know what to call. But sometimes we need to call a function that will be defined later, e.g. in cases where two functions call each other (function A calls function B in its code but function B also calls function A). For this there exist so called forward declarations -- a forward declaration is informing that a function of certain name (and with certain parameters etc.) will be defined later in the code. A forward declaration looks the same as a function definition, but it doesn't have a body (the part between { and }), instead it is terminated with a semicolon (;). Here is an example:

#include <stdio.h>

void printDecorated2(int x, int fancy); // forward declaration

void printDecorated1(int x, int fancy)
{
  putchar('~');
  
  if (fancy)
    printDecorated2(x,0); // would be error without f. decl. 
  else
    printf("%d",x);
  
  putchar('~');
}

void printDecorated2(int x, int fancy)
{
  putchar('>');
  
  if (fancy)
    printDecorated1(x,0);
  else
    printf("%d",x);
  
  putchar('<');
}

int main()
{
  printDecorated1(10,1);
  putchar('\n'); // newline
  printDecorated2(20,1);
}

which prints

~>10<~
>~20~<

The functions printDecorated1 and printDecorated2 call each other, so this is the case when we have to use a forward declaration of printDecorated2. Also note the condition if (fancy) which is the same thing as if (fancy != 0) (imagine fancy being 1 and 0 and what the condition evaluates to in each case).

Header Files, Libraries, Compilation/Building

So far we've only been writing programs into a single source code file (such as program.c). More complicated programs consist of multiple files and libraries -- we'll take a look at this now.

In C we normally deal with two types of source code files: .c files, which contain the actual implementation code, and .h files (so called header files), which contain mainly declarations of functions and constants.

When we have multiple source code files, we typically have pairs of .c and .h files. E.g. if there is a library called mathfunctions, it will consist of files mathfunctions.c and mathfunctions.h. The .h file will contain the function headers (in the same manner as with forward declarations) and constants such as pi. The .c file will then contain the implementations of all the functions declared in the .h file. But why do we do this?

Firstly .h files may serve as a nice documentation of the library for programmers: you can simply open the .h file and see all the functions the library offers without having to skim over thousands of lines of code. Secondly this is for how multiple source code files are compiled into a single executable program.

Suppose now we're compiling a single file named program.c as we've been doing until now. The compilation consists of several steps:

  1. The compiler reads the file program.c and makes sense of it.
  2. It then creates an intermediate file called program.o. This is called an object file and is a binary compiled file which however cannot yet be run because it is not linked -- in this code all memory addresses are relative and it doesn't yet contain the code from external libraries (e.g. the code of printf).
  3. The compiler then runs a linker which takes the file program.o and the object files of libraries (such as the stdio library) and it puts them all together into the final executable file called program. This is called linking; the code from the libraries is copied to complete the code of our program and the memory addresses are settled to some specific values.

So realize that when the compiler is compiling our program (program.c), which contains functions such as printf from a separate library, it doesn't have the code of these functions available -- this code is not in our file. Recall that if we want to call a function, it must have been defined (or at least declared) before, so in order for us to be able to call printf, the compiler must know about it. This is why we include the stdio library at the top of our source code with #include <stdio.h> -- this basically copy-pastes the content of the header file of the stdio library to the top of our source code file. In this header there are forward declarations of functions such as printf, so the compiler now knows about them (it knows their name, what they return and what parameters they take) and we can call them.

Let's see a small example. We'll have the following files (all in the same directory).

library.h (the header file):

// Returns the square of n.
int square(int n);

library.c (the implementation file):

int square(int x)
{
  // function implementation
  return x * x;
}

program.c (main program):

#include <stdio.h>
#include "library.h"

int main(void)
{
  int n = square(5);

  printf("%d\n",n);

  return 0;
}

NOTE: "library.h" here is between double quotes, unlike <stdio.h>. This just says we specify an absolute path to the file as it's not in the directory where installed libraries go.

Now we will manually compile the library and the final program. First let's compile the library, in command line run:

gcc -c -o library.o library.c

The -c flag tells the compiler to only compile the file, i.e. only generate the object (.o) file without trying to link it. After this command a file library.o should appear. Next we compile the main program in the same way:

gcc -c -o program.o program.c

This will generate the file program.o. Note that during this process the compiler is working only with the program.c file, it doesn't know the code of the function square, but it knows this function exists, what it returns and what parameters it takes thanks to us including the library header library.h with #include "library.h" (quotes are used instead of < and > to tell the compiler to look for the files in the current directory).

Now we have the file program.o in which the compiled main function resides and file library.o in which the compiled function square resides. We need to link them together. This is done like this:

gcc -o program program.o library.o

For linking we don't need to use any special flag, the compiler knows that if we give it several .o files, it is supposed to link them. The file program should appear that we can already run and it should print

25

This is the principle of compiling multiple C files (and it also allows for combining C with other languages). This process is normally automated, but you should know how it works. The systems that automate it are called build systems, for example Make and CMake. When using e.g. the Make system, the whole codebase can be built with a single command make in the command line.

Some programmers simplify this whole process further so that they don't even need a build system, e.g. with so called header-only libraries, but this is outside the scope of this tutorial.

As a bonus, let's see a few useful compiler flags: -o FILE sets the name of the output file, -c (seen above) compiles without linking, -Wall and -Wextra make the compiler print helpful warnings (highly recommended), -g includes debugging symbols (for debuggers such as gdb) and -O3 turns on optimization for speed (more on optimization near the end of this tutorial).

Advanced Data Types And Variables (Structs, Arrays, Strings)

Until now we've encountered simple data types such as int, char or float. These hold single, atomic values (e.g. numbers or text characters). Such data types are called primitive types.

Above these there exist compound data types (also complex or structured) which are composed of multiple primitive types. They are necessary for any advanced program.

The first compound type is a structure, or struct. It is a collection of several values of potentially different data types (primitive or compound). The following code shows how a struct can be created and used.

#include <stdio.h>

typedef struct
{
  char initial; // initial of name
  int weightKg;
  int heightCm;
} Human;

int bmi(Human human)
{
  return (human.weightKg * 10000) / (human.heightCm * human.heightCm);
}

int main(void)
{
  Human carl;
  
  carl.initial = 'C';
  carl.weightKg = 100;
  carl.heightCm = 180;
  
  if (bmi(carl) > 25)
    puts("Carl is fat.");
    
  return 0;
}

The part of the code starting with typedef struct creates a new data type that we call Human (one convention for data type names is to start them with an uppercase character). This data type is a structure consisting of three members, one of type char and two of type int. Inside the main function we create a variable carl which is of Human data type. Then we set the specific values -- we see that each member of the struct can be accessed using the dot character (.), e.g. carl.weightKg; this can be used just as any other variable. Then we see the type Human being used in the parameter list of the function bmi, just as any other type would be used.

What is this good for? Why don't we just create global variables such as carl_initial, carl_weightKg and carl_heightCm? In this simple case it might work just as well, but in a more complex code this would be burdensome -- imagine we wanted to create 10 variables of type Human (john, becky, arnold, ...). We would have to painstakingly create 30 variables (3 for each person), the function bmi would have to take two parameters (height and weight) instead of one (human) and if we wanted to e.g. add more information about every human (such as hairLength), we would have to manually create another 10 variables and add one parameter to the function bmi, while with a struct we only add one member to the struct definition and create more variables of type Human.

Structs can be nested. So you may see things such as myHouse.groundFloor.livingRoom.ceilingHeight in C code.
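
For example, the nested access just shown may come from definitions like these (a minimal compilable sketch with made up types and members matching that access):

typedef struct
{
  int ceilingHeight; // in centimeters
} Room;

typedef struct
{
  Room livingRoom;
  Room kitchen;
} Floor;

typedef struct
{
  Floor groundFloor;
  Floor firstFloor;
} House;

int main(void)
{
  House myHouse;

  myHouse.groundFloor.livingRoom.ceilingHeight = 250;

  return 0;
}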

Another extremely important compound type is array -- a sequence of items, all of which are of the same data type. Each array is specified with its length (number of items) and the data type of the items. We can have, for instance, an array of 10 ints, or an array of 235 Humans. The important thing is that we can index the array, i.e. we access the individual items of the array by their position, and this position can be specified with a variable. This allows for looping over array items and performing certain operations on each item. Demonstration code follows:

#include <stdio.h>
#include <math.h> // for sqrt()

int main(void)
{
  float vector[5];
  
  vector[0] = 1;
  vector[1] = 2.5;
  vector[2] = 0;
  vector[3] = 1.1;
  vector[4] = -405.054; 
  
  puts("The vector is:");
  
  for (int i = 0; i < 5; ++i)
    printf("%lf ",vector[i]);
  
  putchar('\n'); // newline
  
  /* compute vector length with
     pythagoren theorem: */
  
  float sum = 0;
  
  for (int i = 0; i < 5; ++i)
    sum += vector[i] * vector[i];
  
  printf("Vector length is: %lf\n",sqrt(sum));
  
  return 0;
}

We've included a new library called math.h so that we can use a function for square root (sqrt). (If you have trouble compiling the code, add -lm flag to the compile command.)

float vector[5]; is a declaration of an array of length 5 whose items are of type float. When the compiler sees this, it creates a continuous area in memory big enough to store 5 numbers of float type; the numbers will reside there one after another.

After doing this, we can index the array with square brackets ([ and ]) like this: ARRAY_NAME[INDEX] where ARRAY_NAME is the name of the array (here vector) and INDEX is an expression that evaluates to an integer, starting with 0 and going up to the array length minus one (remember that programmers count from zero). So the first item of the array is at index 0, the second at index 1 etc. The index can be a numeric constant like 3, but also a variable or a whole expression such as x + 3 * myFunction(). An indexed array item can be used just like any other variable, you can assign to it, you can use it in expressions etc. This is seen in the example. Trying to access an item beyond the array's bounds (e.g. vector[100]) will likely crash your program.

Especially important are the parts of code starting with for (int i = 0; i < 5; ++i): this is an iteration over the array. It's a very common pattern that we use whenever we need to perform some action with every item of the array.

Arrays can also be multidimensional, but we won't bother with that right now.

Why are arrays so important? They allow us to work with great amounts of data, not just a handful of numeric variables. We can create an array of a million structs and easily work with all of them thanks to indexing and loops; this would be practically impossible without arrays. Imagine e.g. a game of chess: it would be very silly to have 64 plain variables, one for each square of the board (squareA1, squareA2, ..., squareH8), it would be extremely difficult to work with such code. With an array we can represent the board as a single variable and iterate over all the squares easily etc.
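
For illustration, the chess board just mentioned may be handled like this (a simplified sketch; real chess programs use more refined representations):

char board[64]; // the whole board in one variable, squares a1, b1, ..., h8

int main(void)
{
  for (int i = 0; i < 64; ++i) // clear all squares with a single loop
    board[i] = ' ';

  board[4] = 'K'; // put a king on square e1

  // in general the square in column x and row y is board[y * 8 + x]

  return 0;
}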

One more thing to mention about arrays is how they can be passed to functions. A function can have as a parameter an array of fixed or unknown length. There is also one exception with arrays as opposed to other types: if a function has an array as parameter and the function modifies this array, the array passed to the function (the argument) will be modified as well (we say that arrays are passed by reference while other types are passed by value). We know this wasn't the case with other parameters such as int -- for these the function makes a local copy that doesn't affect the argument passed to the function. The following example shows what's been said:

#include <stdio.h>

// prints an int array of length 10
void printArray10(int array[10])
{
  for (int i = 0; i < 10; ++i)
    printf("%d ",array[i]);
}

// prints an int array of arbitrary length
void printArrayN(int array[], int n)
{
  for (int i = 0; i < n; ++i)
    printf("%d ",array[i]);
}

// fills an array with numbers 0, 1, 2, ...
void fillArrayN(int array[], int n)
{
  for (int i = 0; i < n; ++i)
    array[i] = i;
}

int main(void)
{
  int array10[10];
  int array20[20];
  
  fillArrayN(array10,10);
  fillArrayN(array20,20);
    
  printArray10(array10);
  putchar('\n');
  printArrayN(array20,20);
    
  return 0;
}

The function printArray10 has a fixed length array as a parameter (int array[10]) while printArrayN takes as a parameter an array of unknown length (int array[]) plus one additional parameter to specify this length (so that the function knows how many items of the array it should print). The function fillArrayN is important because it shows how a function can modify an array: when we call fillArrayN(array10,10); in the main function, the array array10 will actually be modified when the function finishes (it will be filled with numbers 0, 1, 2, ...). This can't be done with other data types (though there is a trick involving pointers which we will learn later).

Now let's finally talk about text strings. We've already seen strings (such as "hello"), we know we can print them, but what are they really? A string is a data type, and from C's point of view strings are nothing but arrays of chars (text characters), i.e. sequences of chars in memory. In C every string has to end with a 0 char -- this is NOT '0' (whose ASCII value is 48) but the direct value 0 (remember that chars are really just numbers). The 0 char cannot be printed out, it is just a helper value to terminate strings. So to store a string "hello" in memory we need an array of length at least 6 -- one for each character plus one for the terminating 0. These types of string are called zero terminated strings (or C strings).

When we write a string such as "hello" in our source, the C compiler creates an array in memory for us and fills it with characters 'h', 'e', 'l', 'l', 'o', 0. In memory this may look like a sequence of numbers 104, 101, 108, 108, 111, 0.

Why do we terminate strings with 0? Because functions that work with strings (such as puts or printf) don't know what length the string is. We can call puts("abc"); or puts("abcdefghijk"); -- the string passed to puts has different length in each case, and the function doesn't know this length. But thanks to these strings ending with 0, the function can compute the length, simply by counting characters from the beginning until it finds 0 (or more efficiently it simply prints characters until it finds 0).

The syntax that allows us to create strings with double quotes (") is just a helper (syntactic sugar); we can create strings just as any other array, and we can work with them the same. Let's see an example:

#include <stdio.h>

int main(void)
{
  char alphabet[27]; // 26 places for letters + 1 for terminating 0
  
  for (int i = 0; i < 26; ++i)
    alphabet[i] = 'A' + i;
  
  alphabet[26] = 0; // terminate the string
  
  puts(alphabet);
  
  return 0;
}

alphabet is an array of chars, i.e. a string. Its length is 27 because we need 26 places for letters and one extra space for the terminating 0. Here it's important to remind ourselves that we count from 0, so the alphabet can be indexed from 0 to 26, i.e. 26 is the last index we can use, doing alphabet[27] would be an error! Next we fill the array with letters (see how we can treat chars as numbers and do 'A' + i). We iterate while i < 26, i.e. we fill all the places in the array up to the index 25 (including) and leave the last place (with index 26) empty for the terminating 0, which we subsequently assign. And finally we print the string with puts(alphabet) -- here note that there are no double quotes around alphabet because it's a variable name. Doing puts("alphabet") would cause the program to literally print out alphabet. Now the program outputs:

ABCDEFGHIJKLMNOPQRSTUVWXYZ

In C there is a standard library for working with strings called string (#include <string.h>), it contains such functions as strlen for computing string length or strcmp for comparing strings.
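
A quick sketch of using these two functions (note that strcmp returns 0 when the strings are equal):

#include <stdio.h>
#include <string.h>

int main(void)
{
  char s[] = "catdog";

  printf("length: %d\n",(int) strlen(s)); // prints 6, the terminating 0 is not counted

  if (strcmp(s,"catdog") == 0)
    puts("The strings are equal.");

  return 0;
}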

One final example -- a creature generator -- will show all the three new data types in action:

#include <stdio.h>
#include <stdlib.h> // for rand()

typedef struct
{
  char name[4]; // 3 letter name + 1 place for 0
  int weightKg;
  int legCount;
} Creature; // some weird creature

Creature creatures[100]; // global array of Creatures

void printCreature(Creature c)
{
  printf("Creature named %s ",c.name); // %s prints a string
  printf("(%d kg, ",c.weightKg);
  printf("%d legs)\n",c.legCount);
}

int main(void)
{
  // generate random creatures:
  
  for (int i = 0; i < 100; ++i)
  {
    Creature c;
    
    c.name[0] = 'A' + (rand() % 26);
    c.name[1] = 'a' + (rand() % 26);
    c.name[2] = 'a' + (rand() % 26);
    c.name[3] = 0; // terminate the string

    c.weightKg = 1 + (rand() % 1000); 
    c.legCount = 1 + (rand() % 10); // 1 to 10 legs

    creatures[i] = c;
  }
    
  // print the creatures:
  
  for (int i = 0; i < 100; ++i)
    printCreature(creatures[i]);
  
  return 0;
}

When run you will see a list of 100 randomly generated creatures which may start e.g. as:

Creature named Nwl (916 kg, 4 legs)
Creature named Bmq (650 kg, 2 legs)
Creature named Cda (60 kg, 4 legs)
Creature named Owk (173 kg, 7 legs)
Creature named Hid (430 kg, 3 legs)
...

Macros/Preprocessor

The C language comes with a feature called preprocessor which is necessary for some advanced things. It allows automated modification of the source code before it is compiled.

Remember how we said that compiler compiles C programs in several steps such as generating object files and linking? There is one more step we didn't mention: preprocessing. It is the very first step -- the source code you give to the compiler first goes to the preprocessor which modifies it according to special commands in the source code called preprocessor directives. The result of preprocessing is a pure C code without any more preprocessing directives, and this is handed over to the actual compilation.

The preprocessor is like a mini language on top of the C language, it has its own commands and rules, but it's much more simple than C itself, for example it has no data types or loops.

Each directive begins with #, is followed by the directive name and continues until the end of the line (\ can be used to extend the directive to the next line).

We have already encountered one preprocessor directive: the #include directive when we included library header files. This directive pastes the text of the given file at the place of the directive.

Another directive is #define which creates a so called macro -- in its basic form a macro is nothing else than an alias, a nickname for some text. This is used to create constants. Consider the following code:

#include <stdio.h>

#define ARRAY_SIZE 10

int array[ARRAY_SIZE];

void fillArray(void)
{
  for (int i = 0; i < ARRAY_SIZE; ++i)
    array[i] = i;
}

void printArray(void)
{
  for (int i = 0; i < ARRAY_SIZE; ++i)
    printf("%d ",array[i]);
}

int main()
{
  fillArray();
  printArray();
  return 0;
}

#define ARRAY_SIZE 10 creates a macro that can be seen as a constant named ARRAY_SIZE which stands for 10. From this line on any occurrence of ARRAY_SIZE that the preprocessor encounters in the code will be replaced with 10. The reason for doing this is obvious -- we respect the DRY (don't repeat yourself) principle: if we didn't use a constant for the array size and used the direct numeric value 10 in different parts of the code, it would be difficult to change them all later, especially in a very long code where there's a danger we'd miss some. With a constant it is enough to change one line in the code (e.g. #define ARRAY_SIZE 10 to #define ARRAY_SIZE 20).

The macro substitution is literally a copy-paste text replacement, there is nothing very complex going on. This means you can create a nickname for almost anything (for example you could do #define when if and then also use when in place of if -- but it's probably not a very good idea). By convention macro names are to be ALL_UPPER_CASE (so that whenever you see an all upper case word in the source code, you know it's a macro).

Macros can optionally take parameters similarly to functions. There are no data types, just parameter names. The usage is demonstrated by the following code:

#include <stdio.h>

#define MEAN3(a,b,c) (((a) + (b) + (c)) / 3) 

int main()
{
  int n = MEAN3(10,20,25);
  
  printf("%d\n",n);
    
  return 0;
}

MEAN3 computes the mean of 3 values. Again, it's just text replacement, so the line int n = MEAN3(10,20,25); becomes int n = (((10) + (20) + (25)) / 3); before code compilation. Why are there so many brackets in the macro? It's always good to put brackets around the whole macro body and around each of its parameters because the parameters are again a simple text replacement; consider e.g. a macro #define HALF(x) x / 2 -- if it was invoked as HALF(5 + 1), the substitution would result in the final text 5 + 1 / 2, which gives 5 (instead of the intended value 3).

You may be asking why would we use a macro when we can use a function for computing the mean? Firstly macros don't just have to work with numbers, they can be used to generate parts of the source code in ways that functions can't. Secondly using a macro may sometimes be simpler, it's shorter and will be faster to execute because there is no function call (which has a slight overhead) and because the macro expansion may lead to the compiler precomputing expressions at compile time. But beware: macros are usually worse than functions and should only be used in very justified cases. For example macros don't know about data types and cannot check them, and they also result in a bigger compiled executable (function code is in the executable only once whereas the macro is expanded in each place where it is used and so the code it generates multiplies).
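
As an example of what functions can't easily do, the following macro works with ints, floats and doubles alike, precisely because it's just text replacement (a small sketch):

#include <stdio.h>

// a function would have to pick one fixed parameter type
#define MAX(a,b) ((a) > (b) ? (a) : (b))

int main(void)
{
  printf("%d\n",MAX(3,7));      // prints 7
  printf("%lf\n",MAX(0.5,2.3)); // prints 2.300000

  return 0;
}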

Another very useful directive is #if for conditional inclusion or exclusion of parts of the source code. It is similar to the C if command. The following example shows its use:

#include <stdio.h>

#define RUDE 0

void printNumber(int x)
{
  puts(
#if RUDE
    "You idiot, the number is:"
#else
    "The number is:"
#endif
  );
  
  printf("%d\n",x);
}

int main()
{
  printNumber(3);
  printNumber(100);
  
#if RUDE
  puts("Bye bitch.");
#endif
    
  return 0;
}

When run, we get the output:

The number is:
3
The number is:
100

And if we change #define RUDE 0 to #define RUDE 1, we get:

You idiot, the number is:
3
You idiot, the number is:
100
Bye bitch.

We see the #if directive has to have a corresponding #endif directive that terminates it, and there can be an optional #else directive for an else branch. The condition after #if can use similar operators as those in C itself (+, ==, &&, || etc.). There also exists an #ifdef directive which is used the same and checks if a macro of given name has been defined.

#if directives are very useful for conditional compilation, they allow for creation of various "settings" and parameters that can fine-tune a program -- you may turn specific features on and off with this directive. It is also helpful for portability; compilers may automatically define specific macros depending on the platform (e.g. _WIN64, __APPLE__, ...) based on which you can trigger different code. E.g.:

#ifdef _WIN64
  puts("Your OS sucks.");
#endif

Let us talk about one more thing that doesn't fall under the preprocessor language but is related to constants: enumerations. Enumeration is a data type that can have values that we specify individually, for example:

typedef enum
{
  APPLE,
  PEAR,
  TOMATO
} Fruit;

This creates a new data type Fruit. Variables of this type may have values APPLE, PEAR or TOMATO, so we may for example do Fruit myFruit = APPLE;. These values are in fact integers and the names we give them are just nicknames, so here APPLE is equal to 0, PEAR to 1 and TOMATO to 2.
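
A small self-contained example of using such an enumeration:

#include <stdio.h>

typedef enum
{
  APPLE,
  PEAR,
  TOMATO
} Fruit;

int main(void)
{
  Fruit myFruit = PEAR;

  if (myFruit == PEAR)
    puts("We have a pear.");

  printf("%d\n",myFruit); // prints 1, the underlying integer

  return 0;
}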

Pointers

Pointers are an advanced topic that many people fear -- many complain they're hard to learn, others complain about memory unsafety and potential dangers of using pointers. These people are stupid, pointers are great.

But beware, there may be too much new information in the first read. Don't get scared, give it some time.

Pointers allow us to do certain advanced things such as allocate dynamic memory, return multiple values from functions, inspect content of memory or use functions in similar ways in which we use variables.

A pointer is essentially nothing complicated: it is a data type that can hold a memory address (plus information about what data type should be stored at that address). An address is simply a number. Why can't we just use an int to store a memory address? Because the size of int and the size of a pointer may differ; the size of a pointer depends on the platform's address width. Besides this, as said, a pointer actually holds not only an address but also the information about the type it points to, which is a safety mechanism that will become clear later. It is also good when the compiler knows a certain variable is supposed to point to memory rather than to hold a generic number -- this can all prevent bugs. I.e. pointers and generic integers are distinguished for the same reason other data types are distinguished -- in theory they don't have to be distinguished, but it's safer.

It is important to stress again that a pointer is not a pure address but it also knows about the data type it is pointing to, so there are many kinds of pointers: a pointer to int, a pointer to char, a pointer to a specific struct type etc.

A variable of pointer type is created similarly to a normal variable, we just add * after the data type, for example int *x; creates a variable named x that is a pointer to int (some people would write this as int* x;).

But how do we assign a value to the pointer? To do this, we need an address of something, e.g. of some variable. To get an address of a variable we use the & character, i.e. &a is the address of a variable a.

The last basic thing we need to know is how to dereference a pointer. Dereferencing means accessing the value at the address that's stored in the pointer, i.e. working with the pointed to value. This is again done (maybe a bit confusingly) with * character in front of a pointer, e.g. if x is a pointer to int, *x is the int value to which the pointer is pointing. An example can perhaps make it clearer.

#include <stdio.h>

int main(void)
{
  int normalVariable = 10;
  int *pointer;
  
  pointer = &normalVariable;
  
  printf("address in pointer: %p\n",pointer);
  printf("value at this address: %d\n",*pointer);
  
  *pointer = *pointer + 10;
  
  printf("normalVariable: %d\n",normalVariable);
  
  return 0;
}

This may print e.g.:

address in pointer: 0x7fff226fe2ec
value at this address: 10
normalVariable: 20

int *pointer; creates a pointer to int with name pointer. Next we make the pointer point to the variable normalVariable, i.e. we get the address of the variable with &normalVariable and assign it normally to pointer. Then we print first the address stored in the pointer (accessed simply as pointer) and then the value at this address, for which we use the dereference *pointer. On the next line we see that we can also use dereference for writing to the pointed address, i.e. doing *pointer = *pointer + 10; here is the same as doing normalVariable = normalVariable + 10;. The last line shows that the value in normalVariable has indeed changed.

IMPORTANT NOTE: You generally cannot read and write from/to random addresses! This will crash your program. To be able to write to a certain address it must be allocated, i.e. reserved for use. Addresses of variables are allocated by the compiler and can be safely operated with.

There's a special value called NULL (a macro defined in the standard library) that is meant to be assigned to a pointer that points to "nothing". So when we have a pointer p that's currently not supposed to point to anything, we do p = NULL;. In safe code we should always check (with if) whether a pointer is not NULL before dereferencing it, and if it is, then NOT dereference it. This isn't required but is considered a "good practice" in safe code, storing NULL in pointers that point nowhere prevents dereferencing random or unallocated addresses which would crash the program.
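
In code the check may look e.g. like this (a minimal sketch):

#include <stdio.h>

int main(void)
{
  int *p = NULL; // the pointer currently points nowhere

  if (p != NULL)
    printf("%d\n",*p); // only dereference non-NULL pointers
  else
    puts("The pointer points nowhere.");

  return 0;
}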

But what can pointers be good for? Many things, for example we can kind of "store variables in variables", i.e. a pointer is a variable which says which variable we are now using, and we can switch between variables any time. E.g.:

#include <stdio.h>

int bankAccountMonica = 1000;
int bankAccountBob = -550;
int bankAccountJose = 700;

int *payingAccount; // pointer to who's currently paying

void payBills(void)
{
  *payingAccount -= 200;
}

void buyFood(void)
{
  *payingAccount -= 50;
}

void buyGas(void)
{
  *payingAccount -= 20;
}

int main(void)
{
  // let Jose pay first
  
  payingAccount = &bankAccountJose;
  
  payBills();
  buyFood();
  buyGas();
    
  // that's enough, now let Monica pay 

  payingAccount = &bankAccountMonica;

  buyFood();
  buyGas();
  buyFood();
  buyFood();
    
  // now it's Bob's turn
  
  payingAccount = &bankAccountBob;
  
  payBills();
  buyFood();
  buyFood();
  buyGas();
    
  printf("Monika has $%d left.\n",bankAccountMonica);
  printf("Jose has $%d left.\n",bankAccountJose);
  printf("Bob has $%d left.\n",bankAccountBob);
    
  return 0;
}

Well, this could be similarly achieved with arrays, but pointers have more uses. For example they allow us to return multiple values from a function. Again, remember that we said that (with the exception of arrays) a function cannot modify a variable passed to it because it always makes its own local copy of it? We can bypass this by, instead of giving the function the value of the variable, giving it the address of the variable. The function can read the value of that variable (with dereference) but it can also CHANGE the value, it simply writes a new value to that address (again, using dereference). This example shows it:

#include <stdio.h>
#include <math.h>

#define PI 3.141592

// returns 2D coordinates of a point on a unit circle
void getUnitCirclePoint(float angle, float *x, float *y)
{
  *x = sin(angle);
  *y = cos(angle);
}

int main(void)
{
  for (int i = 0; i < 8; ++i)
  {
    float pointX, pointY;
    
    getUnitCirclePoint(i * 0.125 * 2 * PI,&pointX,&pointY);
    
    printf("%lf %lf\n",pointX,pointY);
  }
    
  return 0;
}

Function getUnitCirclePoint doesn't return any value in the strict sense, but thanks to pointers it effectively returns two float values via its parameters x and y. These parameters are of the data type pointer to float (as there's * in front of them). When we call the function with getUnitCirclePoint(i * 0.125 * 2 * PI,&pointX,&pointY);, we hand over the addresses of the variables pointX and pointY (which belong to the main function and couldn't normally be accessed in getUnitCirclePoint). The function can then compute values and write them to these addresses (with dereference, *x and *y), changing the values in pointX and pointY, effectively returning two values.

Now let's take a look at pointers to structs. Everything basically works the same here, but there's one thing to know about, a syntactic sugar known as an arrow (->). Example:

#include <stdio.h>

typedef struct
{
  int a;
  int b;
} SomeStruct;

SomeStruct s;
SomeStruct *sPointer;

int main(void)
{
  sPointer = &s;
  
  (*sPointer).a = 10; // without arrow
  sPointer->b = 20;   // same as (*sPointer).b = 20
    
  printf("%d\n",s.a);
  printf("%d\n",s.b);
    
  return 0;
}

Here we are trying to write values to a struct through pointers. Without using the arrow we can simply dereference the pointer with *, put brackets around and access the member of the struct normally. This shows the line (*sPointer).a = 10;. Using an arrow achieves the same thing but is perhaps a bit more readable, as seen in the line sPointer->b = 20;. The arrow is simply a special shorthand and doesn't need any brackets.

Now let's talk about arrays -- these are a bit special. The important thing is that an array is itself basically a pointer. What does this mean? If we create an array, let's say int myArray[10];, then myArray is basically a pointer to int in which the address of the first array item is stored. When we index the array, e.g. like myArray[3] = 1;, behind the scenes there is basically a dereference because the index 3 means: 3 places after the address pointed to by myArray. So when we index an array, the compiler takes the address stored in myArray (the address of the array start) and adds 3 to it (well, kind of) by which it gets the address of the item we want to access, and then dereferences this address.

Arrays and pointers are kind of a duality -- we can also use array indexing with pointers. For example if we have a pointer declared as int *x;, we can access the value x points to with a dereference (*x), but ALSO with indexing like this: x[0]. Accessing index 0 simply means: take the value stored in the variable and add 0 to it, then dereference it. So it achieves the same thing. We can also use higher indices (e.g. x[10]), BUT ONLY if x actually points to a memory that has at least 11 allocated places.

This leads to a concept called pointer arithmetic. Pointer arithmetic simply means we can add or subtract numbers to/from pointer values. If we continue with the same pointer as above (int *x;), we can actually add numbers to it like *(x + 1) = 10;. What does this mean?! It means exactly the same thing as x[1]. Adding a number to a pointer shifts that pointer a given number of places forward. We use the word places because each data type takes up a different amount of space in memory, for example char takes one byte of memory while int usually takes 4 (but not always), so shifting a pointer by N places means adding N times the size of the pointed to data type to the address stored in the pointer.

This may be a lot of information to digest. Let's provide an example to show all this in practice:

#include <stdio.h>

// our own string print function
void printString(char *s)
{
  int position = 0;
  
  while (s[position] != 0)
  {
    putchar(s[position]);
    position += 1;
  }
}

// returns the length of string s
int stringLength(char *s)
{
  int length = 0;
    
  while (*s != 0) // count until terminating 0
  {
    length += 1;
    s += 1; // shift the pointer one character to right
  }
  
  return length;
}

int main(void)
{
  char testString[] = "catdog";
  
  printString("The string '");
  printString(testString);
  printString("' has length ");
  
  int l = stringLength(testString);
  
  printf("%d.",l);

  return 0;
}

The output is:

The string 'catdog' has length 6.

We've created a function for printing strings (printString) similar to puts and a function for computing the length of a string (stringLength). They both take as an argument a pointer to char, i.e. a string. In printString we use indexing ([ and ]) just as if s was an array, and indeed we see it works! In stringLength we similarly iterate over all characters in the string but we use dereference (*s) and pointer arithmetic (s += 1;). It doesn't matter which of the two styles we choose -- here we've shown both, for educational purposes. Finally notice that the string we actually work with is created in main as an array with char testString[] = "catdog"; -- here we don't need to specify the array size between [ and ] because we immediately assign a string literal to it ("catdog") and in such a case the compiler knows how big the array needs to be and automatically fills in the correct size.

Now that we know about pointers, we can finally completely explain the functions from stdio we've been using: printf and puts take as their first argument a pointer to char, i.e. a string (this is how strings are passed around in C), and scanf takes the addresses of variables (that's why we write e.g. &myVariable) so that it can write the values it reads into them -- the same trick of "returning" values through parameters that we've seen above.

Files

Now we'll take a look at how we can read and write from/to files on the computer disk which enables us to store information permanently or potentially process data such as images or audio. Files aren't so difficult.

We work with files through functions provided in the stdio library (so it has to be included). We distinguish two types of files: text files, which contain human readable text (e.g. C source code), and binary files, which contain raw bytes (e.g. images or compiled programs).

From the programmer's point of view there's actually not a huge difference between the two, they're both just sequences of characters or bytes (which are kind of almost the same). Text files are a little more abstract, they handle potentially different formats of newlines etc. The main thing for us is that we'll use slightly different functions for each type.

There is a special data type for file called FILE (we'll be using a pointer to it). Whatever file we work with, we need to firstly open it with the function fopen and when we're done with it, we need to close it with a function fclose.

First we'll write something to a text file:

#include <stdio.h>

int main(void)
{
  FILE *textFile = fopen("test.txt","w"); // "w" for write

  if (textFile != NULL) // if opened successfully
  {
    fprintf(textFile,"Hello file.");
    fclose(textFile); // mustn't forget to close!
  }
  else
    puts("ERROR: Couldn't open file.");

  return 0;
}

When run, the program should create a new file named test.txt in the same directory we're in and in it you should find the text Hello file.. FILE *textFile creates a new variable textFile which is a pointer to the FILE data type. We are using a pointer simply because the standard library is designed this way, its functions work with pointers (it can be more efficient). fopen("test.txt","w"); attempts to open the file test.txt in text mode for writing -- it returns a pointer that represents the opened file. The mode, i.e. text/binary, read/write etc., is specified by the second argument: "w"; w simply specifies write and the text mode is implicit (it doesn't have to be specified). if (textFile != NULL) checks if the file has been successfully opened; the function fopen returns NULL (the value of "point to nothing" pointers) if there was an error with opening the file (such as that the file doesn't exist). On success we write text to the file with a function fprintf -- it's basically the same as printf but works on files, so its first parameter is always a pointer to the file to which it should write. You can of course also print numbers and anything that printf can with this function. Finally we mustn't forget to close the file with fclose once we're done with it -- note we only call fclose when the file was actually opened, closing a NULL pointer could crash the program!

Now let's write another program that reads the file we've just created and writes its content out in the command line:

#include <stdio.h>

int main(void)
{
  FILE *textFile = fopen("test.txt","r"); // "r" for read

  if (textFile != NULL) // if opened successfully
  {
    char c;

    while (fscanf(textFile,"%c",&c) != EOF) // while not end of file
      putchar(c);

    fclose(textFile); // close the file when done reading
  }
  else
    puts("ERROR: Couldn't open file.");

  return 0;
}

Notice that in fopen we now specify "r" (read) as a mode. Again, we check if the file has been opened successfully (if (textFile != NULL)). If so, we use a while loop to read and print all characters from the file until we encounter the end of file. The reading of file characters is done with the fscanf function inside the loop's condition -- there's nothing preventing us from doing this. fscanf again works the same as scanf (so it can read other types than only chars), just on files (its first argument is the file to read from). On encountering end of file fscanf returns a special value EOF (which is a macro constant defined in the standard library). Again, we must close the file with fclose once we're done reading.

We will now write to a binary file:

#include <stdio.h>

int main(void)
{
  unsigned char image[] = // image in ppm format
  { 
    80, 54, 32, 53, 32, 53, 32, 50, 53, 53, 32,
    255,255,255, 255,255,255, 255,255,255, 255,255,255, 255,255,255,
    255,255,255,    0, 0,  0, 255,255,255,   0,  0,  0, 255,255,255,
    255,255,255, 255,255,255, 255,255,255, 255,255,255, 255,255,255,
      0,  0,  0, 255,255,255, 255,255,255, 255,255,255,   0,  0,  0,
    255,255,255,   0,  0,  0,   0,  0,  0,   0,  0,  0, 255,255,255  
  };

  FILE *binFile = fopen("image.ppm","wb");

  if (binFile != NULL) // if opened successfully
  {
    fwrite(image,1,sizeof(image),binFile);
    fclose(binFile);
  }
  else
    puts("ERROR: Couldn't open file.");

  return 0;
}

Okay, don't get scared, this example looks complex because it is trying to do a cool thing: it creates an image file! When run, it should produce a file named image.ppm which is a tiny 5x5 smiley face image in ppm format. You should be able to open the image in any good viewer (I wouldn't bet on Windows programs though). The image data were made manually and are stored in the image array. We don't need to understand the data, we just know we have some data we want to write to a file. Notice how we can manually initialize the array with values using { and } brackets. We open the file for writing and in binary mode, i.e. with the mode "wb", we check the success of the action and then write the whole array into the file with one function call. The function is named fwrite and is used for writing to binary files (as opposed to fprintf for text files). fwrite takes these parameters: pointer to the data to be written to the file, size of one data element (in bytes), number of data elements and a pointer to the file to write to. Our data is the image array and since "arrays are basically pointers", we provide it as the first argument. The next argument is 1 (unsigned char always takes 1 byte), then the length of our array (sizeof is a special operator that substitutes the size of a variable in bytes -- since each item in our array takes 1 byte, sizeof(image) provides the number of items in the array), and the file pointer. At the end we close the file.

And finally we'll finish with reading this binary file back:

#include <stdio.h>

int main(void)
{
  FILE *binFile = fopen("image.ppm","rb");

  if (binFile != NULL) // if opened successfully
  {
    unsigned char byte;

    while (fread(&byte,1,1,binFile))
      printf("%d ",byte);

    putchar('\n');
    fclose(binFile); // close the file when done
  }
  else
    puts("ERROR: Couldn't open file.");

  return 0;
}

The file mode is now "rb" (read binary). For reading from binary files we use the fread function, similarly to how we used fscanf for reading from a text file. fread has these parameters: pointer where to store the read data (the memory must have sufficient space allocated!), size of one data item, number of items to read and the pointer to the file from which to read. As the first argument we pass &byte, i.e. the address of the variable byte, next 1 (we want to read a single byte whose size in bytes is 1), 1 (we want to read one byte) and the file pointer. fread returns the number of items read, so the while condition holds as long as fread reads bytes; once we reach end of file, fread can no longer read anything and returns 0 (which in C is interpreted as a false value) and the loop ends. Again, we must close the file when done.

More On Functions (Recursion, Function Pointers)

There's more to be known about functions.

An important concept in programming is recursion -- the situation in which a function calls itself. Yes, it is possible, but some rules have to be followed.

When a function calls itself, we have to ensure that we won't end up in infinite recursion (i.e. the function calls itself which subsequently calls itself and so on until infinity). This crashes our program. There always has to be a terminating condition in a recursive function, i.e. an if branch that will eventually stop the function from calling itself again.

But what is this even good for? Recursion is actually very common in math and programming, many problems are recursive in nature. Many things are beautifully described with recursion (e.g. fractals). But remember: anything recursion can achieve can also be achieved by iteration (a loop) and vice versa. It's just that sometimes one is more elegant or more computationally efficient.

Let's see this with a typical example: the mathematical function called factorial. The factorial of N is defined as N x (N - 1) x (N - 2) x ... x 1. It can also be defined recursively: the factorial of N is 1 if N is 0, otherwise it is N times the factorial of N - 1. Here is some code:

#include <stdio.h>

unsigned int factorialRecursive(unsigned int x)
{
  if (x == 0) // terminating condition
    return 1;
  else
    return x * factorialRecursive(x - 1);
}

unsigned int factorialIterative(unsigned int x)
{
  unsigned int result = 1;
    
  while (x > 1)
  {
    result *= x;
    x--;  
  }
  
  return result;
}

int main(void)
{
  printf("%d %d\n",factorialRecursive(5),factorialIterative(5));
  return 0;
}

factorialIterative computes the factorial by iteration. factorialRecursive uses recursion -- it calls itself. The important thing is the recursion is guaranteed to end because every time the function calls itself, it passes a decremented argument so at one point the function will receive 0 in which case the terminating condition (if (x == 0)) will be triggered which will avoid the further recursive call.

It should be mentioned that performance-wise recursion is almost always worse than iteration (function calls have certain overhead), so in practice it is used sparingly. But in some cases it is very well justified (e.g. when it makes code much simpler while creating unnoticeable performance loss).

Another thing to mention is that we can have pointers to functions; this is an advanced topic so we'll only touch on it briefly. Function pointers are pretty powerful, they allow us to create so called callbacks: imagine we are using some GUI framework and we want to tell it what should happen when a user clicks on a specific button -- this is usually done by giving the framework a pointer to our custom function that the framework will then call whenever the button is clicked.
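
Function pointer syntax is a bit cryptic, so here is at least a minimal sketch (with made up functions) showing the basic idea:

#include <stdio.h>

int doubleNumber(int x)
{
  return 2 * x;
}

int squareNumber(int x)
{
  return x * x;
}

int main(void)
{
  int (*operation)(int); // pointer to a function taking and returning int

  operation = doubleNumber;
  printf("%d\n",operation(10)); // prints 20

  operation = squareNumber;
  printf("%d\n",operation(10)); // prints 100

  return 0;
}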

Dynamic Allocation (Malloc)

Dynamic memory allocation means the possibility of reserving additional memory (RAM) for our program at run time, whenever we need it. This is opposed to static memory allocation, i.e. reserving memory for use at compile time (when compiling, before the program runs). We've already been doing static allocation whenever we created a variable -- the compiler automatically reserves as much memory for our variables as is needed. But what if we're writing a program but don't yet know how much memory it will need? Maybe the program will be reading a file but we don't know how big that file is going to be -- how much memory should we reserve? Dynamic allocation allows us to reserve this memory with functions when the program is actually running and already knows how much of it should be reserved.

It must be known that dynamic allocation comes with a new kind of bug known as a memory leak. It happens when we reserve memory and forget to free it after we no longer need it. If this happens e.g. in a loop, the program will continue to "grow", eating more and more RAM until the operating system has no more to give. For this reason, as well as others such as simplicity, it may sometimes be better to go with only static allocation.

Anyway, let's see how we can allocate memory if we need to. We use mostly just two functions that are provided by the stdlib library. One is malloc which takes as an argument the size of the memory we want to allocate (reserve) in bytes and returns a pointer to this allocated memory if successful or NULL if the memory couldn't be allocated (which in serious programs we should always check). The other function is free which frees the memory when we no longer need it (every allocated memory should be freed at some point) -- it takes as the only parameter a pointer to the memory we've previously allocated. There is also another function called realloc which serves to change the size of an already allocated memory: it takes a pointer to the allocated memory and the new size in bytes, and returns the pointer to the resized memory.

Here is an example:

#include <stdio.h>
#include <stdlib.h>

#define ALLOCATION_CHUNK 32 // by how many bytes to resize

int main(void)
{
  int charsRead = 0;
  int resized = 0; // how many times we called realloc
  char *inputChars = malloc(ALLOCATION_CHUNK * sizeof(char)); // first allocation

  while (1) // read input characters
  {
    char c = getchar();
 
    charsRead++;

    if ((charsRead % ALLOCATION_CHUNK) == 0)
    {
      inputChars = // we need more space, resize the array
        realloc(inputChars,(charsRead / ALLOCATION_CHUNK + 1) * ALLOCATION_CHUNK * sizeof(char));
        
      resized++;
    }

    inputChars[charsRead - 1] = c;
   
    if (c == '\n')
    {
      charsRead--; // don't count the last newline character
      break;
    }
  }
  
  puts("The string you entered backwards:");
  
  while (charsRead > 0)
  {
    putchar(inputChars[charsRead - 1]);
    charsRead--;
  }

  free(inputChars); // important!
  
  putchar('\n');
  printf("I had to resize the input buffer %d times.",resized);
    
  return 0;
}

This code reads characters from the input and stores them in an array (inputChars) -- the array is dynamically resized if more characters are needed. (We refrain from calling the array inputChars a string because we never terminate it with 0, so we couldn't print it with standard functions like puts.) At the end the entered characters are printed backwards (to prove we really stored all of them), and we print out how many times we needed to resize the array.

We define a constant (macro) ALLOCATION_CHUNK that says by how many characters we'll be resizing our character buffer. I.e. at the beginning we create a character buffer of size ALLOCATION_CHUNK and start reading input character into it. Once it fills up we resize the buffer by another ALLOCATION_CHUNK characters and so on. We could be resizing the buffer by single characters but that's usually inefficient (the function malloc may be quite complex and take some time to execute).

The line starting with char *inputChars = malloc(... creates a pointer to char -- our character buffer -- to which we assign a chunk of memory allocated with malloc. Its size is ALLOCATION_CHUNK * sizeof(char). Note that for simplicity we don't check if inputChars is not NULL, i.e. whether the allocation succeeded -- but in your program you should do it :) Then we enter the character reading loop inside which we check if the buffer has filled up (if ((charsRead % ALLOCATION_CHUNK) == 0)). If so, we use the realloc function to increase the size of the character buffer. The important thing is that once we exit the loop and print the characters stored in the buffer, we free the memory with free(inputChars); as we no longer need it.

Debugging, Optimization

Debugging means localizing and fixing bugs (errors) in your program. In practice there are always bugs, even in very short programs (you've probably already figured that out yourself), some small and insignificant and some pretty bad ones that make your program unusable, vulnerable or even dangerous.

There are two kinds of bugs: syntactic errors and semantic errors. A syntactic error is when you write something not obeying the C grammar, it's like a typo or grammatical error in a normal language -- these errors are very easy to detect and fix, a compiler won't be able to understand your program and will point you to the exact place where the error occurs. A semantic error can be much worse -- it's a logical error in the program; the program will compile and run but the program will behave differently than intended. The program may crash, leak memory, give wrong results, run slowly, corrupt files etc. These errors may be hard to spot and fix, especially when they happen in rare situations. We'll be only considering semantic errors from now on.

If we spot a bug, how do we fix it? The first thing is to find a way to replicate it, i.e. find the exact steps we need to make with the program to make the bug appear (e.g. "in the menu press keys A and B simultaneously", ...). Next we need to trace and locate which exact line or piece of code is causing the bug. This can be done with the help of specialized debuggers such as gdb or valgrind, but there's usually a much easier way: using printing functions such as printf. (Still do check out the above mentioned debuggers, they're very helpful.)

Let's say your program crashes and you don't know at which line. You simply put prints such as printf("A\n"); and printf("B\n"); at the beginning and end of a code you suspect might be causing the crash. Then you run the program: if A is printed but B isn't, you know the crash happened somewhere between the two prints, so you shift the B print a little bit up and so on until you find exactly after which line B stops printing -- this is the line that crashes the program. IMPORTANT NOTE: the prints have to have a newline (\n) at the end, otherwise this method may not work because of output buffering.
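
For example (a tiny sketch with a hypothetical buggy function):

#include <stdio.h>

void suspiciousFunction(void) // stands for whatever code we suspect
{
  int *p = NULL;
  *p = 1; // bug: writing to a NULL pointer crashes the program
}

int main(void)
{
  printf("A\n"); // this gets printed

  suspiciousFunction();

  printf("B\n"); // never printed => the crash is between the two prints

  return 0;
}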

Of course, you may use the prints in other ways, for example to detect at which place a value of variable changes to a wrong value. (Asserts are also good for keeping an eye on correct values of variables.)

What if the program isn't exactly crashing but is giving wrong results? Then you need to trace the program step by step (not exactly line by line, but maybe function by function) and check which step has a problem in it. If for example your game AI is behaving stupid, you firstly check (with prints) if it correctly detects its circumstances, then you check whether it makes the correct decision based on the circumstances, then you check whether the pathfinding algorithm finds the correct path etc. At each step you need to know what the correct behavior should be and you try to find where the behavior is broken.

Knowing how to fix a bug isn't everything, we also need to find the bugs in the first place. Testing is the process of trying to find bugs by simply running and using the program. Remember, testing can't prove there are no bugs in the program, it can only prove bugs exist. You can do testing manually or automate the tests. Automated tests are very important for preventing so called regressions (so the tests are called regression tests). Regression happens when during further development you break some of the program's already working features (it is very common, don't think it won't be happening to you). A regression test (which can simply be a normal C program) automatically checks whether the already implemented functions still give the same results as before (e.g. if sin(0) = 0 etc.). These tests should be run and pass before releasing any new version of the program (or even before any commit of new code).
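
Such a test can be as simple as this (a sketch reusing the square library from the compilation example above):

#include <stdio.h>
#include "library.h"

int main(void)
{
  if (square(5) != 25 || square(0) != 0 || square(-3) != 9)
  {
    puts("TESTS FAILED");
    return 1; // nonzero return value can signal the failure e.g. to build scripts
  }

  puts("all tests passed");
  return 0;
}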

Optimization is also a process of improving an already working program, but here we try to make the program more efficient -- the most common goal is to make the program faster, smaller or consume less RAM. This can be a very complex task, so we'll only mention it briefly.

The very basic thing we can do is to turn on automatic optimization with a compiler flag: -O3 for speed, -Os for program size (-O2 and -O1 are less aggressive speed optimizations). Yes, it's that simple, you simply add -O3 and your program gets magically faster. Remember that optimizations against different resources are often antagonistic, i.e. speeding up your program typically makes it consume more memory and vice versa. You need to choose. Optimizing manually is a great art. Let's suppose you are optimizing for speed -- the first, most important thing is to locate the part of code that's slowing down your program the most, the so called bottleneck. That is the code you want to make faster. Trying to optimize non-bottlenecks doesn't speed up your program as a whole much; imagine you make a part of the code that takes 1% of total execution time twice as fast -- your program as a whole will only get 0.5% faster. Bottlenecks can be found using profiling -- measuring the execution time of different parts of the program (e.g. each function). This can be done manually or with tools such as gprof. Once you know where to optimize, you try to apply different techniques: using algorithms with better time complexity, using look up tables, optimizing cache behavior and so on. This is beyond the scope of this tutorial.

Final Program

TODO

Where To Go Next

We haven't nearly covered the whole of C, but you should have pretty solid basics now. Now you just have to go and write a lot of C programs, that's the only way to truly master C. WARNING: Do not start with an ambitious project such as a 3D game. You won't make it and you'll get demotivated. Start very simple (a Tetris clone perhaps?).

You should definitely learn about common data structures (linked lists, binary trees, hash tables, ...) and algorithms (sorting, searching, ...). As an advanced programmer you should definitely know a bit about memory management. Also take a look at basic licensing. Another thing to learn is some version control system, preferably git, because this is how we manage bigger programs and how we collaborate on them. To start making graphical programs you should get familiar with some library such as SDL.

A great amount of experience can be gained by contributing to some existing project, collaboration really boosts your skill and knowledge of the language. This should only be done when you're at least intermediate. Firstly look up a nice project on some git hosting site, then take a look at the bug tracker and pick a bug or feature that's easy to fix or implement (low hanging fruit).


culture

Culture

TODO

Culture is more important than legislation and the judiciary ("official laws") as culture is the strongest force defining how we live the majority of our lives, what actions we take and how they are judged by others; courts, police and prison step in only in the most extreme cases. You do many more things (such as eating meat, cutting your hair, watching TV or living with a single partner) because of culture, not because you are obliged by law. There aren't even enough policemen to guarantee law enforcement in all cases and all states rely (basically not having any other choice) on culture doing most of the job in keeping society working (which is also exploited by states and corporations when they try to manipulate culture with propaganda rather than changing laws). Consider for example that you download a random photo from the internet and set it as a wallpaper on your computer -- officially you have committed the crime of piracy as you had no rights for downloading the image, however culturally no one sees this as harmful, no one is going to bully you or sue you, and even if someone tried to sue you, no judge would actually punish such a laughable "crime". On the other hand if you do a legal but culturally unacceptable thing, such as making a public art exhibition of non-sexual photos of naked children (notice this might have been culturally OK in the previous century, but not now), you will be culturally punished by everyone distancing themselves from you and someone perhaps even illegally physically attacking you. A sentence such as "black people aren't as intelligent as white people", spoken half a century ago, may nowadays be judged by a court in a much different way just by the context of today's culture -- even under the same set of laws you would not have been convicted of a crime in the past while nowadays you would, as legal terms are eventually at some level defined in plain language, which is permeated by culture. Therefore in trying to change society we should remember two things:

  1. Focus on laws is a short term necessary evil.
  2. Focus on culture (and eventual elimination of law as such) is our long term goal.

Types Of Culture

Here are some notable, named cultural patterns:


data_hoarding

Data Hoarding

TODO: is it based or is it a disease? Hoarding data on paper (books, encyclopedias, ...) is probably good.


data_structure

Data Structure

Data structure refers to any specific way in which data is organized in computer memory. A specific data structure describes such things as order, relationships (interconnection, hierarchy, ...), formats and types of parts of the data. Programming is sometimes seen as consisting mainly of two things: design of algorithms and of the data structures these algorithms work with.

As a programmer dealing with a specific problem you oftentimes have a choice of multiple data structures -- choosing the right one is essential for performance and efficiency of your program. As with everything, each data structure has its advantages and downsides; some are faster, some take less memory etc. For example for a searchable database of text strings we may be choosing between a binary tree and a hash table; a hash table offers theoretically much faster search, but binary trees may be more memory efficient and offer many other efficient operations like range search and sorting (which hash tables can do, but only very inefficiently).

Specific Data Structures

These are just some common ones:

See Also


deep_blue

Deep Blue

Deep Blue was a legendary chess playing IBM supercomputer, which in 1997 made history by being the first ever computer to beat the human world chess champion at the time (Garry Kasparov), marking what many consider the moment at which "computers finally outsmarted humans". Since then computers really did continue to surpass humans at chess by much greater margins; nowadays a mere cellphone running stockfish can easily rape the world chess champion.

History: it all started around 1985 as a program called ChipTest by some Taiwanese guy with unpronounceable name. It went on to win some computer chess tournaments and when multiple people were already working on it as a part of IBM research, it was renamed to Deep Thought after the computer in Hitchhiker's Guide to the Galaxy, however it later had to be renamed to Deep Blue because the name too much resembled deep throat :D By 1990 it had already played the world champion, Kasparov, but lost. In 1996 Deep Blue played him again, this time losing the match again but already having won a game, showing the potential was there. In May 1997, after an upgrade both in hardware and software, it finally beat Kasparov 3.5 to 2.5, with 2 wins, 1 loss and 3 draws.

{ Lol, according to Wikipedoa it trolled Kasparov in the first game by making a completely random move due to a bug once, it scared him because he thought it was some deeply calculated threat while it was just some completely dumb move. ~drummyfish }

It's important to see that Deep Blue wasn't really a general chess engine like stockfish, it was a single purpose supercomputer, a combination of hardware and software engineered from the ground up with a single purpose: to win the match against Garry Kasparov. It was being fine tuned in between the games with assistance of grandmasters. A team of experts on computers and chess focused their efforts on this single opponent at the specific time controls and match set up, rather than trying to make a generally usable chess computer. They studied Kasparov's play and made Deep Blue ready for it; they even employed psychological tricks -- for example it had preprogrammed instant responses to some of Kasparov's expected moves, so as to make him more nervous.

Technical details: Deep Blue was mainly relying on massively parallel brute force, i.e. looking many moves ahead and consulting stored databases of games; in 1997 it had some 11 GFLOPS. The base computer was IBM RS/6000 SP (taking two cabinets) with IBM AIX operating system, using 32 PowerPC 200 MHz processors and 480 specialized "chess chips". It had its evaluation function implemented in hardware. All in all the whole system could search hundreds of millions positions per second. Non-extended search was performed to a depth of about 12 plies, extended search went even over 40 plies deep. It had an opening book with about 4000 positions and endgame tablebases for up to 6 pieces. It was programmed in C. { Sources seem to sometimes give different numbers and specs, so I'm not exactly sure about this. ~drummyfish }

See Also


de_facto

De Facto

De facto is Latin for "in fact" or "by facts", it means that something holds in practice; it is contrasted with de jure ("by law"). We use the term to say whether something is actually true in reality as opposed to "just on paper".

For example in technology a so called de facto standard is something that, without it being officially formalized or forced by law beforehand, most developers naturally come to adopt so as to keep compatibility; for example the Markdown format has become the de facto standard for READMEs in FOSS development. Of course it happens often that de facto standards are later made into official standards. On the other hand there may be standards that are created by official standardizing authorities, such as the state, which however fail to gain wide adoption in practice -- these are official standards but not de facto ones. TODO: example? :)

Regarding politics and society, we often talk about de facto freedom vs de jure freedom. For example in the context of free (as in freedom) software it is stressed that software ought to bear a free license -- this is to ensure de jure freedom, i.e. legal rights to being able to use, study, modify and share such software. However in these talks the de facto freedom of software is often forgotten; the legal (de jure) freedom is worth nothing if it doesn't imply real and practical (de facto) freedom to exercise the rights given by the license; for example if a piece of "free" (having a free license) software is extremely bloated, our practical ability to study and modify it gets limited because doing so gets considerably expensive and therefore limits the number of people who can truly exercise those rights in practice. This issue of diminishing de facto freedom of free software is addressed e.g. by the suckless movement, and of course our LRS movement.

There is also a similar situation regarding free speech: if speech is free only de jure, i.e. we can "in theory" legally speak relatively freely, BUT in reality we CANNOT speak freely because e.g. of fear of being cancelled, our speech is de facto not free.


deferred_shading

Deferred Shading

In computer graphics programming deferred shading is a technique for speeding up the rendering of (mainly) shaded 3D graphics (i.e. graphics with textures, materials, normal maps etc.). It is nowadays used in many advanced 3D engines. In principle of course the idea may also be used in 2D graphics and outside graphics.

The principle is the following: in normal forward shading (non-deferred) the shading computation is applied immediately to each pixel (fragment) as it is rendered. However, as objects can overlap, many of these expensively computed pixels may be overwritten by pixels of other objects, so many pixels end up being expensively computed but invisible. This is of course wasted computation. Deferred shading only computes shading of the pixels that will end up actually being visible -- this is achieved by two rendering passes:

  1. At first geometry is rendered without shading, only with information that is needed for shading (for example normals, material IDs, texture IDs etc.). The rendered image is stored in so called G-buffer which is basically an image in which every pixel stores the above mentioned shading information.
  2. The second pass applies the shading effects by applying the pixel/fragment shader on each pixel of the G-buffer.

This is especially effective when we're using very expensive/complex pixel/fragment shaders AND we have many overlapping objects. Sometimes deferred shading may be replaced by simply ordering the rendered models, i.e. rendering front-to-back, which may achieve practically the same speed up. In simple cases deferred shading may not even be worth it -- in LRS programs we may use it only rarely.
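
For illustration, a heavily simplified sketch of the two pass structure in C follows (software rendering with made up geometry data and an ASCII "display"; a real implementation would of course rasterize actual 3D geometry and store more shading inputs):

#include <stdio.h>
#include <stdint.h>

#define W 40
#define H 20

typedef struct      // one G-buffer pixel: shading INPUTS, not colors
{
  int8_t normalZ;   // here just one component of the surface normal
  uint8_t materialID;
} GBufferPixel;

GBufferPixel gBuffer[W * H];
char screen[W * H]; // stand-in for the front buffer (ASCII "display")

void renderGeometry(void) // pass 1: rasterize geometry, no shading yet
{
  for (int i = 0; i < W * H; ++i)
  {
    gBuffer[i].normalZ = (i % W) * 6 - 120; // some made up geometry
    gBuffer[i].materialID = i < (W * H) / 2;
  }
}

char shadePixel(GBufferPixel p) // the expensive shading computation
{
  return p.materialID ?
    (p.normalZ > 0 ? '#' : '+') :
    (p.normalZ > 0 ? ':' : '.');
}

int main(void)
{
  renderGeometry();                // pass 1: fill the G-buffer

  for (int i = 0; i < W * H; ++i)  // pass 2: shade each visible pixel
    screen[i] = shadePixel(gBuffer[i]);

  for (int y = 0; y < H; ++y)      // show the result
  {
    fwrite(screen + y * W,1,W,stdout);
    putchar('\n');
  }

  return 0;
}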

Deferred shading also comes with complications, for example rasterization antialiasing can't be used because, of course, antialiasing in the G-buffer doesn't really make sense. This is usually solved by some screen-space antialiasing technique such as FXAA, but of course that may be a bit inferior. Transparency also poses an issue.


democracy

Democracy

Democracy stands for rule of the people, it is a form of government that somehow lets all citizens collectively make political decisions, which is usually implemented by voting but possibly also by other means. The opposite of democracy is autocracy (for example dictatorship), the absolute rule of a single individual. It can also be contrasted with oligarchy, the rule of a few (e.g. plutocracy, the rule of the rich, which we see under capitalism). Democracy may take different forms, e.g. direct (people directly vote on specific questions) or representative (people vote for officials who then make decisions on their behalf).

Democracy does NOT equal voting, even though this simplification is too often made. Voting doesn't imply democracy and democracy doesn't require voting, an alternative to voting may be for example a scientifically made decision. Democracy in the wide sense doesn't even require a state or legislation -- true democracy simply means that rules and actions of a society are controlled by all the people and in a way that benefits all the people. Even though we are led to believe we live in democratic society, the truth is that a large scale largely working democracy has never been established and that nowadays most of so called democracy is just an illusion as society clearly works for the benefit of the few richest and most powerful people while greatly abusing everyone else, especially the poorest majority of people. We do NOT live in true democracy. A true democracy would be achieved by ideal models of society such as those advocated by (true) anarchism or LRS, however some anarchists may avoid using the term democracy as in many narrower contexts it implies an existence of government.

Nowadays the politics of most first world countries is based on elections and voting by people, but despite this being called democracy by the propaganda the reality is de facto not a democracy but rather an oligarchy, the rule THROUGH the people, creating an illusion of democracy which however lacks a real choice (e.g. the US two party system in which people can either vote for capitalists or capitalists) or pushes the voters towards a certain choice by huge propaganda, misinformation and manipulation.

Small brain simplification of democracy to mere "voting" may be highly ineffective and even dangerous. Democracy was actually considered to be very weak or even downright bad by many Greek philosophers such as Plato and Aristotle. We have to realize that sometimes voting is awesome, but sometimes it's an extremely awful idea. Why? Consider the two following scenarios:

What happens when it is democratically decided that democracy is not a good tool for decision making? If we believe in democracy, then we have to accept its decision and stop believing in democracy, but then if we stop believing in democracy we can just reject the decision because it was made by something that's not to be trusted, but then...


demo

Demo

Demo may stand for:


demoscene

Demoscene

Demoscene is a hacker art subculture revolving around making so called demos, programs that produce rich and interesting audiovisual effects and which are sometimes limited by strict size constraints (so called intros). The scene originated in northern Europe sometime in the 1980s (even though things like screen hacks existed long before) among groups of crackers who were adding small signature effect screens into their cracked software (like "digital graffiti"); programming of these cool effects later became an art of its own and got its own competitions (sometimes with high financial prizes), so called compos, held at dedicated real life events called demoparties (which themselves evolved from copyparties, real life events focused on piracy). The community is still centered mostly in Europe (primarily Finland; in some countries demoscene was even officially added to the cultural heritage), it is underground, out of the mainstream; Wikipedia says that by 2010 its size was estimated at 10000 people (such people are called demosceners).

Demoscene is a bittersweet topic: on one side it's awesome, full of beautiful hacking, great ideas and minimalism, on the other side there are secretive people who don't share their source code (most demos are proprietary) and ugly unportable programs that exploit quirks of specific platforms -- common ones are DOS, Commodore 64, Amiga or Windows. These guys simply try to make the coolest visuals and smallest programs, with all good and bad that comes with it. Try to take only the good of it.

Besides "digital graffiti" the scene is also perhaps a bit similar to the culture of street rap, except that there's less improvisation (obviously, making a program takes long) and competition happens between groups rather than individuals. Nevertheless the focus is on competition, originality, style etc. But demos should show off technological skills as the highest priority -- trying to "win by content" rather than programming skills is sometimes frowned upon. Individuals within a demogroup have roles such as a programmer, visual artist, music artist, director, even PR etc.

A demo isn't a video, it is a non-interactive real time executable that produces the same output on every run (even though categories outside of this may also appear). Viznut has noted that this "static nature" of demos may be due to the established culture in which demos are made for a single show to the audience. Demos themselves aren't really limited by resource constraints (well, sometimes a limit such as 4 MB is imposed), it's where the programmers can show off all they have. However compos are often organized for intros, demos whose executable size is limited (i.e. NOT the size of the source code, like in code golfing, but the size of the compiled binary). The main categories are 4 KiB intros and 64 KiB intros, rarely also 256 KiB intros (all sizes are in kibibytes). Apparently even such categories as 256 byte intros appear. Sometimes also a platform may be specified (e.g. Commodore 64, PC etc.). The winner of a compo is decided by voting.

Some of the biggest demoparties are or were Assembly (Finland), The Party (Denmark), The Gathering (Norway), Kindergarden (Norway) and Revision (Germany). A guy on https://mlab.taik.fi/~eye/demos/ says that he has never seen a female demo programmer and that females often have free entry to demoparties while men have to pay because there are almost no women anyway xD Some famous demogroups include Farbrausch (Germany, also created a tiny 3D shooter game .kkrieger), Future Crew (Finland), Pulse (international), Haujobb (international), Conspiracy (Hungary) and Razor 1911 (Norway). { Personally I liked best the name of a group that called themselves Byterapers. ~drummyfish } There is an online community of demosceners at https://www.pouet.net.

On the technological side of demos: a great amount of hacking, exploitation of bugs and errors and usage of techniques going against "good programming practices" goes into the making of demos. Demosceners make use of and invent many kinds of effects, such as the plasma (cycling color palette on a 2D noise pattern), copper bars, moire patterns, waving, lens distortion etc. Demos are usually written in C, C++ or assembly (though some retards even make demos in Java lmao). In intros it is extremely important to save space wherever possible, so things such as procedural generation and compression are heavily used. Manual assembly optimization for size can take place. Tracker music, chiptune, fractals and ASCII art are very popular. New techniques are still being discovered, e.g. bytebeat. GLSL shader source code that's to be embedded in the executable has to be minified or compressed. Compiler flags are chosen so as to minimize size, e.g. small size optimization (-Os), turning off buffer security checks or turning on fast float operations. The final executable is additionally compressed with specialized executable compression.
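
For example a complete bytebeat song can be a program as tiny as the following (the formula here is one of the known ones from the bytebeat experiments; such formulas are found by playful trial and error):

#include <stdio.h>

int main(void)
{
  for (unsigned int t = 0; t < 8000 * 60; ++t) // one minute of sound
    putchar((t * (t >> 5 | t >> 8)) >> (t >> 16));

  return 0;
}

Compile it and run ./a.out | aplay to hear the output (aplay's default format, 8000 Hz unsigned 8 bit samples, is what bytebeat traditionally assumes).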

See Also


dependency

Dependency

Dependency of a piece of technology is another piece of technology that's required for the former to work (typically e.g. a software library that's required by given computer program). Dependencies are bad! Among programmers the term dependency hell refers to a very common situation of having to deal with the headaches of managing dependencies. Unfortunately dependencies are also unavoidable. We at least try to minimize dependencies as much as possible while keeping our program functioning as intended, and those we can't avoid we try to abstract (see portability) in order to be able to quickly drop-in replace them with alternatives.

Having many dependencies is a sign of bloat and bad design. Unfortunately this is the reality of mainstream programming. For example at the time of writing this Chromium in Debian requires (recursively) 395 packages LMAO xD And these are just runtime dependencies...

In software development context we usually talk about software dependencies, typically libraries and other software packages. However, there are many other types of dependencies we need to consider when striving for the best programs. Let us list just some of the possible types:

Good program will take into account all kinds of these dependencies and try to minimize them to offer freedom, stability and safety while keeping its functionality or reducing it only very little.

Why are dependencies so bad? Because your program is for example:

How to Avoid Them

TODO


determinism

Determinism

"God doesn't play dice." --some German dude

Deterministic system (such as a computer program or an equation) is one which over time evolves without any involvement of randomness; i.e. its current state along with the rules according to which it behaves unambiguously and precisely determine its following states. This means that a deterministic algorithm will always give the same result if run multiple times with the same input values. Determinism is an extremely important concept in computer science and programming (and in many other fields of science and philosophy).

Determinism is also a philosophical theory and aspect of physics theories -- here it signifies that our Universe is deterministic, i.e. that everything is already predetermined by the state of the universe and the laws of physics, i.e. that we don't have "free will" (whatever it means) because our brains are just machines following laws of physics like any other matter etc. Many normies believe quantum physics disproves determinism which is however not the case, there may e.g. exist hidden variables that still make quantum physics deterministic -- some believe the Bell test disproved hidden variables but again this is NOT the case as it relies on statistical independence of the experimenters, determinism is already possible if we consider the choices of experimenters are also predetermined (this is called superdeterminism). Einstein and many others still believed determinism was the way the Universe works even after quantum physics emerged. { This also seems correct to me. Sabine Hossenfelder is another popular physicist promoting determinism. ~drummyfish } Anyway, this is already beyond the scope of technological determinism.

Computers are mostly deterministic by nature and design, they operate by strict rules and engineers normally try to eliminate any random behavior as that is mostly undesirable (with certain exceptions mentioned below) -- randomness leads to hard to detect and hard to fix bugs, unpredictability etc. Determinism has furthermore many advantages, for example if we want to record a behavior of a deterministic system, it is enough if we record only the inputs to the system without the need to record its state which saves a great amount of space -- if we later want to replay the system's behavior we simply rerun the system with the recorded inputs and its behavior will be the same as before (this is exploited e.g. in recording gameplay demos in video games such as Doom).

Determinism can however also pose a problem, notably e.g. in cryptography where we DO want true randomness e.g. when generating seeds. Determinism in this case implies that an attacker who knows the conditions under which we generated the seed can exactly replicate the process and arrive at the seed value that's supposed to be random and secret. For this reason some CPUs come with special hardware for generating truly random numbers.

Despite the natural determinism of computers as such, computer programs nowadays aren't always automatically deterministic -- if you're writing a typical interactive computer program under some operating system, you have to make some extra bit of effort to make it deterministic. This is because there are things such as possible differences in timings or not perfectly specified behavior of floating point types in your language; for example a game running on a slower computer will render fewer frames per second and if it has FPS-dependent physics, the time step of the physics engine will be longer on this computer, possibly resulting in slightly different physics behavior due to rounding errors. This means that such a program run with the same input data will produce different results on different computers or under slightly different circumstances, i.e. it will be non-deterministic.
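
A typical fix is a game loop with a fixed physics time step, sketched below; getTimeMs, processInput, updatePhysics and render are hypothetical stand-ins for whatever the platform and game actually provide:

#include <stdint.h>

uint32_t getTimeMs(void);  // hypothetical: milliseconds since start
void processInput(void);   // hypothetical platform/game functions
void updatePhysics(void);  // advances simulation by exactly one tick
void render(void);

#define PHYSICS_STEP_MS 28 // fixed step, ~35 ticks per second

void gameLoop(void)
{
  uint32_t nextTick = getTimeMs();

  while (1)
  {
    processInput();

    while (getTimeMs() >= nextTick) // catch up on whole ticks
    {
      updatePhysics();   // NEVER depends on measured frame time
      nextTick += PHYSICS_STEP_MS;
    }

    render(); // rendering runs as fast (or slow) as the machine can
  }
}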

Nevertheless we almost always want our programs to be deterministic (or at least deterministic under some conditions, e.g. on the specific hardware platform we are using), so always try to make your programs deterministic unless you have a VERY good reason not to! It doesn't take a huge effort to achieve determinism, it's mostly about making the right design decisions (e.g. separating rendering and physics simulation), i.e. good programming leads to determinism and vice versa, determinism in your program indicates good programming. The reason why we want determinism is that such programs have great properties, e.g. easier debugging (bugs are reproducible just by knowing the exact inputs), easy and efficient recording of activity (e.g. demos in games), sometimes even time reversibility (like undos, but watch out -- this doesn't hold in general!). Determinism also itself serves as a kind of test that the program is working right -- if your program can take recorded inputs and reproduce the same behavior at every run, it shows that it's written well, without things like undefined behavior affecting it.

{ The previous paragraph is here because I've talked to people who thought that determinism was some UBER feature that requires a lot of work and so on ("OMG Trackmania is deterministic, what a feat!") -- this is NOT the case. It may intuitively seem so to non-programmers or beginners, but really this is not the case. Non-determinism in software appears usually due to a fuck up, ignorance or bad design choice made by someone with a low competence. Trust me, determinism is practically always the correct way of making programs and it is NOT hard to do. ~drummyfish }

Even if we're creating a program that somehow works with probability, we usually want to make it deterministic! This means we don't use actual random numbers but rather pseudorandom number generators that output chaotic values which simulate randomness, but which will nevertheless be exactly the same when run multiple times with the same initial seed. This is again important e.g. for debugging the system in which replicating the bug is key to fixing it. If under normal circumstances you want the program to really behave differently in each run, you make it so only by altering its initial random seed.
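
For example a tiny pseudorandom generator can be implemented like this (a sketch of a linear congruential generator; the constants are commonly used ones):

#include <stdio.h>
#include <stdint.h>

uint32_t seed = 123; // the same seed always gives the same sequence

uint32_t pseudoRandom(void)
{
  seed = seed * 1664525u + 1013904223u; // linear congruential generator
  return seed;
}

int main(void)
{
  for (int i = 0; i < 5; ++i)
    printf("%u\n",pseudoRandom()); // prints the same numbers every run

  return 0;
}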

In theoretical computer science non-determinism means that a model of computation, such as a Turing machine, may at certain points decide to make one of several possible actions which is somehow most convenient, e.g. which will lead to finding a solution in shortest time. Or in other words it means that the model makes many computations, each along a different path, and at the end we conveniently pick the "best" one, e.g. the fastest one. Then we may talk e.g. about how the computational strength or speed of computation differ between a deterministic and non-deterministic Turing machine etc.

Determinism does NOT guarantee reversibility, i.e. if we know a state of a deterministic system, it may not always be possible to say from which state it evolved, or in other words: a system that's deterministic may or may not be deterministic in reverse time direction. This reversibility is only possible if the rules of the system are such that no state can evolve from two or more different states. If this holds then it is always possible to time-reverse the system and step it backwards to its initial state. This may be useful for things such as undos in programs. Also note that even if a system is reversible, it may be computationally very time consuming and sometimes practically impossible to reverse the system (imagine e.g. reversing a cryptographic hash -- mathematical reversibility of such hash may be arbitrarily ensured by e.g. pairing each hash with the lowest value that produces it).

Is floating point deterministic? In theory even floating point arithmetic can of course be completely deterministic but there is the question of whether this holds about concrete specifications and implementations of floating point (e.g. in different programming languages) -- here in theory non-determinism may arise e.g. by some unspecified behavior such as rounding rules. In practice you can't rely on float being deterministic. The common float standard, IEEE 754, is basically deterministic, including rounding etc. (except for possible payload of NaNs, which shouldn't matter in most cases), but this e.g. doesn't hold for floating point types in C!


devuan

Devuan

Devuan is a GNU/Linux distribution that's practically identical to Debian (it is its fork) but without systemd as well as without packages that depend on the systemd malware. Devuan offers a choice of several init systems, e.g. openrc, sysvinit and runit. It was first released in 2017.

Notice how Devuan rhymes less with lesbian than Debian.

Despite some flaws (such as being Linux with all the bloat and proprietary blobs), Devuan is still one of the best operating systems for most people and it is at this time recommended by us over most other distros not just for avoiding systemd, but mainly for its adoption of Debian free software definition that requires software to be free as a whole, including its data (i.e. respecting also free culture). It is also a nicely working unix system that's easy to install and which is still relatively unbloated.

{ I can recommend Devuan, I've been using it as my main OS for several years. ~drummyfish }


dick_reveal

Dick Reveal

Dick/pussy reveal is the act of publicly revealing one's nudity on the Internet, in normal situations done voluntarily.

See Also


digital

Digital

Digital technology is that which works with whole numbers, i.e. discrete values, as opposed to analog technology which works with real numbers, i.e. continuous values (note: do not confuse things such as floating point with truly continuous values!). The name digital is related to the word digit as digital computers store data by digits, e.g. in 1s and 0s if they work in binary.

Normies confuse digital with electronic or think that digital computers can only be electronic, that digital computers can only work in binary or have other weird assumptions whatsoever. This is indeed false! An abacus is a digital device, a book with text is a digital data storage. Fucking normies RIP.

The advantage of digital technology is its resilience to noise which prevents degradation of data and accumulation of error -- if a digital picture is copied a billion times, it will very likely remain unchanged, whereas performing the same operation with analog picture would probably erase most of the information it bears due to loss of quality in each copy. Digital technology also makes it easy and practically possible to create fully programmable general purpose computers of great complexity.

Digital vs analog, simple example: imagine you draw two pictures with a pencil: one in a normal fashion on a normal paper, the other one on a grid paper, by filling specific squares black. The first picture is analog, i.e. it records continuous curves and position of each point of these curves can be measured down to extremely small fractions of millimeters -- the advantage is that you are not limited by any grid and can draw any shape at any position on the paper, make any wild curves with very fine details, theoretically even microscopic ones. The other picture (on a square grid) is digital, it is composed of separate points whose position is described only by whole numbers (x and y coordinates of the filled grid squares), the disadvantage is that you are limited by only being able to fill squares on predefined positions so your picture will look blocky and limited in amount of detail it can capture (anything smaller than a single grid square can't be captured properly), the resolution of the grid is limited, but as we'll see, imposing these limitations has advantages. Consider e.g. the advantage of the grid paper image with regards to copying: if someone wants to copy your grid paper image, it will be relatively easy and he can copy it exactly, simply by filling the exact same squares you have filled -- small errors and noise such as imperfectly filled squares can be detected and corrected thanks to the fact that we have limited ourselves with the grid, we know that even if some square is not filled perfectly, it was probably meant to be filled and we can eliminate this kind of noise in the copy. This way we can copy the grid paper image a million times and it won't change. On the other hand the normal, non-grid image will become distorted with every copy and in fact even the original image will become distorted by aging; even if that who is copying the image tries to trace it extremely precisely, small errors will appear and these errors will accumulate in further copies, and any noise that appears in the image or in the copies is a problem because we don't know if it really is a noise or something that was meant to be in the image.

Of course, digital data may become distorted too, it is just less likely and it's easier to deal with this. It for example happens that space particles (and similar physics phenomena, e.g. electronic interference) flip bits in computer memory, i.e. there is always a probability of some bit flipping from 0 to 1 or vice versa. We call this data corruption. This may also happen due to physical damage to digital media (e.g. scratches on the surface of CDs), imperfections in computer network transmissions (e.g. packet loss over wifi) etc. However we can introduce further measures to prevent, detect and correct data corruption, e.g. by keeping redundant copies (2 copies of data allow detecting corruption, 3 copies allow even its correction), keeping checksums or hashes (which allow only detection of corruption but don't take much extra space), employing error correcting codes etc.
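
For example with 3 redundant copies each corrupted bit can be corrected by taking, for every bit, the value that appears in at least two of the copies; a small sketch:

#include <stdio.h>
#include <stdint.h>

uint8_t majority(uint8_t a, uint8_t b, uint8_t c)
{
  return (a & b) | (a & c) | (b & c); // bitwise majority of 3 copies
}

int main(void)
{
  uint8_t copy1 = 0xa5,
          copy2 = 0xa5 ^ 0x10, // this copy got a bit flipped (corrupted)
          copy3 = 0xa5;

  printf("corrected: %x\n",majority(copy1,copy2,copy3)); // prints a5

  return 0;
}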

Another way in which digital data can degrade similarly to analog data is reencoding between lossy-compressed formats (in the spirit of the famous "needs more jpeg" meme). A typical example is digital movies: as new standards for video encoding emerge, old movies are being reconverted from old formats to the new ones, however as video is quite heavily lossy-compressed, losses and distortion of information happen with each reencoding. This is best seen in videos and images circulating on the internet that are constantly being ripped and converted between different formats. This way it may happen that digital movies recorded nowadays may only survive into the future in very low quality, just like old analog movies survived until today in degraded quality. This can be prevented by storing the original data only with lossless compression and creating the release for each new emerging format from the original.


digital_signature

Digital Signature

Digital signature is a method of mathematically (with cryptographic algorithms) proving that, with a very high probability, a digital message or document has been produced by a specific sender, i.e. it is something akin to a traditional signature which gives a "proof" that something has been written by a specific individual.

It works on the basis of asymmetric cryptography: the signature of a message is a pair of a public key and a number (the signature) which can only have been produced by the owner of the private key associated with the public key. This signature is dependent on the message data itself, i.e. if the message is modified, the signature will no longer be valid, preventing anyone who doesn't possess the private key from modifying the message. The signature number can for example be a hash of the message decoded with the private key -- anyone can check that the signature encoded with the public key gives the document hash, proving that whoever computed the signature number must have possessed the private key.

Signatures can be computed e.g. with the RSA algorithm.
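
The following toy C program demonstrates the math on the classic textbook RSA numbers (p = 61, q = 53, n = 3233, public exponent e = 17, private exponent d = 2753) -- real signatures of course use huge numbers, secure hashes and padding, this only shows the principle:

#include <stdio.h>
#include <stdint.h>

uint32_t modPow(uint32_t base, uint32_t exp, uint32_t mod)
{
  uint64_t result = 1, b = base % mod;

  while (exp)
  {
    if (exp & 1)
      result = (result * b) % mod;

    b = (b * b) % mod;
    exp >>= 1;
  }

  return (uint32_t) result;
}

int main(void)
{
  uint32_t hash = 1234; // pretend this is the hash of our message

  uint32_t signature = modPow(hash,2753,3233); // sign with private key

  // verification by anyone who has the public key:
  printf(modPow(signature,17,3233) == hash ?
    "signature valid\n" : "signature INVALID\n");

  return 0;
}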

The nice thing here is that anonymity can be kept with digital signatures; no private information such as the signer's real name is required to be revealed, only his public key. Someone may ask why we then even sign documents if we don't know by whom it is signed lol? But of course the answer is obvious: many times we don't need to know the identity of the signer, we just need to know that different messages have all been written by the same person, and this is what a digital signature can ensure. And of course, if we want, a public key can have a real identity assigned if desirable, it's just that it's not required.


dinosaur

Dinosaur

In the hacker jargon dinosaur is a type of big, very old, mostly non-interactive (batch), possibly partly mechanical computer, usually an IBM mainframe from the 1940s and 1950s (the so called Stone Age). They resided in dinosaur pens (mainframe rooms).

{ This is how I understood it from the Jargon File. ~drummyfish }


disease

Disease

TODO

The following is a list of common diseases.


distance

Distance

Distance is a measure of how far away from each other two points are. Most commonly distance refers to physical separation in space, e.g. as in distance of planets from the Sun, but more generally distance can refer to any kind of parameter space and in any number of dimensions, e.g. the distance of events in time measured in seconds (1D distance) or distance of two text strings as the amount of their dissimilarity (Levenshtein distance). Distances are extremely important in computer science and math as they allow us to do such things as clustering, path searching, physics simulations, various comparisons, sorting etc.

Distance is similar/related to length, the difference is that distance is computed between two points while length is the distance of one point from some implicit origin.

There are many ways to define distance within a given space. The most common and implicitly assumed distance is the Euclidean distance (basically the "straight line from point A to point B" whose length is computed with the Pythagorean theorem), but other distances are possible, e.g. the taxicab distance (length of the kind of perpendicular path taxis take between points A and B in Manhattan, usually longer than the straight line). Mathematically, spaces in which distances can be measured are called metric spaces, and a distance within such a space can be any function dist (called a distance or metric function) that satisfies these axioms:

  1. dist(p,p) = 0 (distance from a point to itself is zero)
  2. Values given by dist are never negative.
  3. dist(p,q) = dist(q,p) (symmetry, distance between two points is the same in both directions).
  4. dist(a,c) <= dist(a,b) + dist(b,c) (triangle inequality)

Approximations

Computing Euclidean distance requires multiplication and most importantly square root which is usually a pretty slow operation, therefore many times we look for simpler approximations. Note that a possible approach here may also lead through computing the distance normally but using a fast approximation of the square root.
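
For reference, the exact Euclidean distance that the functions below approximate looks like this:

#include <math.h>

int distEuclid(int x0, int y0, int x1, int y1)
{
  int dx = x1 - x0, dy = y1 - y0;

  return sqrt(dx * dx + dy * dy); // the square root is the slow part
}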

Two very basic and rough approximations of Euclidean distance, both in 2D and 3D, are taxicab (also Manhattan) and Chebyshev distances. Taxicab distance simply adds the absolute coordinate differences along each principal axis (dx, dy and dz) while Chebyshev takes the maximum of them. In C (for generalization to 3D just add one coordinate of course):

int distTaxi(int x0, int y0, int x1, int y1)
{
  x0 = x1 > x0 ? x1 - x0 : x0 - x1; // dx
  y0 = y1 > y0 ? y1 - y0 : y0 - y1; // dy
  
  return x0 + y0;
}

int distCheb(int x0, int y0, int x1, int y1)
{
  x0 = x1 > x0 ? x1 - x0 : x0 - x1; // dx
  y0 = y1 > y0 ? y1 - y0 : y0 - y1; // dy
  
  return x0 > y0 ? x0 : y0;
}

Both of these distances approximate a circle in 2D with a square or a sphere in 3D with a cube, the difference is that taxicab is an upper estimate of the distance while Chebyshev is the lower estimate. For speed of execution (optimization) it may also be important that taxicab distance only uses the operation of addition while Chebyshev may result in branching (if) in the max function which is usually not good for performance.

A bit more accuracy can be achieved by averaging the taxicab and Chebyshev distances, which in 2D approximates a circle with an 8 segment polygon and in 3D approximates a sphere with a 24 sided polyhedron. The integer-only C code is the following:

int dist8(int x0, int y0, int x1, int y1)
{
  x0 = x1 > x0 ? x1 - x0 : x0 - x1; // dx
  y0 = y1 > y0 ? y1 - y0 : y0 - y1; // dy
    
  return (x0 + y0 + (x0 > y0 ? x0 : y0)) / 2;
}

{ The following is an approximation I came up with when working on tinyphysicsengine. While I measured the average and maximum error of the taxi/Chebyshev average in 3D at about 16% and 22% respectively, the following gave me 3% and 12% values. ~drummyfish }

Yet a more accurate approximation of 3D Euclidean distance can be made with a 48 sided polyhedron. The principle is the following: take absolute values of all three coordinate differences and order them by magnitude so that dx >= dy >= dz >= 0. This gets us into one of 48 possible slices of space (the other slices have the same shape, they just differ by ordering or signs of the coordinates but the distance in them is of course equal). In this slice we'll approximate the distance linearly, i.e. with a plane. We do this by simply computing the distance of our point from a plane that goes through origin and whose normal is approximately {0.8728,0.4364,0.2182} (it points in the direction that goes through the middle of the space slice). The expression for the distance from this plane simplifies to simply 0.8728 * dx + 0.4364 * dy + 0.2182 * dz. The following is an integer-only implementation in C (note that the constants above have been converted to allow division by 1024 for possible optimization of the division to a bit shift):

int32_t dist48(
  int32_t x0, int32_t y0, int32_t z0,
  int32_t x1, int32_t y1, int32_t z1)
{
  x0 = x1 > x0 ? x1 - x0 : x0 - x1; // dx
  y0 = y1 > y0 ? y1 - y0 : y0 - y1; // dy
  z0 = z1 > z0 ? z1 - z0 : z0 - z1; // dz
 
  if (x0 < y0) // order the coordinates
  {
    if (x0 < z0)
    {
      if (y0 < z0)
      { // x0 < y0 < z0
        int32_t t = x0; x0 = z0; z0 = t;
      }
      else
      { // x0 < z0 < y0
        int32_t t = x0; x0 = y0; y0 = t;
        t = z0; z0 = y0; y0 = t;
      }
    }
    else
    { // z0 < x0 < y0
      int32_t t = x0; x0 = y0; y0 = t;
    }
  }
  else
  {
    if (y0 < z0)
    {
      if (x0 < z0)
      { // y0 < x0 < z0
        int32_t t = y0; y0 = z0; z0 = t;
        t = x0; x0 = y0; y0 = t;  
      }
      else
      { // y0 < z0 < x0
        int32_t t = y0; y0 = z0; z0 = t;
      }
    }
  }
    
  return (893 * x0 + 446 * y0 + 223 * z0) / 1024;
}

A similar approximation for 2D distance (given in the 1984 book Problem corner) is this: sqrt(dx^2 + dy^2) ~= 0.96 * dx + 0.4 * dy for dx >= dy >= 0. The error is <= 4%. This can optionally be modified to use the closest power of 2 constants so that the function becomes much faster to compute, but the maximum error increases (seems to be about 11%). C code with fixed point follows (the commented out line is the faster, less accurate version):

int dist2DApprox(int x0, int y0, int x1, int y1)
{
  x0 = x0 > x1 ? (x0 - x1) : (x1 - x0);
  y0 = y0 > y1 ? (y0 - y1) : (y1 - y0);
  
  if (x0 < y0)
  {
    x1 = x0; // swap
    x0 = y0;
    y0 = x1;
  }
  
  return (123 * x0 + 51 * y0) / 128; // max error = ~4%
  //return x0 + y0 / 2;              // faster, less accurate  
}

TODO: this https://www.flipcode.com/archives/Fast_Approximate_Distance_Functions.shtml


distrohopping

Distrohopping

Distrohopping is a serious mental disease that makes people waste their lives on constantly switching GNU/linux distributions ("distros"). This affects mostly those of the lowest skill in tech who feel the need to LARP as wannabe tech nerds; as it's been said, an amateur is obsessed with tools, an expert is obsessed with mastery (Richard Stallman has for example famously never installed GNU/Linux himself as he has better things to do) -- a true programmer will just settle with a comfy Unix environment that can run vim and dedicate his time to creating a timeless source code while the hopper, like a mere animal, is just busy masturbating to a new bastard child of Ubuntu and Arch Linux that adds a new wallpaper and support for vertical mice -- such an activity is basically as retarded as mainstream tech consumerism with the only difference being that a hopper isn't limited by finance so he can just distrohop 24/7 and hop himself to death.

TODO: cure? take the bsd pill? :-)


dodleston

Dodleston Mystery

The Dodleston mystery concerns a teacher, Ken Webster, who in 1984 supposedly started exchanging messages with people from the past and future, most notably people from the 16th and 22nd century, via files on a BBC Micro computer. While probably a hoax and creepypasta, there are some interesting unexplained details... and it's a fun story.

The guy has written a proprietary book about it, called The Vertical Plane.

{ If the story is made up and maybe even if it isn't it may be a copyright violation to reproduce the story with all the details here so I don't know if I should, but reporting on a few facts probably can't hurt. Yes, this is how bad the copyrestriction laws have gotten. ~drummyfish }


dog

Dog

Here is the dog! He doesn't judge you; dog love is unconditional. No matter who you are or what you ever did, this buddy will always love you and be your best friend <3 By this he is giving us a great lesson.

He loves when you pet him and take him for walks, but most of all he probably enjoys to play catch :) Throw him a ball!

Send this to anyone who's feeling down :)

         __
  _     /  \
 ((    / 0 0)
  \\___\/ _o)
  (        |  WOOOOOOOF
  | /___| |(
  |_)_) |_)_)

See Also


doom

Doom

Doom is a legendary video game released in 1993, perhaps the most famous video game of all time, the game that popularized the first person shooter genre and shocked by its at the time extremely advanced 3Dish graphics. It was made by Id Software, most notably by John Carmack (graphics + engine programmer) and John Romero (tool programmer + level designer). Doom is sadly proprietary, it was originally distributed as shareware (a free "demo" was available for playing and sharing with the option to buy a full version). However the game engine was later (1999) released as free (as in freedom) software under GPL which gave rise to many source ports. The assets remain non-free but a completely free alternative is offered by the Freedoom project that has created free as in freedom asset replacements for the game. Anarch is an official LRS game inspired by Doom, completely in the public domain.

{ Great books about Doom I can recommend: Masters of Doom (about the development) and Game Engine Black Book: Doom (details about the engine internals). ~drummyfish }

Partially thanks to the release of the engine under a FOSS license and its relatively suckless design (C language, software rendering, ...), Doom has been ported, both officially and unofficially, to a great number of platforms (e.g. Gameboy Advance, PS1, even SNES) and has become a kind of de facto standard benchmark for computer platforms -- you will often hear the phrase: "but does it run Doom?" Porting a Doom to any platform has become kind of a meme, someone allegedly even ported it to a pregnancy test (though it didn't actually run on the test, it was really just a display). { Still Anarch may be even more portable than Doom :) ~drummyfish }

The major leap that Doom engine's graphics brought was unprecedented, however Doom was not just a game with good graphics, it had extremely good gameplay, legendary music and art style and introduced the revolutionary deathmatch multiplayer, as well as a HUGE modding and mapping community. It was a success in every way -- arguably no other game has since achieved a greater revolution than Doom (no, not even Minecraft, World of Warcraft etc.). Many reviews of it just went along the lines: "OK, Doom is the best game ever made. now let's just take a look at the details...".

The game's backstory was simple and didn't stand in the way of gameplay, it's basically about a tough marine (so called Doomguy) on a Mars military base slaying hordes of demons from hell, all in a rock/metal style with a lot of gore and over-the-top violence (chain saws n stuff).

LOL someone created a Doom system monitor for Unix systems called psDooM where the monsters in game are the operating system processes and killing the monsters kills the processes.

Doom was followed by Doom II in 1994, which "content-wise" was basically just a data disc, the same game with new levels and some minor additions. Later there were some other releases and rereleases, notable is Doom III from 2004, Doom 2016 ("reboot") and Doom: Eternal (2020).

Doom Engine/Code

Doom source code is written in C89 and is about 36000 lines of code long. The original system requirements stated roughly a 30 MHz CPU and 4 MB RAM as a minimum. It had 27 levels (9 of which were shareware), 8 weapons and 10 enemy types. The engine wasn't really flexible in the way "modern" programmers expect, many things were hard coded, there was no scripting or whatever (see? you don't fucking need it), new games using the engine usually had to modify the engine internals.

The game only used fixed point, no float!
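
The fixed point format was 16.16, i.e. a 32 bit integer whose bottom 16 bits represent the fractional part; the following sketch shows the principle (the names resemble those in the Doom source, the code itself is simplified):

#include <stdio.h>
#include <stdint.h>

typedef int32_t fixed_t;         // 16.16 fixed point number

#define FRACBITS 16
#define FRACUNIT (1 << FRACBITS) // represents the number 1.0

fixed_t fixedMul(fixed_t a, fixed_t b)
{
  return ((int64_t) a * b) >> FRACBITS; // multiply, shift back down
}

int main(void)
{
  fixed_t a = 3 * FRACUNIT + FRACUNIT / 2, // 3.5
          b = 2 * FRACUNIT;                // 2.0

  printf("%f\n",fixedMul(a,b) / (double) FRACUNIT); // prints 7.000000

  return 0;
}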

The Doom engine was revolutionary and advanced (not only but especially) video game graphics by a great leap, considering its predecessor Wolf3D was really primitive in comparison (Doom basically set the direction for future trends in games such as driving the development of more and more powerful GPUs in a race for more and more impressive visuals). Doom used a technique called BSP rendering (levels were made of convex 2D sectors that were then placed in a BSP tree which helped quickly sort the walls for rendering front-to-back) that was able to render realtime 3D views of textured (all walls, floors and ceilings) environments with primitive lighting (per-sector plus diminishing lighting), enemies and items represented by 2D billboards ("sprites"). No GPU acceleration was used, graphics was rendered purely with CPU (so called software rendering, GPU rendering would come with Doom's successor Quake, and would also later be brought to Doom by newer community made engines, though the original always looks the best). This had its limitations, for example the camera could not tilt up and down and the levels could not have rooms above other rooms. The geometry of levels was only static, i.e. it could not change during play (only height of walls could), because rendering was dependent on precomputed BSP trees (which is what made it so fast). For these reasons some call Doom "pseudo 3D" or 2.5D rather than "true 3D". Nevertheless, though with limitations, Doom did present 3D views and internally it did work with 3D coordinates (for example the player or projectiles have 2D position plus height coordinate), despite some dumb YouTube videos saying otherwise. For this reason we prefer to call Doom a primitive 3D engine, but 3D nonetheless. Other games later used the Doom engine, such as Heretic, Hexen and Strife. The Doom engine was similar to and competing with Build engine that ran games like Duke Nukem 3D, Blood and Shadow Warrior. All of these 90s shooters were amazing in their visuals and looked far better than any modern shit. Build engine games had similar limitations to those of the Doom engine but would improve on them (e.g. faking looking up and down by camera tilting, which could in theory be done in Doom too, or allowing sloped floor and dynamic level geometry).

Indexed (palette) mode with "only" 256 colors was used for rendering. Precomputed color tables were used to make dimming of colors faster.

Doom also has deterministic, FPS-independent physics which allows for efficient recording of demos of its gameplay and creating tool assisted speedruns, i.e. the time step of game simulation is fixed (35 tics per second). Such demos can be played back in high quality while being minuscule in size and help us in many other ways, for example for verifying validity of speedruns. This is very nice and serves as an example of a well written engine (unlike later engines from the same creators, e.g. those of Quake games which lacked this feature -- here we can see how things get progressively shittier in computer technology as we go forward in time).

There is no antialiasing in the engine, i.e. aliasing can be noticed on far-away textures, but it is suppressed by the use of low-res textures and dimming of far-away areas. There is also no edge smoothing (kind of misleadingly known as "antialiasing") in the geometry rendering; however, the engine is subpixel accurate in rendering the tops and bottoms of walls, i.e. the line these boundaries form may result in rasterizing slightly different pixels even if the start and end pixel is the same, depending on the subpixel position of the start and endpoint -- this feature doesn't help much in static screenshots but makes animation nicer.


double_buffering

Double Buffering

In computer graphics double buffering is a technique of rendering in which we do not draw directly to video RAM, but instead to a second "back buffer", and only copy the rendered frame from the back buffer to the video RAM ("front buffer") once the rendering has been completed; this prevents flickering and displaying of incompletely rendered frames on the display. Double buffering requires a significant amount of extra memory for the back buffer, however it is also simply the way graphics is normally rendered nowadays.

  here we are                        this is seen
    drawing                           on display 
      |                                   |
      V                                   V
  .--------.   when drawing is done   .--------.
  |        |      we copy this        |        |
  |  back  | -----------------------> | front  |
  | buffer |                          | buffer |
  |________|                          |________|

In most libraries and frameworks today you don't have to care about double buffering, it's done automatically. For this reason in many frameworks you often need to indicate the end of rendering with some special command such as flip, endFrame etc. If you're going lower level, you may need to implement double buffering yourself.
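
Such an implementation may look something like the following sketch (videoRAM here is just a stand-in for wherever the platform actually wants the final pixels):

#include <stdint.h>
#include <string.h>

#define W 320
#define H 240

uint8_t videoRAM[W * H];   // stand-in for the real front buffer/display
uint8_t backBuffer[W * H]; // all drawing goes here

void drawPixel(int x, int y, uint8_t color)
{
  backBuffer[y * W + x] = color; // we never touch the display directly
}

void endFrame(void) // present the finished frame
{
  memcpy(videoRAM,backBuffer,W * H);
}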

Though we encounter the term mostly in computer graphics, the principle of using a second buffer in order to ensure the result is presented only when it's ready can be applied also elsewhere.

Let's take a small example: say we're rendering a frame in a 3D game. First we render the environment, then on top of it we render the enemies, then effects such as explosions and then at the top of all this we render the GUI. Without double buffering we'd simply be rendering all these pixels into the front buffer, i.e. the memory that is immediately shown on the display. This would lead to the user literally seeing how first the environment appears, then enemies are drawn over it, then effects and then the GUI. Even if all this redrawing takes an extremely short time, it is also the case that the final frame will be shown for a very short time before another one will start appearing, so in the result the user will see huge flickering: the environment may look kind of normal but the enemies, effects and GUI may appear transparent because they are only visible for a fraction of the frame. The user also might be able to see e.g. enemies that are supposed to be hidden behind some object if that object is rendered after the enemies. With double buffering this won't happen as we perform the rendering into the back buffer, a memory which doesn't show on the display. Only when we have completed the frame in the back buffer, we copy it to the front buffer, pixel by pixel. Here the user may see the display changing from the old frame to the new one from top to the bottom, but he will never see anything temporary, and since the old and new frames are usually very similar, this top-to-bottom update may not even be distracting (it is addressed by vertical synchronization if we really want to get rid of it).

There also exists triple buffering which uses yet another additional buffer to increase FPS. With double buffering we can't start rendering a new frame into back buffer until the back buffer has been copied to the front buffer which may further be delayed by vertical synchronization, i.e. we have to wait and waste some time. With triple buffering we can start rendering into the other back buffer while the other one is being copied to the front buffer. Of course this consumes significantly more memory. Also note that triple buffering can only be considered if the hardware supports parallel rendering and copying of data, and if the FPS is actually limited by this... mostly you'll find your FPS bottleneck is elsewhere in which case it makes no sense to try to implement triple buffering. On small devices like embedded you probably shouldn't even think about this.

Double buffering can be made more efficient by so called page flipping, i.e. allowing to switch the back and front buffer without having to physically copy the data, i.e. by simply changing the pointer of a display buffer. This has to be somehow supported by hardware.
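
With page flipping, presenting a frame becomes a mere pointer swap, e.g. (a sketch; the hardware must then be told to scan the display out from the new front buffer):

#include <stdint.h>

#define W 320
#define H 240

uint8_t bufferA[W * H], bufferB[W * H];

uint8_t *backBuffer  = bufferA, // we draw here
        *frontBuffer = bufferB; // the display shows this one

void endFrame(void)
{
  uint8_t *tmp = frontBuffer; // swap the pointers, copy no pixels
  frontBuffer = backBuffer;
  backBuffer = tmp;
}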

When do we actually need double buffering? Not always, we can avoid it or suppress its memory requirements if we need to, e.g. with so called frameless rendering -- we may want to do this e.g. in embedded programming where we want to save every byte of RAM. The mainstream computers nowadays simply always run on a very fast FPS and keep redrawing the screen even if the image doesn't change, but if you write a program that only occasionally changes what's on the screen (e.g. an e-book reader), you may simply leave out double buffering and actually render to the front buffer once the screen needs to change, the user probably won't notice any flicker during a single quick frame redraw. You also don't need double buffering if you're able to compute the final pixel color right away, for example with ray tracing you don't need any double buffering, unless of course you're doing some complex postprocessing. Double buffering is only needed if we compute a pixel color but that color may still change before the frame is finished. You may also only use a partial double buffer if that is possible (which may not be always): you can e.g. split the screen into 16 regions and render region by region, using only a 1/16th size double buffer. Using a palette can also make the back buffer smaller: if we use e.g. a 256 color palette, we only need 1 byte for every pixel of the back buffer instead of some 3 bytes for full RGB. The same goes for using a smaller resolution than the actual native resolution of the screen.


downto

Downto Operator

In C the so called "downto" operator is a joke played on nubs. It goes like this: Did you know C has a hidden downto operator -->? Try it:

#include <stdio.h>

int main(void)
{
  int n = 20;

  while (n --> 10) // n goes down to 10
    printf("%d\n",n);

  return 0;
}

Indeed this compiles and works. In fact --> is just the -- and > operators put together: the condition is parsed as (n--) > 10, i.e. compare n to 10 and then decrement it.


drummyfish

Drummyfish

Drummyfish (also known as tastyfish, drummy, drumy, smellyfish and i forcefeed my diarrhea to capitalism) is a programmer, anarchopacifist and proponent of free software/culture, who started this wiki and invented the kind of software it focuses on: less retarded software (LRS). Besides others he has written Anarch, small3dlib, raycastlib, smallchesslib, tinyphysicsengine, SAF and comun. He has also been creating free culture art and otherwise contributing to free projects such as OpenMW; he's been contributing with public domain art of all kind (2D, 3D, music, ...) and writings to Wikipedia (no longer cause ban), Wikimedia Commons (also banned now), opengameart, libregamewiki, freesound and others. Drummyfish is insane/neuroretarded, suffering from anxiety/depression/etcetc. (diagnosed avoidant personality disorder) and has more than once been called a schizo, though psychiatrists didn't officially diagnose him with schizophrenia (yet). Due to spreading uncensored truth, helping and loving others and revealing corruption he is banned and censored on many places on the Internet, including Wikipedia, Wikimedia Commons, 4chan, many subreddits, some Xonotic and Openarena servers etc. He also has no real life and is pretty retarded when it comes to leading projects or otherwise dealing with people or practical life. He is a wizard.

Drummyfish is the most physically disgusting bastard on Earth, no woman ever loved him, he is so ugly people get suicidal thoughts from seeing any part of him.

He loves all living beings, even those whose attributes he hates or who hate him. He is a vegetarian and here and there supports good causes, for example he donates hair and gives money to homeless people who ask for it.

Drummyfish has a personal website at www.tastyfish.cz, and a gopherhole at self.tastyfish.cz.

Photos of drummyfish: young, older (after being confronted with real life) and naked.

Drummyfish's real name is Miloslav Číž, he was born on 24.08.1990 and lives in Moravia, Czech Republic, Earth (he rejects the concept of a country/nationalism, the info here serves purely to specify a location). He is a more or less straight male of the white race. He started programming at high school in Pascal, then he went on to study compsci (later focused on computer graphics) at the Brno University of Technology and got a master's degree, however he subsequently refused to find a job in the industry, partly because of his views (manifested by LRS) and partly because of mental health issues (depressions/anxiety/avoidant personality disorder). He rather chose to stay closer to the working class and do less harmful slavery such as cleaning and physical spam distribution, and continues hacking on his programming (and other) projects in his spare time in order to be able to do it with absolute freedom.

In 2019 drummyfish wrote a "manifesto" of his ideas called Non-Competitive Society that describes the political ideas of an ideal society. It is in the public domain under CC0 and available for download online.

{ Why doxx myself? Following the LRS philosophy, I believe information should be free. Censorship -- even in the name of privacy -- goes against information freedom. We should live in a society in which people are moral and don't abuse others by any means, including via availability of their private information. And in order to achieve ideal society we have to actually live it, i.e. slowly start to behave as if it was already in place. Of course, I can't tell you literally everything (such as my passwords etc.), but the more I can tell you, the closer we are to the ideal society. ~drummyfish }

He likes many things such as animals, peace, freedom, programming, math and games (e.g. Xonotic and OpenArena, even though he despises competitive behavior in real life).

Does drummyfish have divine intellect? Hell no, but thanks to his extreme tendency for isolation, great curiosity and obsession with truth he is possibly the only man on Earth completely immune to propaganda, he can see the world as it is, not as it is presented, so he feels it is his moral duty to share what he is seeing. He is able to overcome his natural dumbness and low IQ by tryharding and sacrificing his social and sexual life so that he can program more. If drummyfish can learn to program LRS, so can you.


dynamic_programming

Dynamic Programming

Dynamic programming is a programming technique that can be used to make many algorithms more efficient (faster). It works on the principle of repeatedly breaking given problem down into smaller subproblems and then solving one by one from the simplest and remembering already calculated results that can be reused later.

It is usually contrasted to the divide and conquer (DAC) technique which at first sight looks similar but is in fact quite different. DAC also subdivides the main problem into subproblems, but then solves them recursively, i.e. it is a top-down method. DAC also doesn't remember already solved subproblems and may end up solving the same subproblem multiple times, wasting computational time. Dynamic programming on the other hand starts solving the subproblems from the simplest ones -- i.e. it is a bottom-up method -- and remembers solutions to already solved subproblems in some kind of a table which makes it possible to quickly reuse the results if such a subproblem is encountered again. The order of solving the subproblems should be chosen so as to maximize the efficiency of the algorithm.

It's not the case that dynamic programming is always better than DAC, it depends on the situation. Dynamic programming is effective when the subproblems overlap and so the same subproblems WILL be encountered multiple times. But if this is not the case, DAC can easily be used and the memory for the lookup tables will be saved.

Example

Let's firstly take a look at the case when divide and conquer is preferable. This is for instance the case with many sorting algorithms such as quicksort. Quicksort recursively divides parts of the array into halves and sorts each of those parts: sorting each of these parts is a different subproblem as these parts (at least mostly) differ in size, elements and their order. The subproblems therefore don't overlap and applying dynamic programming makes little sense.

But if we tackle a problem such as computing Nth Fibonacci number, the situation changes. Considering the definition of Nth Fibonacci number as a "sum of N-1th and N-2th Fibonacci numbers", we might naively try to apply the divide and conquer method:

int fib(int n)
{
  return (n < 2) ? 
    n : // start the sequence with 0, 1
    fib(n - 1) + fib(n - 2); // else add two previous
}

But we can see this is painfully slow as calling fib(n - 2) computes all values already computed by calling fib(n - 1) all over again, and this inefficiency additionally appears inside these functions recursively. Applying dynamic programming we get better code:

int fib(int n)
{
  if (n < 2)
    return n;
    
  int current = 1, prev = 0;
  
  for (int i = 2; i <= n; ++i)
  {
    int tmp = current;
    current += prev;
    prev = tmp;
  }
  
  return current;
}

We can see the code is longer, but it is faster. In this case we only need to remember the two previously computed Fibonacci numbers (in practice we may need much more memory for remembering the partial results).


earth

Earth

Well, Earth is the planet we live on. It is the third planet from the Sun in our Solar System, which is itself part of the Milky Way galaxy in the Universe. So far it is the only place known to have life.

Now behold the grand rendering of the Earth map in ASCII (equirectangular projection):

X      v      v      v      v      v      v      v      v      v      v      v      v      v      v      v      X
                        .-,./"">===-_.----..----..      :     -==- 
                     -=""-,><__-;;;<""._         /      :                     -===-
    ___          .=---""""\/ \/ ><."-, "\      /"       :      .--._     ____   __.-""""------""""---.....-----..
> -=_  """""---""           _.-"   \_/   |  .-" /"\     :  _.''     "..""    """                                <
"" _.'ALASKA               {_   ,".__     ""    '"'   _ : (    _/|                                         _  _.. 
  "-._.--"""-._    CANADA    "--"    "\              / \:  ""./ /                                     _--"","/
   ""          \                     _/_            ",_/:_./\_.'                     ASIA            "--.  \/
>               }                   /_\/               \:EUROPE      __  __                           /\|       <
                \            ""=- __.-"              /"":_-. -._ _, /__\ \ (                       .-" ) >-
                 \__   USA      _/                   """:___"   "  ",     ""                   ,-. \ __//
                    |\      __ /                     /"":   ""._..../                          \  "" \_/
>                    \\_  ."  \|      ATLANTIC      /   :          \\   <'\                     |               <
                        \ \_/| -=-      OCEAN       )   :AFRICA     \\_.-" """\                .'
       PACIFIC           "--._\                     \___:            "/        \ .""\_  <^,..-" __
        OCEAN                 \"""-""-.._               :""\         /          "     | _)      \_\INDONESIA
>.............................|..........",.............:...\......./................_\\_....__/\..,__..........<                
                              |   SOUTH    \            :   /      |                 "-._\_  \__/  \  ""-_
                               \ AMERICA   /            :  (       }                     """""===-  """""_  
                                \_        |             :   \      \                          __.-""._,"",
>                                 \      /              :   /      / |\                     ," AUSTRALIA  \     <
                                  |     |               :   \     /  \/      INDIAN         ";   __        )
                                  |     /               :    \___/            OCEAN           """  ""-._  / 
                                 /     /                :                                               ""   |\
>                                |    /                 :                                               {)   // <
                                 |   |                  :                                                   ""
                                 \_  \                  :
                                   """                  :
>                                     .,                :                                                       <
                       __....___  _/""  \               :          _____   ___.......___......-------...__
--....-----""""----""""         ""      "-..__    __......--"""""""     """                              .;_..... 
                                              """"      : ANTARCTICA
X      ^      ^      ^      ^      ^      ^      ^      ^      ^      ^      ^      ^      ^      ^      ^      X

Some numbers about the planet Earth:


easier_done_than_said

Easier Done Than Said

Easier done than said is the opposite of easier said than done.

Example: exhaling, as saying the word "exhaling" requires exhaling plus doing some extra work such as correctly shaping your mouth.


easy_to_learn_hard_to_master

Easy To Learn, Hard To Master

"Easy to learn, hard to master" (ETLHTM) is a type of design of a game (and by extension a potential property of any art or skill) which makes it relatively easy to learn to play while mastering the play (playing in near optimal way) remains very difficult.

Examples of this are games such as tetris, minesweeper or Trackmania.

LRS sees the ETLHTM design as extremely useful and desirable as it allows for creation of suckless, simple games that offer many hours of fun. With this philosophy we get a great amount of value for relatively little effort.

This is related to a fun coming from self imposed goals, another very important and useful concept in games. Self imposed goals in games are goals the player sets for himself, for example completing the game without killing anyone (so called "pacifist" gameplay) or completing it very quickly (speedrunning). Here the game serves only as a platform, a playground at which different games can be played and invented -- inventing games is fun in itself. Again, a game supporting self imposed goals can be relatively simple and offer years of fun, which is extremely cool.

The simplicity of learning a game comes from simple rules while the difficulty of its mastering arises from the complex emergent behavior these simple rules create. Mastering of the game is many times encouraged by competition among different people but also competition against oneself (trying to beat one's own score). In many simple games such as minesweeper there exists a competitive scene (based either on direct matches or some measurement of skill such as speedrunning or achieving high score) that drives people to search for strategies and techniques that optimize the play, and to train skillful execution of such play.

The opposite is hard to learn, easy to master.

See Also


education

Education

not to be confused with indoctrination

TODO


elo

Elo

The Elo system (named after Arpad Elo, NOT an acronym) is a mathematical system for rating the relative strength of players of a certain game, most notably and widely used in chess but also elsewhere (video games, table tennis, ...). Based on number of wins, losses and draws against other Elo rated opponents, the system computes a number (rating) for each player that highly correlates with that player's current strength/skill; as games are played, ratings of players are constantly being updated to reflect changes in their strength. The numeric rating can then be used to predict the probability of a win, loss or draw of any two players in the system, as well as e.g. for constructing ladders of current top players and matchmaking players of similar strength in online games. For example if player A has an Elo rating of 1700 and player B 1400, player A is expected to win in a game with player B with the probability of 85%. Besides Elo there exist alternative and improved systems, notably e.g. the Glicko system (which further adds e.g. confidence intervals).

The Elo system was created specifically for chess (even though it can be applied to other games as well, it doesn't rely on any chess specific rules) and described by Arpad Elo in his 1978 book called The Rating of Chessplayers, Past and Present, by which time it was already in use by FIDE. It replaced older rating systems, most notably the Harkness system. Despite more "advanced" systems being around nowadays, Elo remains the most widely used one.

Elo rates only RELATIVE performance, not absolute, i.e. the rating number of a player says nothing in itself, it is only the DIFFERENCE in rating points between two players that matters, so in an extreme case two players rated 300 and 1000 in one rating pool may in another one be rated 10300 and 11000 (the difference of 700 is the only thing that stays the same, mean value can change freely). This may be influenced by initial conditions and things such as rating inflation (or deflation) -- if for example a chess website assigns some start rating to new users which tends to overestimate an average newcomer's abilities, newcomers will come to the site, play a few games which they will lose, then they ragequit but they've already fed their points to the good players, causing the average rating of a good player to grow over time.

Keep in mind Elo is a big simplification of reality, as is any attempt at capturing skill with a single number -- even though it is a very good predictor of something akin to "skill" and of outcomes of games, trying to capture "skill" with a single number is similar to e.g. trying to capture such a multidimensional thing as intelligence with a single dimensional IQ number. For example due to many different areas of a game to be mastered and different playstyles, transitivity may be broken in reality: it may happen that player A mostly beats player B, player B mostly beats player C and player C mostly beats player A, which Elo won't capture.

How It Works

Initial rating of players is not specified by Elo, each rating organization applies its own method (e.g. assigning an arbitrary value of let's say 1000, or letting the player play a few unrated games to estimate his skill).

Suppose we have two players, player 1 with rating A and player 2 with rating B. In a game between them player 1 can either win, i.e. score 1 point, lose, i.e. score 0 points, or draw, i.e. score 0.5 points.

The expected score E of a game between the two players is computed using a sigmoid function (400 is just a magic constant that's usually used, it makes it so that a positive difference of 400 points makes a player 10 times more likely to win):

E = 1 / (1 + 10^((B - A)/400))

For example if we set the ratings A = 1700 and B = 1400, we get a result E ~= 0.85, i.e. in a series of many games player 1 will get an average of about 0.85 points per game, which can mean that out of 100 games he wins 85 times and loses 15 times (but it can also mean that out of 100 games he e.g. wins 70 times and draws 30). Computing the same formula from the player 2 perspective gives E ~= 0.15 which makes sense as the numbers of points the players are expected to gain have to add up to 1 (the formula says in what ratio the two players split the 1 point of the game).

After playing a game the ratings of the two players are adjusted depending on the actual outcome of the game. The winning player takes some amount of rating points from the loser (i.e. the loser loses the same amount of points the winner gains, which means the total number of points in the system doesn't change as a result of games being played). The new rating of player 1, A2, is computed as:

A2 = A + K * (R - E)

where R is the outcome of the game (for player 1, i.e. 1 for a win, 0 for a loss, 0.5 for a draw) and K is the change rate which affects how quickly the ratings will change (it can be set to e.g. 30 but may be different e.g. for new or low rated players). So with e.g. K = 25, if the game of our two players above ends up being a draw, player 2 takes 9 points from player 1 (A2 = 1691, B2 = 1409; note that drawing a weaker player is below the expected result).

How to compute Elo difference from a number of games? This is useful e.g. if we have a chess engine X with Elo EX and a new engine Y whose Elo we don't know: we may let these two engines play 1000 games, note the average result E and then compute the Elo difference of the new engine against the first engine from this formula (derived from the above formula by solving for Elo difference B - A):

B - A = log10(1 / E - 1) * 400
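
In C this computation might look like the following small sketch (an illustration only; we assume E here is the average score per game of the engine with rating A, i.e. engine X, and that E is strictly between 0 and 1):

#include <stdio.h>
#include <math.h>

// Elo difference (B - A) given E, the average score of the player/engine
// with rating A in its games against the one with rating B
double eloDifference(double E)
{
  return log10(1.0 / E - 1.0) * 400.0;
}

int main(void)
{
  // X scored 0.85 points per game on average => Y is about 300 points
  // weaker, matching the 1700 vs 1400 example above
  printf("%f\n",eloDifference(0.85)); // prints about -301
  return 0;
}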

Some Code

Here is a C code that simulates players of different skills playing games and being rated with Elo. Keep in mind the example is simple, it uses the potentially imperfect rand function etc., but it shows the principle quite well. At the beginning each player is assigned an Elo of 1000 and a random skill which is approximately normally distributed; a game between two players consists of each player drawing a random number in range from 0 to his skill number, the player that draws a bigger number wins (i.e. a player with higher skill is more likely to win).

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define PLAYERS 101
#define GAMES 10000
#define K 25          // Elo K factor

typedef struct
{
  unsigned int skill;
  unsigned int elo;
} Player;

Player players[PLAYERS];

double eloExpectedScore(unsigned int elo1, unsigned int elo2)
{
  return 1.0 / (1.0 + pow(10.0,((((double) elo2) - ((double) elo1)) / 400.0)));
}

int eloPointGain(double expectedResult, double result)
{
  return K * (result - expectedResult);
}

int main(void)
{
  srand(100);

  for (int i = 0; i < PLAYERS; ++i)
  {
    players[i].elo = 1000; // give everyone initial Elo of 1000

    // normally distributed skill in range 0-99:
    players[i].skill = 0;

    for (int j = 0; j < 8; ++j)
      players[i].skill += rand() % 100;

    players[i].skill /= 8;
  }

  for (int i = 0; i < GAMES; ++i) // play games
  {
    unsigned int player1 = rand() % PLAYERS,
                 player2 = rand() % PLAYERS;

    // let players draw numbers, bigger number wins
    unsigned int number1 = rand() % (players[player1].skill + 1),
                 number2 = rand() % (players[player2].skill + 1);

    double gameResult = 0.5;

    if (number1 > number2)
      gameResult = 1.0;
    else if (number2 > number1)
      gameResult = 0.0;
  
    int pointGain = eloPointGain(eloExpectedScore(
      players[player1].elo,
      players[player2].elo),gameResult);

    players[player1].elo += pointGain;
    players[player2].elo -= pointGain;
  }

  for (int i = PLAYERS - 2; i >= 0; --i) // bubble-sort by Elo
    for (int j = 0; j <= i; ++j)
      if (players[j].elo < players[j + 1].elo)
      {
        Player tmp = players[j];
        players[j] = players[j + 1];
        players[j + 1] = tmp;
      }

  for (int i = 0; i < PLAYERS; i += 5) // print
    printf("#%d: Elo: %d (skill: %d\%)\n",i,players[i].elo,players[i].skill);

  return 0;
}

The code may output e.g.:

#0: Elo: 1134 (skill: 62%)
#5: Elo: 1117 (skill: 63%)
#10: Elo: 1102 (skill: 59%)
#15: Elo: 1082 (skill: 54%)
#20: Elo: 1069 (skill: 58%)
#25: Elo: 1054 (skill: 54%)
#30: Elo: 1039 (skill: 52%)
#35: Elo: 1026 (skill: 52%)
#40: Elo: 1017 (skill: 56%)
#45: Elo: 1016 (skill: 50%)
#50: Elo: 1006 (skill: 40%)
#55: Elo: 983 (skill: 50%)
#60: Elo: 974 (skill: 42%)
#65: Elo: 970 (skill: 41%)
#70: Elo: 954 (skill: 44%)
#75: Elo: 947 (skill: 47%)
#80: Elo: 936 (skill: 40%)
#85: Elo: 927 (skill: 48%)
#90: Elo: 912 (skill: 52%)
#95: Elo: 896 (skill: 35%)
#100: Elo: 788 (skill: 22%)

We can see that Elo quite nicely correlates with the player's real skill.


elon_musk

Elon Mu$k

Elon Musk is an enormous capitalist dick.

TODO

Musk's company Neuralink killed 1500 animals in 4 years and was charged with animal cruelty (sauce).


e

E

Euler's number (not to be confused with Euler number), or e, is an extremely important and one of the most fundamental numbers in mathematics, approximately equal to 2.72, and is almost as famous as pi. It appears very often in mathematics and nature, it is the base of the natural logarithm, its digits after the decimal point go on forever without showing a simple pattern (just as those of pi), and it has many more interesting properties.

It can be defined in several ways:

e to 100 decimal digits is:

2.7182818284590452353602874713526624977572470936999595749669676277240766303535475945713821785251664274...

e to 100 binary digits is:

10.101101111110000101010001011000101000101011101101001010100110101010111111011100010101100010000000100...
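
One of the common definitions of e is the sum of the infinite series 1/0! + 1/1! + 1/2! + 1/3! + ...; the following small C sketch approximates e with the first 20 terms of this series:

#include <stdio.h>

int main(void)
{
  double e = 0, factorial = 1;

  for (int n = 1; n <= 20; ++n) // adds the terms 1/0! up to 1/19!
  {
    e += 1.0 / factorial;
    factorial *= n;
  }

  printf("%.15f\n",e); // prints approximately 2.718281828459045

  return 0;
}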

Just as pi, e is a real transcendental number (it is not a root of any polynomial with rational coefficients), which also means it is an irrational number (it cannot be expressed as a fraction of two integers). It is also not known whether e is a normal number, which would mean its digits contain all possible finite strings, but it is conjectured to be so.

TODO


encyclopedia

Encyclopedia

Encyclopedia (also encyclopaedia, cyclopedia or cyclopaedia, from Greek enkyklios paideia, roughly "general education") is a large book (or a series of books) providing a structured summary of wide knowledge in one or many fields of knowledge (such as mathematics, history, engineering, general knowledge etc.), usually structured as a collection of alphabetically ordered articles on terms used in the field. Paper encyclopedias are oftentimes printed in several volumes as the amount of contained information is too great for a single book (in large ones you may even see one or two volumes dedicated ONLY to the index). The largest and most famous encyclopedia to date is the online Wikipedia created by volunteers in free culture spirit, however Wikipedia suffers from significant issues such as censorship, high political propaganda and low quality of writing, therefore it is important to also stay interested in other encyclopedias such as Britannica, Americana or LRS wiki.

Encyclopedias are awesome, get as many of them as you can, especially the printed ones -- they are usually relatively cheap (especially second hand books) and provide an ENORMOUS amount of information, FOREVER (no one can cancel your physically owned paper book, you will retain it even after the collapse when such books will become practically your only source of human knowledge). Also remember, paper books are still of much higher quality than online resources such as Wikipedia -- even if they lose in terms of sheer volume, they make up for it in quality of writing and still many times contain information that's not available online, and the older ones are more objective and trustworthy, considering the decline of free speech online. Shorter articles may also do a better job at providing an overall summary of a topic and filtering out less important information, as opposed to a gigantic Wikipedia article. Furthermore even if such a book isn't free as in freedom, the knowledge, information and data contained in it are in the public domain as such things cannot (yet) be owned, therefore it is possible to legally paraphrase the information into a new source which we may make public domain itself (however watch out to not merely copy-paste texts from encyclopedias as text CAN be copyrighted, as well as e.g. the mere selection of which facts to include; always be very careful).

{ A favorite pastime of mine is looking up the same term in different encyclopedias and comparing them -- this can help get to the essence of actually understanding the term, as well as revealing censorship and different views of the authors. ~drummyfish }

Great nerds read encyclopedias linearly from start to finish just like a normal book, which may help expand one's knowledge, ignite curiosity in new things and turn up some cool interesting facts.

Similar terms: encyclopedias, which also used to be called cyclopedias in the past, are similar to dictionaries and these types of books often overlap (many encyclopedias call themselves dictionaries); the main difference is that a dictionary focuses on providing linguistic information and generally has shorter term definitions, while encyclopedias have longer articles (which however limits their total number, i.e. encyclopedias will usually prefer quality over quantity). Encyclopedias are also a subset of so called reference works, i.e. works that serve to provide information and reference to it (other kinds of reference works being e.g. world maps, tabulated values or API references). A universal/general encyclopedia is one that focuses on human knowledge at wide, as opposed to an encyclopedia that focuses on one specific field of knowledge. Compendium can be seen almost as a synonym to encyclopedia, with encyclopedias perhaps usually being more general and extensive. Almanac is also similar to encyclopedia, more focusing on tabular data. Micropedia is another term, sometimes used to denote a smaller encyclopedia (one edition of Britannica came with a micropedia as well as a larger macropedia).

These are some nice/interesting/benchmark articles to look up in encyclopedias: algorithm, anarchism, Andromeda (galaxy), Antarctica, Atlantis, atom, axiom of choice, Bible, big bang, black hole, brain, Buddhism, C (programming language), cannibalism, capitalism, castle, cat, censorship, central processing unit, chess, Chicxulub, China, color, comet, communism, computer, Creative Commons, Deep Blue, democracy, Democratic People's Republic of Korea, depression, determinism, dinosaur, dodo, dog, Doom (game), Earth, Einstein, Elo, Encyclopedia, entropy, ethics, Euler's Number, evolution, font, football, fractal, free software, game, gigantopithecus, go (game), god, GNU project, hacker, Hanging Gardens of Babylon, hardware, Hitler, Holocaust, homosexual, human, information, intelligence, Internet, IQ, Japan, Jesus, Jew, language, Latin, life, light, lightning, Linux, logarithm, logic, love, Mammoth, mathematics, Mariana Trench, Mars, Milky Way, Moon, morality, Mount Everest, music, necrophilia, Open Source, negro, nigger, pacifism, pedophilia, penis, pi, Pluto, prime number, quaternion, Pompeii, Quran, race, Roman Empire, sex, sine, schizophrenia, software, Stallman (Richard), star, Stonehenge, suicide, Sun, Tibet, technology, Tetris, time, Titanic, transistor, Troy, Tyrannosaurus Rex, UFO, universe, Unix, Uruk, Usenet, Valonia Ventricosa (bubble algae), Vatican, Venus, video game, Wikipedia, woman, World War II, World Wide Web, ...

What is the best letter in an encyclopedia? If you are super nerdy, you may start to search for your favorite starting letter -- this is fun and may also help you e.g. decide which volume of your encyclopedia to take with you when traveling. Which letter is best depends on many things, e.g. the language of the encyclopedia, its size, your area of interest and so on. Assuming English and topics that would be interesting to the readers of LRS wiki, the best letter is most likely C -- it is the second most common starting letter in dictionaries, has a great span and includes essential and interesting terms such as computer, C programming language, cat, communism, capitalism, chess, christianity, collapse, CPU, color, culture, copyleft, compiler, creative commons, cryptography, copyright, car, cancer, cellular automata, consumerism, cosine, Chomsky, CIA, cybernetics, cracking, chaos, carbon, curvature, chemistry, censorship and others. As close second comes S, the most frequent letter in dictionaries, with terms such as Stallman, science, shader, semiconductor, silicon, software, sound, socialism, state, selflessness, speech recognition, steganography, square root, sudoku, suicide, speedrun, space, star, Sun, sine, Soviet union, schizophrenia, set, suckless, shit, sex and others. { This is based on a list I made where I assigned points to each letter. The letters that follow after C and S are P, M, A, E, T, L, R, F, D, G, I, B, H, U, N, W, V, J, O, K, Q, Z, Y, X. ~drummyfish }

Notable/Nice Encyclopedias

Here is a list of notable encyclopedias, focused on general knowledge English language ones. The most notable ones are in bold.

{ See also https://wikiindex.org/. ~drummyfish }

name | year | legal status | format | ~articles | comment
Britannica 9th edition | 1889 | PD (old) | 25 vol. | | legendary enc., major edition, one of "Big Three", partly digitized (archive.org, wikisource, ...)
Britannica 11th edition | 1910 | PD (old) | 29 vol. | 40K | legendary enc., major edition, one of "Big Three", mostly digitized (both scan and txt), PC incorrect :)
Britannica Concise Encyclopedia | 2002 | proprietary | 1 vol. 2000p | 28K | nice, short descriptions, condensed from the main multivol. Brit., piratable pdf
Britannica online | ...now | proprietary | online | 130K | bloated, high quality articles, unpaid is limited and with ads
Citizendium | 2006... | proprietary? (NC) | online | 18K | Wikipedia alternative, censored, faggots have unclear license
Collier's New Encyclopedia | 1921 | PD (old) | 10 vol. | | NOT TO BE CONFUSED with Collier's Encyclopedia (different one), digitized on Wikisource (txt)
Columbia Encyclopedia | 1935... | proprietary | 1 vol. ~3Kp | ~50K | high quality, lots of information { Read the 1993 edition, it's super nice. ~drummyfish }
Conservapedia | 2006... | proprietary | online | 52K | American fascist wiki, has basic factual errors
Larousse Desk Reference Enc. | 1995 | proprietary | 1 vol. 800p | 200K? | by James Hughes, nice, quality general overviews, topic-ordered { I bought this, it's nice. ~drummyfish }
Domestic Encyclopaedia | 1802 | PD (old) | 4 vol. | | shorter articles, partially digitized on Wikisource
Encyclopedia Americana | 1820... | PD (old) | ~30 vol. | | longer articles, one of "Big Three", several editions (1906, 1920) partly digitized on wikisource
Encyclopedia Dramatica | 2004... | PD (CC0) | online | 15K | informal/fun/"offensive" but valuable info (on society, tech, ...), basically no censorship, no propaganda
Everybodywiki | 2017... | CC BY-SA | online | ~300K | alternative to Wikipedia allowing articles on non notable things and people
Google Knol | ~2010 | proprietary | online | | failed online enc. by Google, archived on archive.org
Grolier Multimedia Encyclopedia | 2003 | proprietary | CD | |
Illustrated Family Encyclopedia | 1997 | proprietary | 2 vol. 920p | 5K | kid-friendly, nice pictures, USA bias, piratable
Infogalactic | 2016... | CC BY-SA | online | 2M | Wikipedia fork, no SJW censorship, FOR PROFIT (you can buy article control lol), can't make accounts
LRS wiki | 2021... | PD (CC0) | online/elec. | 500 | best encyclopedia, focused on tech/society, no censorship
Metapedia | 2006... | GFDL | online | 7K | Wikipedia fork, online, no SJW censorship, ATM limited account creation, "pro-European" fascism
Microsoft Encarta | ...2009 | proprietary | electronic | 62K | Micro$oft enc., low quality articles (errors), MS propaganda (no free software etc. lol), is on archive.org
Simple English Wikipedia | 2001... | CC BY-SA | online | 200K | Wikipedia with simpler language and simpler explanations, censored
The New American Cyclopaedia | 1879 | PD (old) | 16 vol. | | partially digitized on Wikisource (txt)
The New International Encyc. | 1905 | PD (old) | 20 vol. | | partially digitized on Wikisource
The Nuttall Encyclopaedia | 1907 | PD (old) | 1 vol. | 16K | short articles, oldschool, digitized (gutenberg)
Vikidia | 2006... | CC BY-SA | online | 4K | "Wikipedia for kids", probably as censored as Wikipedia
Webster's Unabridged Dictionary | 1864 | PD (old) | paper | 476K | short descriptions, digitized (gutenberg)
Wikipedia | 2001... | CC BY-SA | online | 6M | largest and most famous, EXTREME PSEUDOLEFTIST CENSORSHIP AND POLITICAL PROPAGANDA, free culture
Old Wikipedia | 2001 | GFDL | online | 19K | archived old Wikipedia, less censorship, https://nostalgia.wikipedia.org
Pears' Cyclopedia | 1897 | PD (old) | 1 vol. 740p | | contains dictionary, general knowl. maps, reference etc., scanned on archive.org
World Almanac and Book of Facts | 1868... | some PD (old) | 1 vol. | | interesting and useful information, data and facts from old to new age, US-centered
The World Book | 1917... | proprietary | 22 vol. | 17K | best selling print enc., large, probably high quality
The World Book 1917 | 1917 | PD (old) | 8 vol. | 3K | nicely readable
Uncyclopedia | 2005... | proprietary (NC) | online | 37K | parody, fun enc., "more normie friendly dramatica"

See Also


english

English

"English Motherfucker, do you speak it?"

English is a natural human language spoken mainly in the USA, UK and Australia as well as in dozens of other countries and in all parts of the world. It is the default language of the world. It is a pretty simple and suckless language (even though not as suckless as Esperanto), even a braindead man can learn it { Knowing Czech and learning Spanish, which is considered one of the easier languages, I can say English is orders of magnitude simpler. ~drummyfish }. It is the lingua franca of the tech world and many other worldwide communities. Thanks to its simplicity (lack of declension, fixed word order etc.) it is pretty suitable for computer analysis and as a basis for programming languages.

If you haven't noticed, this wiki is written in English.


entrepreneur

Entrepreneur

Entrepreneur is an individual practicing legal slavery and legal theft under capitalism; capitalists describe those actions by euphemisms such as "doing business". Successful entrepreneurs can also be seen as murderers as they consciously firstly hoard resources that poor people lack (including basic resources needed for living) and secondly cause and perpetuate situations such as the third world slavery where people die on a daily basis performing extremely difficult, dangerous and low paid work, so that the entrepreneur can buy his ass yet another private jet.


entropy

Entropy

Entropy is a quite cryptic, often misunderstood scientific term that may have different definitions depending on the specific field and context; intuitively it can be interpreted as an amount of disorder, uncertainty or randomness. There are two main kinds of entropy: information entropy (information theory) and thermodynamic entropy (physics).

Information Entropy

Information entropy is a basic concept in information theory -- watch out, this kind of entropy is different from entropy in physics (which is described below). We use entropy to express an "amount of hidden information" in events, messages, codes etc. This can be used e.g. to design compression algorithms, help utilize bandwidths better etc.

Let's first define what information means in this context (note that the meaning of information here is kind of mathematical, not exactly equal to the meaning of information used in common speech). For a random event (such as a coin toss) with probability p the amount of information we get by observing it is

I(p) = log2(1/p) = -1 * log2(p)

The unit of information here is the bit (note the base 2 of the logarithm -- other bases can be used too but then the units are called differently), in information theory also known as the shannon. Let's see how the definition behaves: the less probable an event is, the more information its observation gives us (with probability 0, i.e. an impossible event, theoretically giving infinite information), while probability 1 gives zero information (observing something we know will happen tells us literally nothing).

Now an entropy of a random variable X, which can take values x1, x2, x3, ..., xn with probabilities q1, q2, q3, ..., qn is defined as

H(X) = sum(qi * I(qi)) = sum(qi * log2(1/qi))

How does entropy differ from information? Well, they are measured in the same units (bits), the difference is in the interpretation -- under the current context information is basically what we know, while entropy is what we don't know, the uncertainty. So entropy of a certain message (or rather of the probability distribution of possible messages to receive) says how much information will be gained by receiving it -- once we receive the message, the entropy kind of "turns into information", so the amount of information and entropy is actually the same. Perhaps the relationship is similar to that of energy and work in physics -- both are measured in the same units, energy is the potential for work and can be converted to it.

Entropy is greater if unpredictability ("randomness") is greater -- it is at its maximum if all possible values of the random variable are equally likely. For example entropy of a coin toss is 1 bit, given both outcomes are equally likely (if one outcome was more likely than the other, entropy would go down).

More predictable events have lower entropy -- for example English text has quite low entropy because it is pretty easy to predict missing letters from other letters (there is a lot of redundancy in human language). Thanks to this we can compress the text, e.g. using Huffman code -- compression reduces size, i.e. removes redundancy/correlation/predictability, and so increases entropy.

Example: consider a weather forecast for a specific area, day and hour -- our weather model predicts rain with 55% probability, cloudy with 30% probability and sunny with 15% probability. Once the specific day and hour comes, we will receive a message about the ACTUAL weather that there was in the area. What entropy does such a message have? According to the formula above: H = 0.55 * log2(1/0.55) + 0.3 * log2(1/0.3) + 0.15 * log2(1/0.15) ~= 1.4 bits. That is the entropy and amount of information such a message gives us.
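
To make the formula concrete, here is a tiny C sketch that computes the entropy of a discrete probability distribution, tried on the weather example above (just an illustration, the function and array names are made up):

#include <stdio.h>
#include <math.h>

// entropy (in bits) of a random variable with given value probabilities
double entropy(const double *probabilities, int count)
{
  double result = 0;

  for (int i = 0; i < count; ++i)
    if (probabilities[i] > 0) // events with p = 0 contribute nothing
      result += probabilities[i] * log2(1.0 / probabilities[i]);

  return result;
}

int main(void)
{
  double weather[] = { 0.55, 0.3, 0.15 }; // rain, cloudy, sunny

  printf("%f\n",entropy(weather,3)); // prints about 1.4, as computed above

  return 0;
}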

How is information entropy related to the physics entropy?

TODO

Physics Entropy

TODO

But WHY does entropy increase in the time-forward direction? One may ask: if the laws of nature are time-symmetric, why is the forward direction of time special in that entropy increases in it? Just WHY is it so? Well, it is not so really, entropy simply increases in both the time-forward and time-backward directions from a point of low entropy. Such a point of low entropy may be e.g. the Big Bang, since which entropy has been increasing in the time direction pointing from the Big Bang towards us. Or the low entropy point may be a compressed gas; if we let such gas expand, its entropy will increase towards the future, but we may also look to the past in which the gas had high entropy before we compressed it, i.e. here entropy locally increases also towards the past. This is shown in the following image:

time
^        future
|  . . . .  .  .. . .     higher entropy (gas has expanded)
|   . . .  .  .   . 
|    . . .  .  . .
|      .. . ..  ..
|        . .. ..
|_________....__________  low entropy (gas is compressed)
|        .. . .
|       . . .. .
|      .. ..  .  .
|    .  ..  . . . .
|   . . . .. .  . .. .    higher entropy (we start compressing)
v         past

esolang

Esoteric Programming Language

So called esoteric programming languages (esolangs) are highly experimental and fun programming languages that employ bizarre ideas. Popular languages of this kind include Brainfuck, Chef or Omgrofl.

There is a wiki for esolangs, the Esolang Wiki. If you want to behold esolangs in all their beauty, see https://esolangs.org/wiki/Hello_world_program_in_esoteric_languages_(nonalphabetic_and_A-M). The Wiki is published under CC0!

Some notable ideas employed by esolangs are:

Esolangs are great because:

History

INTERCAL, made in 1972 by Donald Woods and James Lyon, is considered the first esolang in history: it was specifically designed to be different from traditional languages, so for example a level of politeness was introduced -- if there weren't enough PLEASE labels in the source code, the compiler would refuse to compile the program.

In 2005 the Esolang Wiki was started.

Specific Languages

The following is a list of some notable esoteric languages.

See Also


ethics

Ethics

Ethics is the study of morality. (For more see the article on morality.)

TODO?


everyone_does_it

Everyone Does It

"Everyone does it" is an argument quite often used by simps to justify their unjustifiable actions. It is often used alongside the "just doing my job" argument.

The argument has a valid use, however it is rarely used in the valid way. We humans, as well as other higher organisms, have evolved to mimic the behavior of others because such behavior is tried, others have tested such behavior for us (for example eating a certain plant that might potentially be poisonous) and have survived it, therefore it is likely also safe for us to do. So we have to realize that "everyone does it" is an argument for safety, not for morality. But people nowadays mostly use the argument as an excuse for their immoral behavior, i.e. something that's supposed to make the bad things they do "not bad" because "if it was bad, others wouldn't be doing it". That's of course wrong, people do bad things and the argument "everyone does it" helps people do them, for example during the Nazi holocaust this excuse partially allowed some of the greatest atrocities in history. Nowadays under capitalism it is used to excuse taking part in unethical practices, e.g. those of corporations.

So if you tell someone "You shouldn't do this because it's bad" and he replies "Well, everyone does it", he's really (usually) saying "I know it's bad but it's safe for me to do".

The effect is of course abused by politicians: once you get a certain number of people moving in a certain shared direction, others will follow just by the need to mimic others. Note that just creating an illusion (using the tricks of marketing) of "everyone doing something" is enough -- that's why you see 150 year old grannies in ads using modern smartphones -- it's to force old people into thinking that other old people are using smartphones so they have to do it as well.

Another potentially valid use of the argument is in the meaning of "everyone does it so I am FORCED to do it as well". For example an employer could argue "I have to abuse my employees otherwise I'll lose the edge on the market and will be defeated by those who continue to abuse their employees". This is very true but it seems like many people don't see or intend this meaning.


evil

Evil

Evil always wins in the end.


exercises

Exercises

Here we list exercises for the readers of the wiki. Allow yourself only as many helpers and resources as keep each problem challenging: with each problem you should either find out you know the solution or learn something new while solving it.

Problems in each category are ordered from easiest to most difficult. A listed solution may not be the only possible solution, just one of them.

General Knowledge

  1. What is the difference between free software and open source?

Solutions

  1. The free software and open source movements are technically very similar but extremely different in spirit, i.e. while most free software licenses are also open source and vice versa (with small exceptions such as CC0), free software is fundamentally about pursuing user freedom and ethics while open source is a later capitalist fork of free software that removes talk about ethics, aims to exploit free licenses for the benefit of business and is therefore unethical.

Programming

  1. Write a C program that prints out all prime numbers under 1000 as well as the total count of these prime numbers.

Solutions

1:

// Sieve of Eratosthenes algorithm, one possible way to generate prime numbers
#include <stdio.h>
#define N 1000

char primeMap[N];

int main(void)
{
  int primeCount = 0;

  for (int i = 0; i < N; ++i)
    primeMap[i] = 1;

  for (int i = 2; i < N; ++i)
  {
    if (primeMap[i])
    {
      primeCount++;
      printf("%d\n",i);
    }

    int j = i;

    while (1) // mark all multiples of i non-primes
    {
      j += i;

      if (j >= N)
        break;

      primeMap[j] = 0; // can't be a prime
    }
  }

  printf("prime count under %d: %d\n",N,primeCount);

  return 0;
}

Math

Solutions


explicit

Explicit

Explicit is something that's directly expressed; it is the opposite of implicit.


f2p

Free To Play

Free to play (F2P) is a "business model" of predatory proprietary games that's based on the same idea as giving children free candy so that they get into your van so that you can rape them.


facebook

Facebook

"Facebook has no users, it only has useds." --rms

TODO


faggot

Faggot

Faggot is a synonym for gay.


fail_ab

Type A/B Fail

Type A and type B fails are two very common cases of failing to adhere to the LRS politics/philosophy by only a small margin. Most people don't come even close to LRS politically or by their life philosophy -- these are simply general failures. Then there are a few who ALMOST adhere to LRS politics and philosophy but fail in an important point, either by being/supporting pseudoleft (type A fail) or being/supporting right (type B fail). The typical cases are the following (specific cases may not fully fit these, of course):

Both types are furthermore prone to falling victim to privacy obsession, productivity obsession, hero worshipping, use of violence, diseases such as distro hopping, consumerism and similar defects.

Type A/B fails are the "great filter" of the rare kind of people who show a great potential for adhering to LRS. It may be due to the modern western culture that forces a right-pseudoleft false dichotomy that even those showing a high degree of non-conformance eventually slip into the trap of being caught by one of the two poles. These two fails seem to be a manifestation of an individual's true motive of self interest which is culturally fueled with great force -- those individuals then try to not conform and support non-mainstream concepts like free culture or sucklessness, but eventually only with the goal of self interest. It seems to be extremely difficult to abandon this goal, much more so than simply non-conforming. Maybe it's also the subconscious knowledge that adhering completely to LRS means an extreme loneliness; being a type A/B fail means being a part of a minority, but still having a supportive community, not being completely alone.

However these kinds of people may also pose a hope: if we could educate them and "fix their failure", the LRS community could grow rapidly. If realized, this step could even be seen as the main contribution of LRS -- uniting the misguided rightists and pseudoleftists by pointing out errors in their philosophies (errors that may largely be intentionally forced by the system anyway exactly to create the hostility between the non-conforming, as a means of protecting the system).


                  __
                .'  '.                  
               /      \                   drummyfish
            _.'        '._                    |
___....---''              ''---....___________v___
                               |           |
             normies           |    A/B    | LRS
               FAIL            |    fail   |

fantasy_console

Fantasy Console

Fantasy console, also fantasy computer, is a software platform intended mainly for creating and playing simple games, which imitates parameters, simplicity and look and feel of classic retro consoles such as GameBoy. These consoles are called fantasy because they are not emulators of already existing hardware consoles but rather "dreamed up" platforms, virtual machines made purely in software with artificially added restrictions that a real hardware console might have. These restrictions limit for example the resolution and color depth of the display, number of buttons and sometimes also computational resources.

The motivation behind creating fantasy consoles is normally twofold: firstly the enjoyment of retro games and retro programming, and secondly the immense advantages of simplicity. It is much faster and easier to create a simple game than a full fledged PC game, this attracts many programmers, simple programming is also more enjoyable (fewer bugs and headaches) and simple games have many nice properties such as small size (playability over web), easy embedding or enabling emulator-like features.

Fantasy consoles usually include some kind of simple IDE; a typical mainstream fantasy console both runs and is programmed in a web browser so as to be accessible to normies. They also use some kind of easy scripting language for game programming, e.g. Lua. Even though the games are simple, the code of such a mainstream console is normally bloat, i.e. we are talking about pseudominimalism. Nevertheless some consoles, such as SAF, are truly suckless, free and highly portable (it's not a coincidence that SAF is an official LRS project).

Notable Fantasy Consoles

The following are a few notable fantasy consoles.

name | license | game lang. | parameters | comment
CToy | zlib | C | 128x128 | suckless
LIKO-12 | MIT | Lua | 192x128 |
PICO-8 | propr. | Lua | 128x128 4b | likely most famous
PixelVision8 | MS-PL (FOSS) | Lua | 256x240 | written in C#
Pyxel | MIT | Python | 256x256 4b |
SAF | CC0 | C | 64x64 8b | LRS, suckless
TIC-80 | MIT | Lua, JS, ... | 240x136 4b | paid "pro" version
Uxn | MIT | Tal | | very minimal

See Also


faq

Frequently Asked Questions

Not to be confused with fuck or frequently questioned answers.

{ answers by ~drummyfish }

Is this a joke? Are you trolling?

No. Jokes are here.

What the fuck?

See WTF.

How does LRS differ from suckless, KISS, free software and similar types of software?

Sometimes these sets may greatly overlap and LRS is at times just a slightly different angle of looking at the same things, but in short LRS cherry-picks the best of other things and is much greater in scope (it focuses on the big picture of whole society). I have invented LRS as my own take on suckless software and then expanded its scope to encompass not just technology but the whole society -- as I cannot speak on behalf of the whole suckless community (and sometimes disagree with them a lot), I have created my own "fork" and simply set my own definitions without worrying about misinterpreting, misquoting or contradicting someone else. LRS advocates very similar technology to that advocated by suckless, but it furthermore has its specific ideas and areas of focus. The main point is that LRS is derived from an unconditional love of all life rather than some shallow idea such as "productivity". In practice this leads to such things as a high stress put on public domain and legal safety, altruism, selflessness, anti-capitalism, accepting games as desirable type of software, NOT subscribing to the productivity cult, rejecting privacy, cryptocurrencies etc. While suckless is apolitical and its scope is mostly limited to software and its use for "getting job done", LRS speaks not just about technology but about the whole society -- there are two main parts of LRS: less retarded software and less retarded society.

One way to see LRS is as a philosophy that takes only the good out of existing philosophies/movements/ideologies/etc. and adds them to a single unique idealist mix, without including cancer, bullshit, errors, propaganda and other negative phenomena plaguing basically all existing philosophies/movements/ideologies/etc.

Why this obsession with extreme simplicity? Is it because you're too stupid to understand complex stuff?

I used to be the mainstream, complexity embracing programmer. I am in no way saying I'm a genius but I've put a lot of energy into studying computer science full time for many years so I believe I can say I have some understanding of the "complex" stuff. I speak from own experience and also on behalf of others who shared their experience with me that the appreciation of simplicity and realization of its necessity comes after many years of dealing with the complex and deep insight into the field and into the complex connections of that field to society.

You may ask: well then, why is it just you and a few weirdos who see this, why don't most good programmers share your opinions? Because they need to make a living or because they simply WANT to make a lot of money and so they do what the system wants them to do. Education in technology (and generally just being exposed to corporate propaganda since birth) is kind of a trap: it teaches you to embrace complexity and when you realize it's not a good thing, it is too late, you already need to pay your student loan, your rent, your mortgage, and the only thing they want you to do is to keep this complexity cult rolling. So people just do what they need to do and many of them just psychologically make themselves believe something they subconsciously know isn't right because that makes their everyday life easier to live. "Everyone does it so it can't be bad, better not even bother thinking about it too much". It's difficult doing something every day that you think is wrong, so you make yourself believe it's right.

It's not that we can't understand the complex. It is that the simpler things we deal with, the more powerful things we can create out of them as the overhead of the accumulated complexity isn't burdening us so much.

Simplicity is crucial not only for the quality of technology, i.e. for example its safety and efficiency, but also for its freedom. The more complex technology becomes, the fewer people can control it. If technology is to serve all people, it has to be simple enough so that as many people as possible can understand it, maintain it, fix it, customize it, improve it. It's not just about being able to understand a complex program, it's also about how much time and energy it takes because time is a price not everyone can afford, even if they have the knowledge of programming. Even if you yourself cannot program, if you are using a simple program and it breaks, you can easily find someone with a basic knowledge of programming who can fix it, unlike with a very complex program whose fix will require a corporation.

Going for the simple technology doesn't necessarily have to mean we have to give up the "nice things" such as computer games or 3D graphics. Many things, such as responsiveness and customizability of programs, would improve. Even if the results won't be so shiny, we can recreate much of what we are used to in a much simpler way. You may now ask: why don't companies do things simply if they can? Because complexity benefits them in creating de facto monopolies, as mentioned above, by reducing the number of people who can tinker with their creations. And also because capitalism pushes towards making things quickly rather than well -- and yes, even non commercial "FOSS" programs are pushed towards this, they still compete and imitate the commercial programs. Already now you can see how technology and society are intertwined in complex ways that all need to be understood before one comes to realize the necessity of simplicity.

How would your ideal society work? Isn't it utopia?

See the article on less retarded society, it contains a detailed FAQ especially on that.

Why the name "less retarded"? If you say you're serious about this, why not a more serious name?

I don't know, this is not so easy to answer because I came up with the name back when the project was smaller in scope and I didn't think about a name too hard: this name was playful, catchy, politically incorrect (keeping SJWs away) and had a kind of reference to suckless, potentially attracting attention of suckless fans. It also has the nice property of being unique, with low probability of name collision with some other existing project, as not many people will want to have the word "retarded" in the name. Overall the name captures the spirit of the philosophy and is very general, allowing it to be applied to new areas without being limited to certain means etc.

Now that the project has evolved a bit the name actually seems to have been a great choice and I'm pretty happy about it, not just for the above mentioned reasons but also because it is NOT some generic boring name that politicians, PR people and other tryhard populists would come up with. In a way it's trying to stimulate thought and make you think (if only by making you ask WHY anyone would choose such a name). Yes, in a way it's a small protest showing we stay away from the rotten mainstream, but it's definitely NOT an attempt at catching attention at any cost or trying to look like cool rebels -- such mentality goes against our basic principles. Perhaps the greatest reason for the name is to serve as a test -- truth should prevail no matter what name it is given and we try to test and prove this, or rather prevent success for the wrong reasons -- we are not interested in success as such (which is what mere politicians seek); if our ideas are to become accepted, they have to be accepted for the right reasons. And if you refuse to accept truth because you don't like its name, you are retarded and by your own ignorance doom yourself to live in a shit society with shit technology.

Who writes this wiki? Can I contribute?

You can only contribute to this wiki if you're a straight white male. Just kidding, you can't contribute even if you're a straight white male.

At the moment it's just me, drummyfish. This started as a collaborative wiki named Based Wiki, but after some disagreements I forked it (practically everything had been written by me at that point) and made it my own wiki where I don't have to make any compromises or respect anyone else's opinions. I'm not opposed to the idea of collaboration but I bet we disagree on something, in which case I probably don't want to let you edit this. I also resist allowing contributions because with multiple authors the chance of legal complications grows, even if the work is under a free license or waiver (refer to e.g. the situation where some Linux developers were threatening to withdraw the license to their code contributions). But you can totally fork this wiki, it's public domain.

If you want to contribute to the cause, just create your own website, spread the ideas you liked here -- you may or may not refer to LRS, everything's up to you. Start creating software with LRS philosophy if you can -- together we can help evolve and spread our ideas in a decentralized way, without me or anyone else being an authority, a potential censor. That's the best way forward I think.

Why is it called a wiki when it's written just by one guy? Is it to deceive people into thinking there's a whole movement rather than just one weirdo?

Yes.

No, of course not you dumbo. There is no intention of deception. This project started as a collaborative wiki with multiple contributors, named Based Wiki, however I (drummyfish) forked my contributions (most of the original wiki) into my own wiki and renamed it to Less Retarded Wiki because I didn't like the direction of the original wiki. At that point I was still allowing and looking for more contributors, but somehow none of the original people came to contribute, and meanwhile I expanded my LRS Wiki to the point at which I decided it's simply a snapshot of my own views, so I kept it my own project under the name I had already established, the LRS Wiki. Even though at the moment it's missing the main feature of a wiki, i.e. collaboration of multiple people, it is still a project that most people would likely call a "wiki" naturally (even if only a personal one) due to having all the other features of wikis (separate articles linked via hypertext, non-linear structure etc.) and simply looking like a wiki -- nowadays there are many wikis that are mostly written by a single man (see e.g. small fandom wikis) and people still call them wikis, because culturally the term has simply taken a wider meaning; people don't expect a wiki to absolutely necessarily be collaborative, so there is no deception. Additionally I am still open to the idea of possibly allowing contributions, so I'm simply keeping this a wiki; the wiki is in a sense waiting for a larger community to come. Finally, the ideas I present here are not just mine but really do reflect existing movements/philosophies with significant numbers of supporters (suckless, free software, ...).

Since it is public domain, can I take this wiki and do anything with it? Even something you don't like, like sell it or rewrite it in a different way?

Yes, you can do anything... well, anything that's not otherwise illegal like falsely claiming authorship (copyright) of the original text. This is not because I care about being credited, I don't (you DON'T have to give me any credit), but because I care about this wiki not being owned by anyone. You can however claim copyright to anything you add to the wiki if you fork it, as that's your original creation.

Why not keep politics out of this Wiki and make it purely about technology?

Firstly, for us technological progress is secondary to the primary type of progress in society: social progress. The goal of our civilization is to provide good conditions for life -- this is social progress, mankind's main goal. Technological progress only serves to achieve this, so technological progress follows from the goals of social progress. So, to define technology we have to first know what it should help achieve in society. And for that we need to talk politics.

Secondly, examining any subject in depth requires understanding its context anyway. Politics and technology nowadays are very much intertwined and the politics of a society ultimately significantly affects what its technology looks like (capitalist SW, censorship, bloat, spyware, DRM, ...), what goals it serves (consumerism, productivity, control, war, peace, ...) and how it is developed (COCs, free software, ...), so studying technology ultimately requires understanding the politics around it. I hate arguing about politics, sometimes it literally makes me suicidal, but it is inevitable, we have to specify real-life goals clearly if we're to create good technology. Political goals guide us in making important design decisions about features, tradeoffs and other attributes of technology.

Of course you can fork this wiki and try to remove politics from it, but I think it won't be possible to just keep the technology part alone so that it would still make sense, most things will be left without justification and explanation.

What is the political direction of LRS then?

In three words: basically anarcho-pacifist communism. However the word culture may be more appropriate than "politics" here, as we aim for removing traditional systems of government based on power and enforcement of complex laws; there shall be no politicians in today's sense in our society. For more details see the article about LRS itself.

Why do you blame everything on capitalism when most of the issues you talk about, like propaganda, surveillance, exploitation of the poor and general abuse of power, appeared also under practically any other systems we've seen in history?

This is a good point. We talk about capitalism simply because it is the system of today's world and an immediate threat that needs to be addressed, however we always try to stress that the root issue lies deeper: it is competition that we see as causing all major evil. Competition between people is what has always caused the main issues of a society, no matter whether the system at the time was called capitalism, feudalism or pseudosocialism. While historically competition and conflict between people were mostly forced by nature, nowadays we've mastered technology to a degree at which we could practically eliminate competition, yet we choose to artificially preserve it via capitalism, the glorification of competition -- we see this as an extremely wrong direction, hence we put stress on opposing capitalism, i.e. the artificial prolonging of competition.

How is this different from Wikipedia?

In many ways. Our wiki is better e.g. by being more free (completely public domain, no "fair use" proprietary images etc.), less bloated, better accessible, not infected by pseudoleftist fascism and censorship (we only censor absolutely necessary things, e.g. copyrighted material or things that would immediately put us in jail, though we still say many things that may get us in jail), we have articles that are more readable etc.

WTF I am offended, is this a nazi site? Are you racist/Xphobic? Do you love Hitler?!?!

We're not fascists, we're in fact the exact opposite: our aim is to create technology that benefits everyone equally without any discrimination. I (drummyfish) am personally a pacifist anarchist, I love all living beings and believe in absolute social equality of all life forms. We invite and welcome everyone here, be it gays, communists, rightists, trannies, pedophiles or murderers, we love everyone equally, even you and Hitler.

Note that the fact that we love someone (e.g. Hitler) does NOT mean we embrace his ideas (e.g. Nazism) or even that we e.g. like the way he looks. You may hear us say someone is a stupid ugly fascist, but even such individuals are living beings we love.

What we do NOT engage in is political correctness, censorship, offended culture, identity politics and pseudoleftism. We do NOT support fascist groups such as feminists and LGBT and we will NOT practice bullying and codes of conduct. We do not pretend there aren't any differences between people and we will make jokes that make you feel offended.

Why do you use the nigger word so much?

To counter its censorship, we mustn't be afraid of words. The more they censor something, the more I am going to uncensor it. They have to learn that the only way to make me not say that word so often is to stop censoring it, so to their action of censorship I produce a reaction they dislike. That's basically how you train a dog. (Please don't ask who "they" are, it's pretty obvious).

It also has the nice side effect of making this less likely to be used by corporations and SJWs.

How can you say you love all living beings and use offensive language at the same time?

The culture of being offended is bullshit, it is a pseudoleftist (fascist) invention that serves as a weapon to justify censorship, canceling and bullying of people. Since I love all people, I don't support any weapons against anyone (not even against people I dislike or disagree with). People are offended by language because they're taught to be offended by it by the propaganda, I am helping them unlearn it.

But how can you so pretentiously preach "absolute love" and then say you hate capitalists, fascists, bloat etc.?

OK, firstly we do NOT love everything, we do NOT advocate against hate itself, only against hate of living beings (note we say we love everyone, not everything). Hating other things than living beings, such as some bad ideas or malicious objects, is totally acceptable, there's no problem with it. We in fact think hate of some concepts is necessary for finding better ways.

Now when it comes to "hating" people, there's an important distinction to be stressed: we never hate a living being as such, we may only hate their properties. So when we say we hate someone, it's merely a matter of language convenience -- saying we hate someone never means we hate a person as such, but only some thing about that person, for example his opinions, his work, actions, behavior or even appearance. I can hear you ask: what's the difference? The difference is we'll never try to eliminate a living being or cause it suffering because we love it, we may only try to change, in non-violent ways, their attributes we find wrong (which we hate): for example we may try to educate the person, point out errors in his arguments, give him advice, and if that doesn't work we may simply choose to avoid his presence. But we will never target hate against him.

And yeah, of course sometimes we make jokes and sarcastic comments; we rely on your ability to recognize those yourself. We see it as retarded and a great insult to intelligence to put disclaimers on jokes, that's really the worst thing you can do to a joke.

So you really "love" everyone, even dicks like Trump, school shooters, instagram influencers etc.?

Yes, but it may need an elaboration. There are many different kinds of love: love of a sexual partner, love of a parent, love of a pet, love of a hobby, love of nature etc. Obviously we can't love everyone with the same kind of love we have e.g. for our life partner, that's impossible if we've actually never even seen most people who live on this planet. The love we are talking about -- our universal love of everyone -- is an unconditional love of life itself. Being alive is a miracle, it's beautiful, and as living beings we feel a sense of connection with all other living beings in this universe who were for some reason chosen to experience this rare miracle as well -- we know what it feels like to live and we know other living beings experience this special, mysterious privilege too, though for a limited time. This is the most basic kind of love, an empathy, the happiness of seeing someone else live. It is sacred, there's nothing more pure in this universe than feeling this empathy, it works without language, without science, without explanation. While not all living beings are capable of this love (a virus probably won't feel any empathy), we believe all humans have this love in them, even if it's being suppressed by their environment that often forces them compete, hate, even kill. Our goal is to awaken this love in everyone as we believe it's the only way to achieve a truly happy coexistence of us, living beings.

I dislike this wiki, our teacher taught us that global variables are bad and that OOP is good.

This is not a question you dummy. Have you even read the title of this page? Anyway, your teacher is stupid, he is, very likely unknowingly, just spreading the capitalist propaganda. He probably believes what he's saying but he's wrong.

Lol you've got this fact wrong and you misunderstand this and this topic, you've got bugs in code, your writing sucks etc. How dare you write about things you have no clue about?

I want a public domain encyclopedia that includes topics of new technology, and also one which doesn't literally make me want to kill myself due to inserted propaganda of evil etc. Since this supposedly modern society failed to produce even a single such encyclopedia and since every idiot on this planet wants to keep his copyright on everything he writes, I am forced to write the encyclopedia myself, even for the price of making mistakes. No, US public domain doesn't count as world wide public domain. Even without copyright there are still so called moral rights etc. Blame this society for not allowing even a tiny bit of information to slip into public domain. Writing my own encyclopedia is literally the best I can do in the situation I am in. Nothing is perfect, I still believe this can be helpful to someone. You shouldn't take facts from a random website for granted. If you wanna help me correct errors, email me.

Why is this shit so poorly written and inconsistent (typos, incorrect comma placement, inconsistent acronym case, weird grammar etc.)?

Mainly for these reasons:

How can you use CC0 if you, as anarchists, reject laws and intellectual property?

We use it to remove law from our project, it's kind of like using a weapon to destroy itself. Using a license such as GFDL would mean we're keeping our copyright and are willing to enforce intellectual property laws, whereas using a CC0 waiver means we GIVE UP all lawful exclusive rights that have been forced on us. This has no negative effects: if law applies, then we use it to remove itself, and if it doesn't, then nothing happens. To those who acknowledge the reality that adapting proprietary information can lead to being bullied by the state we give a guarantee this won't happen, and others simply don't have to care.

A simple analogy is this: the law is so fucked up nowadays that it forces us to point a gun at everyone by default when we create something. It's as if they literally put a gun in our hand and force us to point it at someone. We decide to drop that weapon, not merely promise to not shoot.

What software does this wiki use?

Git, the articles are written in markdown and converted to HTML with a simple script.

I don't want my name associated with this, can you remove a reference to myself or my software from your wiki?

No.

How can you think you are the smartest man in universe? How can you say your opinions are facts? Isn't it egoistic?

I don't think I am the smartest at all, in fact I am pretty dumb, I just have a gift of being completely immune to propaganda and seeing the world clearly, and I happen to be in circumstances under which I can do what others can't; for example as I have no friends and no one likes me, I can write and create freely, without self censorship due to fear of losing my job, offending my friends etc. I can write close to what is the absolute truth thanks to all this. I am also super autistic in that I enjoy just thinking 24/7 about programming and stuff instead of thinking about money and watching ads, which compensates for my dumbness a bit.

How do I know my opinions are facts? Experience. How do we discover facts? There is never a 100% certainty of anything, even of mathematical proofs, we may only ever have a great statistical confidence and beliefs so strong we call them facts. Just as by walking into a wall 1000 times you learn you won't walk through it, I have over the decades learned I am correct in what I say and that everyone else is simply a monkey incapable of thinking. I used to be the kind of guy "open to discussion and opinions of others", I was giving this approach a chance over and over for about 30 years, I had more than enough patience, but it didn't work, the world has failed. People are absolutely stupid, you can physically show them something, give them tons of evidence and proofs, and they won't believe what is literally in front of their eyes -- no, not even intellectuals, people in universities etc. Talking to others and listening to them is a complete waste of life, it's like trying to talk to potatoes or rocks, I might just as well be punching air all day or trying to eat dirt. The best I found I can do now is kind of talk to myself here, record my brain dump in hopes someone will once understand. I really don't know what else to do.

There is nothing egoistic about being special in something, everyone has a talent for something; egoism is about being fascist, preoccupied with oneself and focused on self benefit, on achieving fame, recognition etc., which I try my best to never do. Maybe I sometimes slip, being the imperfect man I am, I make mistakes, I do stupid stuff, but I honestly just want the good of everyone without putting myself in the front.

Are you the only one in the world who is not affected by propaganda?

It definitely seems so.

How does it feel to be the only one on this planet to see the undistorted truth of reality?

Pretty lonely and depressing.

Are you a crank?

Depending on exact definition the answer is either "no" or "yes and it's a good thing".

Are you retarded?

:( Maybe, but even stupid people can sometimes have smart ideas.


fascism

Fascism

Fascist groups are subgroups of society that strongly pursue self interest to the detriment of others (those who are not part of said group). Fascism is a rightist, competitive tendency; fascists aim to make themselves as strong, as powerful and as rich as possible, i.e. to weaken and possibly eliminate competing groups, to have power over them, enslave them and to seize their resources. The means of their operation are almost exclusively evil, including violence, bullying, wars, propaganda, eye for an eye, slavery etc.

A few examples of fascist groups are corporations, nations, NSDAP (Nazis), LGBT, feminists, Antifa, KKK, Marxists and, of course, the infamous Italian fascist party of Benito Mussolini.

Fascism is always bad and we have to aim towards eliminating it (that is eliminating fascism, NOT fascists -- fascists are people and living beings to whom we wish no harm). However here comes a great warning: in eliminating fascism be extremely careful not to become a fascist yourself. We purposefully do NOT advise to "fight" fascism, as fight implies violence, the tool of fascism. Elimination of fascism has to be done in a non-violent way. Sadly, generation after generation keeps repeating the same mistake over and over: they keep opposing fascism by fascist means, eventually taking the oppressor's place and becoming the new oppressor, only to again be dethroned by the next generation. This has happened e.g. with feminism and other pseudoleftist movements. This is an endless cycle of stupidity but, more importantly, endless suffering of people. This cycle needs to be ended. We must choose not the easy way of violence, but the difficult way of non-violent rejection, which includes loving the enemy as we love ourselves. Fascism is all about loving one's own group while hating the enemy groups -- if we can achieve loving all groups of people, even fascists themselves, fascism will have been by definition eliminated.

Fear is the fuel of fascism. When fear of an individual reaches a certain level -- which is different for everyone -- he turns to fascism. Even someone who is normally anti-fascist has a breaking point; under extreme pressure of fear one starts to seek purely selfish goals. This is why e.g. capitalism fuels fear culture: it makes people fascists, which is a prerequisite for becoming a capitalist. When "leaders" of nations need to wage war, they start spreading propaganda of fear so as to turn people into fascists who easily become soldiers. This is why education is important in eliminating fascism: it is important to e.g. show that we need not be afraid of people of other cultures, of sharing information and resources etc. The bullshit of fear propaganda has to be exposed.


fascist

Fascist

See fascism.


fear_culture

Fear Culture

A frightened man gives up his freedom for the promise of safety.

TODO


feminism

Feminism

Sufficiently advanced stupidity is indistinguishable from feminism. --drummyfish's law

Feminism, also feminazism, is a fascist terrorist pseudoleftist movement aiming for establishing female as the superior gender, for social revenge on men and gaining political power, e.g. that over language. Similarly to LGBT, feminism is violent, toxic and harmful, based on brainwashing, mass hysteria, bullying (e.g. the metoo campaign) and propaganda.

LMAO, this just sums up the feminist ways: a supposed woman writer who won a 1 million euro prize turned out to actually be three men, see Carmen Mola :)

If anything's clear, it's that feminism is not at all about gender equality but about hatred towards men. Firstly, the movement is not called a gender equality movement but feminism, i.e. for-female, and as we know, a name plays a huge role. To a feminist a man is what a jew was to a Nazi; the whole story is repeated again, we have yet again not learned a bit from our history. Indeed, women have historically been oppressed and needed support, but once women reach social equality -- which has basically already happened a long time ago now -- the feminist movement will, if only by social inertia, keep pursuing more advantages for women (what else should a movement called feminism do?), i.e. at this point the new goal has already become female superiority. In the age of capital no one is going to just dissolve a movement because it has already reached its goal; such a movement presents political capital one will simply not throw out of the window, so feminists will forever keep saying they're being oppressed and will forever keep inventing new bullshit issues to keep fighting. Note for example that feminists care about things such as the wage gap but of course absolutely don't give a damn about inequality in the opposite direction, such as men dying on average much younger than women etc. -- feminism cares about women, not equality. And of course, when men establish "men's rights" movements, suddenly feminists see those as "fascist", "toxic" and "violent" and try to destroy such movements.

Part of the success of feminism is also capitalism -- women with privileges, e.g. that of not having to work as much as men, are not acceptable under capitalism; everyone has to be exploited as much as possible, everyone has to be a work slave. Therefore capitalist propaganda promotes ideas such as "women not having to work is oppression by men and something a woman should be ashamed of", which is of course laughable, but with enough brainwashing anything can be established, even the most ridiculous and obvious bullshit.

Apparently in Korea feminists already practice segregation: they separate parking spots for men and women so as to prevent women from bumping into men or meeting a man late at night, because allegedly men are more aggressive and dangerous. Now this is pretty ridiculous, it is exactly the same as if they separated parking lots for black and white people because black people are statistically more often involved in crime and you wouldn't want to meet them at night. So, do we still want to pretend feminists are not fascist?


femoid

Femoid

See woman.


fight_culture

Fight Culture

Fight culture is the harmful mindset of seeing any endeavor as a fight against something. Even such causes as aiming for the establishment of peace are seen as fighting the people who are against peace, which is funny but also sad. Fight culture, just by the constant repetition of the word fight, keeps alive a subconscious validation of violence as a justified and necessary means of achieving any goal. Fight culture is to a great degree the culture of capitalist society (of course not exclusively), the environment of extreme competition and hostility.

We, of course, see fight culture as inherently undesirable in a good society, as that needs to be based on peace, love and collaboration, not competition. For this reason we never say we "fight" anything; we rather aim for goals, look for solutions, educate and sometimes reject, refuse and oppose bad concepts (e.g. fight culture itself).


finished

Finished

A finished project is completed, working and doesn't need regular maintenance; it serves its users and doesn't put any more significant burden of development cost on anyone. A finished project is not necessarily perfect and bugless, it is typically just working as intended, greatly stable, usable, well optimized and good enough. In a sane society (such as lrs) when we start a project, we are deciding to invest some effort into it with the promise of one day finishing it and then only benefiting from it for evermore; however under capitalism's update culture nothing really gets finished, projects are started with the goal of developing them forever so as to enslave more and more people (or, as capitalists put it, "create jobs" for them), which is extremely harmful. Finished projects often have the version number 1.0, however under capitalist update culture this is just a checkpoint towards implementing basic features which doesn't really imply the project is finished (after 1.0 they simply aim for 2.0, 3.0 etc.). Always aim for projects that will be finished (even if potentially not by you); sure, even in a good society SOME projects may be "perpetual" in nature -- for example an encyclopedia that's updated every 5 years with new knowledge and discoveries -- however this should only be the case where NECESSARY, and the negative effects of this perpetual nature should be minimized (for example with the encyclopedia we should make the update span as large as possible, let's say 5 years as opposed to 1 year, and we should yield a nice and tidy release after every update).

Examples of projects that have been finished are:

How to make greatly finishable projects?


firmware

Firmware

Firmware is a type of very basic software that's usually preinstalled on a device from factory and serves to provide the most essential functionality of the device. On simple devices, like mp3 players or remote controls, firmware may be all that's ever needed for the device's functioning, while on more complex ones, such as personal computers, firmware (e.g. BIOS or UEFI) allows basic configuration and installation of more complex software (such as an operating system) and possibly provides functions that the installed software can use. Firmware is normally not meant to be rewritten by the user and is installed in some kind of memory that's not very easy to rewrite, it may even be hard-wired in which case it becomes something on the very boundary of software and hardware.


fixed_point

Fixed Point

Fixed point arithmetic is a simple and often good enough method of computer representation of fractional numbers (i.e. numbers with higher precision than integers, e.g. 4.03), as opposed to floating point, which is a more complicated way of doing this that we in most cases consider a worse, bloated alternative. Probably in 99% of cases when you think you need floating point, fixed point will do just fine.

Fixed point has at least these advantages over floating point:
It is extremely simple to understand and implement (it's basically just integers, see below).
It requires no special hardware support and no libraries, so it works (and works fast) even on the simplest computers.
Its behavior is better predictable: numbers are spaced uniformly and there is no complex rounding magic going on.

How It Works

Fixed point uses a fixed (hence the name) number of digits (bits in binary) for the integer part and the rest for the fractional part (whereas floating point's fractional part varies in size). I.e. we split the binary representation of the number into two parts (integer and fractional) by IMAGINING a radix point at some place in the binary representation. That's basically it. Fixed point therefore spaces numbers uniformly, as opposed to floating point whose spacing of numbers is non-uniform.

So, we can just use an integer data type as a fixed point data type, there is no need for libraries or special hardware support. We can also perform operations such as addition the same way as with integers. For example if we have a binary integer number represented as 00001001, 9 in decimal, we may say we'll be considering a radix point after let's say the sixth place, i.e. we get 000010.01 which we interpret as 2.25 (2^2 + 2^(-2)). The binary value we store in a variable is the same (as the radix point is only imagined), we only INTERPRET it differently.

We may look at it this way: we still use integers but we use them to count smaller fractions than 1. For example in a 3D game where our basic spatial unit is 1 meter our variables may rather contain the number of centimeters (however in practice we should use powers of two, so rather 1/128ths of a meter). In the example in previous paragraph we count 1/4ths (we say our scaling factor is 1/4), so actually the number represented as 00000100 is what in floating point we'd write as 1.0 (00000100 is 4 and 4 * 1/4 = 1), while 00000001 means 0.25.

This has just one consequence: we have to normalize results of multiplication and division (addition and subtraction work just as with integers, we can normally use the + and - operators). I.e. when multiplying, we have to divide the result by the inverse of the fractions we're counting, i.e. by 4 in our case (1/(1/4) = 4). Similarly when dividing, we need to MULTIPLY the result by this number. This is because we are using fractions as our units and when we multiply two numbers in those units, the units multiply as well, i.e. in our case multiplying two numbers that count 1/4ths gives a result that counts 1/16ths, and we need to divide this by 4 to get the number of 1/4ths back again (this works the same as e.g. units in physics: multiplying a number of meters by a number of meters gives meters squared). For example the following integer multiplication:

00001000 * 00000010 = 00010000 (8 * 2 = 16)

in our system has to be normalized like this:

(000010.00 * 000000.10) / 4 = 000001.00 (2.0 * 0.5 = 1.0)

SIDE NOTE: in practice you may see division replaced by the shift operator (instead of /4 you'll see >> 2).

With this normalization we also have to think about how to bracket expressions to prevent rounding errors and overflows, for example instead of (x / y) * 4 we may want to write (x * 4) / y; imagine e.g. x being 00000010 (0.5) and y being 00000100 (1.0), the former would result in 0 (incorrect, rounding error) while the latter correctly results in 0.5. The bracketing depends on what values you expect to be in the variables so it can't really be done automatically by a compiler or library (well, it might probably be somehow handled at runtime, but of course, that will be slower). There are also ways to prevent overflows e.g. with clever bit hacks.
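To make the bracketing issue concrete, here is a tiny C sketch using the 1/4ths scaling from above (the variable names are just illustrative):

int x = 2; // 0.5 in our 1/4th units
int y = 4; // 1.0 in our 1/4th units

int bad  = (x / y) * 4; // gives 0: the integer division 2 / 4 truncates to 0 first
int good = (x * 4) / y; // gives 2, i.e. 0.5: multiplying first keeps the precision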

The normalization and bracketing are basically the only things you have to think about, apart from this everything works as with integers. Remember that this all also works with negative numbers in two's complement, so you can use a signed integer type without any extra trouble.

Remember to always use a power of two scaling factor -- this is crucial for performance. I.e. you want to count 1/2s, 1/4ths, 1/8ths etc., but NOT 1/10ths, as might be tempting. Why are powers of two good here? Because computers work in binary and so the normalization operations with powers of two (division and multiplication by the scaling factor) can easily be optimized by the compiler to a mere bit shift, an operation much faster than multiplication or division.
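For example a normalized multiplication may in C look like this (a small sketch with a hypothetical UNIT; with a power of two scaling factor the division typically compiles to a shift):

#define UNIT 1024 // scaling factor, 2^10, i.e. we count 1/1024ths

int fixedMultiply(int a, int b)
{
  // for non-negative values this is the same as (a * b) >> 10,
  // and compilers will typically generate such a shift here
  return (a * b) / UNIT;
}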

Code Example

For start let's compare basic arithmetic operations in C written with floating point and the same code written with fixed point. Consider the floating point code first:

float 
  a = 21,
  b = 3.0 / 4.0,
  c = -10.0 / 3.0;
  
a = a * b;   // multiplication
a += c;      // addition
a /= b;      // division
a -= 10;     // subtraction
a /= 3;      // division
  
printf("%f\n",a);

Equivalent code with fixed point may look as follows:

#define UNIT 1024       // our "1.0" value

int 
  a = 21 * UNIT,
  b = (3 * UNIT) / 4,   // note the brackets, (3 / 4) * UNIT would give 0
  c = (-10 * UNIT) / 3;

a = (a * b) / UNIT;     // multiplication, we have to normalize
a += c;                 // addition, no normalization needed
a = (a * UNIT) / b;     // division, normalization needed, note the brackets
a -= 10 * UNIT;         // subtraction
a /= 3;                 // division by a number NOT in UNITs, no normalization needed
  
printf("%d.%d%d%d\n",   // writing a nice printing function is left as an exercise :)
  a / UNIT,
  ((a * 10) / UNIT) % 10,
  ((a * 100) / UNIT) % 10,
  ((a * 1000) / UNIT) % 10);

These examples output 2.185185 and 2.184, respectively.

Now consider another example: a simple C program using fixed point with 10 fractional bits, computing square roots of numbers from 0 to 10.

#include <stdio.h>

typedef int Fixed;

#define UNIT_FRACTIONS 1024 // 10 fractional bits, 2^10 = 1024

#define INT_TO_FIXED(x) ((x) * UNIT_FRACTIONS)

Fixed fixedSqrt(Fixed x)
{
  // stupid brute force square root

  int previousError = -1;
  
  for (int test = 0; test <= x; ++test)
  {
    int error = x - (test * test) / UNIT_FRACTIONS;

    if (error == 0)
      return test;
    else if (error < 0)
      error *= -1;

    if (previousError > 0 && error > previousError)
      return test - 1; // error started growing, so the previous test was closest

    previousError = error;
  }

  return 0;
}

void fixedPrint(Fixed x)
{
  printf("%d.%03d",x / UNIT_FRACTIONS,
    ((x % UNIT_FRACTIONS) * 1000) / UNIT_FRACTIONS);
}

int main(void)
{
  for (int i = 0; i <= 10; ++i)
  {
    printf("%d: ",i);
    
    fixedPrint(fixedSqrt(INT_TO_FIXED(i)));
    
    putchar('\n');
  }
  
  return 0;
}

The output is:

0: 0.000
1: 1.000
2: 1.414
3: 1.732
4: 2.000
5: 2.236
6: 2.449
7: 2.645
8: 2.828
9: 3.000
10: 3.162

fizzbuzz

FizzBuzz

TODO

#include <stdio.h>

int main(void)
{
  for (int i = 1; i <= 100; ++i)
    switch ((i % 3 == 0) + (i % 5 == 0) * 2) // 2 bit value: divisibility by 3 and by 5
    {
      case 1: printf("Fizz\n"); break;
      case 2: printf("Buzz\n"); break;
      case 3: printf("FizzBuzz\n"); break;
      default: printf("%d\n",i); break;
    }

  return 0;
}

float

Floating Point

In programming floating point (colloquially just float) is a way of representing fractional numbers (such as 5.13) and approximating real numbers (i.e. numbers with higher than integer precision), which is a bit more complex than simpler methods for doing so (such as fixed point). The core idea is to use a radix ("decimal") point that's not fixed but can move around, so as to allow representation of both very small and very big values. Nowadays floating point is the standard way of approximating real numbers in computers (floating point types are called real in some programming languages, even though they represent only rational numbers; floats can't e.g. represent pi exactly), basically all of the popular programming languages have a floating point data type that adheres to the IEEE 754 standard, all personal computers also have a floating point hardware unit (FPU) and so it is widely used in all modern programs. However most of the time a simpler representation of fractional numbers, such as the mentioned fixed point, suffices, and weaker computers (e.g. embedded) may lack the hardware support, so floating point operations are emulated in software and therefore slow -- remember, float rhymes with bloat. Prefer fixed point.

Floating point is tricky: it works most of the time, but a danger lies in programmers relying on this kind of magic too much; some new generation programmers may not even be very aware of how float works. Even though the principle is not so hard, the emergent behavior of the math is really complex. One floating point expression may evaluate differently on different systems, e.g. due to different rounding settings. One possible pitfall is working with big and small numbers at the same time -- due to differing precision at different scales small values simply get lost when mixed with big numbers, and sometimes this has to be worked around with tricks (see e.g. the devlog of The Witness where a float time variable sent into a shader is periodically reset so as to not grow too large and cause the mentioned issue). Another famous trickiness of float is that you shouldn't really compare floats for equality with the normal == operator, as small rounding errors may make even mathematically equal expressions unequal (i.e. you should use some range comparison instead).
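Such a range comparison may look e.g. like this (a minimal sketch; the function name is made up and the right tolerance depends on your specific use case):

#include <math.h>

int floatsAlmostEqual(float a, float b, float tolerance)
{
  return fabsf(a - b) <= tolerance; // consider equal if the difference is tiny
}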

And there is more: floating point behavior really depends on the language you're using (and possibly even compiler, its setting etc.) and it may not be always completely defined, leading to possible nondeterministic behavior which can cause real trouble e.g. in physics engines.

{ Really as I'm now getting down the float rabbit hole I'm seeing what a huge mess it all is, I'm not nearly an expert on this so maybe I've written some BS here, which just confirms how messy floats are. Anyway, from the articles I'm reading even being an expert on this issue doesn't seem to guarantee a complete understanding of it :) Just avoid floats if you can. ~drummyfish }

Is floating point literally evil? Well, of course not, but it is extremely overused. You may need it for precise scientific simulations, e.g. numerical integration, but as our small3dlib shows, you can comfortably do even 3D rendering without it. So always consider whether you REALLY need float. You mostly do NOT need it.

Simple example of avoiding floating point: many noobs think that if they e.g. need to multiply some integer x by, let's say, 2.34 they have to use floating point. This is of course false and just proves most retarddevs don't know elementary school math. Multiplying x by 2.34 is the same as (x * 234) / 100, which we can further turn into an approximately equal expression dividing by a power of two: (x * 2396) / 1024. Indeed, given e.g. x = 56 we get the same integer result 131 in both cases, the latter completely avoiding floating point.
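The trick spelled out in C (a small sketch, variable names are just illustrative):

int x = 56;

int resultFloat = x * 2.34;          // forces floating point math, gives 131
int resultInt   = (x * 2396) / 1024; // integers only, same result: 131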

How It Works

The very basic idea is following: we have digits in memory and in addition we have a position of the radix point among these digits, i.e. both digits and position of the radix point can change. The fact that the radix point can move is reflected in the name floating point. In the end any number stored in float can be written with a finite number of digits with a radix point, e.g. 12.34. Notice that any such number can also always be written as a simple fraction of two integers (e.g. 12.34 = 1 * 10 + 2 * 1 + 3 * 1/10 + 4 * 1/100 = 617/50), i.e. any such number is always a rational number. This is why we say that floats represent fractional numbers and not true real numbers (real numbers such as pi, e or square root of 2 can only be approximated).

More precisely floats represent numbers by representing two main parts: the base -- actual encoded digits, called mantissa (or significand etc.) -- and the position of the radix point. The position of radix point is called the exponent because mathematically the floating point works similarly to the scientific notation of extreme numbers that use exponentiation. For example instead of writing 0.0000123 scientists write 123 * 10^-7 -- here 123 would be the mantissa and -7 the exponent.

Though various numeric bases can be used, in computers we normally use base 2, so let's consider it from now on. So our numbers will be of format:

mantissa * 2^exponent

Note that besides mantissa and exponent there may also be other parts, typically there is also a sign bit that says whether the number is positive or negative.

Let's now consider an extremely simple floating point format based on the above. Keep in mind this is an EXTREMELY NAIVE, inefficient format that wastes values. We won't consider negative numbers. We will use 6 bits for our numbers: the leftmost 3 bits will be the mantissa and the rightmost 3 bits will be the exponent, stored in two's complement (i.e. ranging from -4 to 3).

So for example the binary representation 110011 stores mantissa 110 (6) and exponent 011 (3), so the number it represents is 6 * 2^3 = 48. Similarly 001101 represents 1 * 2^-3 = 1/8 = 0.125.

Note a few things: firstly our format is shit because some numbers have multiple representations, e.g. 0 can be represented as 000000, 000001, 000010, 000011 etc., in fact we have 8 zeros! That's unforgivable and formats used in practice address this (usually by prepending an implicit 1 to mantissa).

Secondly notice the non-uniform distribution of our numbers: while we have a nice resolution close to 0 (we can represent 1/16, 2/16, 3/16, ...), our resolution in high numbers is low (the highest number we can represent is 56 but the second highest is 48, we can NOT represent e.g. 50 exactly). Realize that obviously with 6 bits we can still represent only 64 numbers at most! So float is NOT a magical way to get more numbers, with integers on 6 bits we can represent numbers from 0 to 63 spaced exactly by 1 and with our floating point we can represent numbers spaced as close as 1/16th but only in the region near 0, we pay the price of having big gaps in higher numbers.

Also notice that things like simple addition of numbers become more difficult and time consuming: you have to include conversions and rounding -- while with fixed point addition is a single machine instruction, same as integer addition, here with a software implementation we might end up with dozens of instructions (specialized hardware can perform the addition fast, but still, not all computers have that hardware).

Rounding errors will appear and accumulate during computations: imagine the operation 48 + 1/8. Both numbers can be represented in our system but not the result (48.125). We have to round the result and end up with 48 again. Imagine you perform 64 such additions in succession (e.g. in a loop): mathematically the result should be 48 + 64 * 1/8 = 56, which is a result we can represent in our system, but we will nevertheless get the wrong result (48) due to rounding errors in each addition. So the behavior of float can be non intuitive and dangerous, at least for those who don't know how it works.
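The same kind of error accumulation is easy to demonstrate with real floats too, e.g. with this small C sketch that repeatedly adds 0.01 (a value not exactly representable in binary):

#include <stdio.h>

int main(void)
{
  float sum = 0;

  for (int i = 0; i < 10000; ++i)
    sum += 0.01f; // each addition rounds, the errors accumulate

  printf("%f\n",sum); // prints a value close to, but not exactly, 100

  return 0;
}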

Standard Float Format: IEEE 754

IEEE 754 is THE standard basically all computers use for floating point nowadays -- it specifies the exact representation of floating point numbers as well as rounding rules, required operations applications should implement etc. However note that the standard is kind of shitty -- even if we want to use floating point numbers there exist better ways, such as posits, that outperform this standard. Nevertheless IEEE 754 has been established in the industry to the point that it's unlikely to go away anytime soon. So it's good to know how it works.

Numbers in this standard are signed, have positive and negative zero (oops), can represent plus and minus infinity and different NaNs (not a number). In fact there are thousands to billions of different NaNs which are basically wasted values. These inefficiencies are addressed by the mentioned posits.

Briefly the representation is the following (hold on to your chair): the leftmost bit is the sign bit, then the exponent follows (the number of bits depends on the specific format), the rest of the bits is the mantissa. In the mantissa an implicit 1. is considered (except when the exponent is all 0s), i.e. we "imagine" 1. in front of the mantissa bits but this 1 is not physically stored. The exponent is in so called biased format, i.e. we have to subtract half (rounded down) of its maximum possible value to get the real value (e.g. if we have 8 bits for the exponent and the directly stored value is 120, we have to subtract 255 / 2 = 127 to get the real exponent value, in this case -7). However two values of the exponent have special meaning: all 0s signify a so called denormalized (also subnormal) number, in which we consider the exponent to be the otherwise lowest possible one (e.g. -126 in case of an 8 bit exponent) but do NOT consider the implicit 1 in front of the mantissa (we instead consider 0.), which allows storing zero (positive and negative) and very small numbers. All 1s in the exponent signify either infinity (positive and negative), in case the mantissa is all 0s, or a NaN otherwise -- considering here we have the whole mantissa plus the sign bit unused, we actually have many different NaNs (WTF), but usually we only distinguish two kinds of NaNs: quiet (qNaN) and signaling (sNaN, throws an exception), distinguished by the leftmost bit of the mantissa (1 for qNaN, 0 for sNaN).

The standard specifies many formats that are either binary or decimal and use various numbers of bits. The most relevant ones are the following:

name                                 M bits  E bits  smallest and biggest number                  precision <= 1 up to
binary16 (half precision)            10      5       2^(-24), 65504                               2048
binary32 (single precision, float)   23      8       2^(-149), 2^127 * (2 - 2^-23) ~= 3 * 10^38   16777216
binary64 (double precision, double)  52      11      2^(-1074), ~10^308                           9007199254740992
binary128 (quadruple precision)      112     15      2^(-16494), ~10^4932                         ~10^34

Example? Let's say we have the float (binary32) value 11000000111100000000000000000000: the first bit (sign) is 1, so the number is negative. Then we have 8 bits of exponent: 10000001 (129), which converted from the biased format (subtracting 127) gives the exponent value of 2. Then the mantissa bits follow: 11100000000000000000000. As we're dealing with a normal number (exponent bits are neither all 1s nor all 0s), we have to imagine the implicit 1. in front of the mantissa, i.e. our actual mantissa is 1.11100000000000000000000 = 1.875. The final number is therefore -1 * 1.875 * 2^2 = -7.5.
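The above decoding may be sketched in C e.g. like this (assuming a 32 bit unsigned int and IEEE 754 floats, and ignoring the special cases of denormals, infinities and NaNs):

#include <stdio.h>
#include <string.h>

int main(void)
{
  float f = -7.5;
  unsigned int bits;

  memcpy(&bits,&f,sizeof(bits)); // reinterpret the float's bits as an integer

  unsigned int sign     = bits >> 31;                         // 1 sign bit
  int exponent          = (int) ((bits >> 23) & 0xff) - 127;  // 8 exponent bits, unbiased
  unsigned int mantissa = bits & 0x7fffff;                    // 23 mantissa bits (no implicit 1.)

  // for -7.5 prints: sign: 1, exponent: 2, mantissa: 0x700000
  printf("sign: %u, exponent: %d, mantissa: 0x%x\n",sign,exponent,mantissa);

  return 0;
}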

See Also


floss

FLOSS

FLOSS (free libre and open source) is basically FOSS.


football

Football

Not to be confused with any American pseudosport.

Football is one of the most famous sport games in which two teams face each other and try to score goals by kicking an inflated ball. It is one of the best sports not only because it is genuinely fun to play and watch but also because of its essentially simple rules, accessibility (not for rich only, all that's really needed is something resembling a ball) and relatively low discrimination -- basically anyone can play it, unlike for example basketball in which height is key; in amateur football even fat people can take part (they are usually assigned the role of a goalkeeper). Idiots call football soccer.

We, LRS, highly value football, as it's a very KISS sport that can be played by anyone anywhere without needing expensive equipment. It is the sport of the people, very popular in poor parts of the world.

Football can be implemented as a video game or inspire a game mode -- this has been done e.g. in Xonotic (the Nexball mode) or SuperTuxKart. There is a popular mainstream proprietary video game called Rocket League in which cars play football (INB4 zoomers start calling football "Rocket League with people").

Rules

As football is so widely played on all levels and all around the world, there are many versions and rule sets of different games in the football family, and it can sometimes be difficult to even say what classifies as football and what's a different sport. There are games like futsal and beach football that may or may not be seen as different sports. The most official rules of what we'd call football are probably those known as the Laws of the Game, governed by the International Football Association Board (IFAB) -- these rules are used e.g. by FIFA, various national competitions etc. Some organizations, e.g. some in the US, use different but usually similar rules. Needless to say these high level rules are pretty complex -- the Laws of the Game have over 200 pages and talk not just about the mechanics of the game but also about things such as allowed advertising, political and religious symbolism, referee behavior etc.

Here is a simple ASCII rendering of the football pitch:

  C1_________________________________________C2
   |                    :                    |
   |                    :                    |
   |........            :            ........|
   |       :            :            :       |
   |....   :            :            :   ....|
 __|   :   :            :            :   :   |__
|G :   :   :            :            :   :   : G|
|1 :   :   :            O            :   :   : 2|
|__:   :   :            :            :   :   :__|
   |...:   :            :            :   :...|
   |       :            :            :       |
   |.......:            :            :.......|
   |                    :                    |
   |                    :                    |
   |____________________:____________________|
  C3                                         C4
 

In amateur games simpler rules are used -- a summary of such rules follows:


fork

Fork

Fork is a branch that splits from the main branch of a project and continues to develop in a different direction as a separate version of that project, possibly becoming a completely new one. This may happen with any "intellectual work" or idea, such as software, a movement, a theory, a literary universe, a religion or, for example, a database. Forks may later be merged back into the original project or continue and diverge far away; forks of different projects may also combine into a single project.

For example the Android operating system and Linux-libre kernel have both been forked from Linux. Linux distributions highly utilize forking, e.g. Devuan or Ubuntu and Mint are forked from Debian. Free software movement was forked into open source, free culture and suckless, and suckless was more or less forked into LRS. Wikipedia also has forks such as Metapedia. Memes evolve a lot on the basis of forking.

Forking takes advantage of the ability to freely duplicate information, i.e. if someone sees how to improve an intellectual work or use it in a novel way, he may simply copy it and start developing it in a new diverging direction while the original continues to exist and go its own way. That is unless copying and modification of information is artificially prevented, e.g. by intellectual property laws or purposeful obscurity standing in the way of remixing. For this reason forking is very popular in free culture and free software where it is allowed both legally and practically -- in fact it plays a very important role there.

In software development temporary forking is used for implementing individual features which, when completed, are merged back into the main branch. This is called branching and is supported by version control systems such as git.

There are two main kinds of forks:
hard fork: the fork splits away and becomes a separate project that's developed independently of the original, diverging from it.
soft fork: the fork stays close to the original and keeps following it, e.g. a user's patched and configured personal version of a program, possibly merged back later.

Is forking good? Yes, to create anything new it is basically necessary to build on top of someone else's work, to stand on someone else's shoulders. Some people criticize too much forking; for example some cry about Linux distro fragmentation, they say there are too many distros and that people should rather focus their energy on creating a single or at least fewer good operating systems, i.e. that forking is kind of "wasting effort". LRS supports any kind of wild forking and experimentation, we believe the exploration of many directions to be necessary in order to find the right one; in a good society waste of work won't be happening -- that's an issue of a competitive society, not of forking.

In fact we think that (at least soft) forking should be incorporated on a much more basic level, in the way that the suckless community popularized. In suckless everyone's copy of software is a personal fork, i.e. software is distributed in source form and is so extremely easy to compile and modify that every user is supposed to do this as part of the installation process (even if he isn't a programmer). Before compilation user applies his own selected patches, custom changes and specific configuration (which is done in the source code itself) that are unique to that user and which form source code that is the user's personal fork. Some of these personal forks may even become popular and copied by other users, leading to further development of these forks and possible natural rise of very different software. This should lead to natural selection, survival and development of the good and useful forks.


formal_language

Formal Language

The field of formal languages tries to mathematically and rigorously view problems as languages; this includes probably most structures we can think of, from human languages and computer languages to visual patterns and other highly abstract structures. Formal languages are at the root of theoretical computer science and are important e.g. for the theory of computability/decidability, computational complexity, security and compilers, but they also find use in linguistics and other fields of science.

A formal language is defined as a (potentially infinite) set of strings (which are finite but unlimited in length) over some alphabet (which is finite). I.e. a language is a subset of E* where E is a finite alphabet (a set of letters). (* is the Kleene star and signifies the set of all possible strings over E.) A string belonging to a language may be referred to as a word or perhaps even a sentence, but this word/sentence is actually a whole kind of text written in the language, if we think of it in terms of our natural languages. The C programming language can be seen as a formal language which is the set of all strings that are a valid C program that compiles without errors etc.

For example, given an alphabet [a,b,c], a possible formal language over it is [a,ab,bc,c]. Another, different possible language over this alphabet is an infinite language [b,ab,aab,aaab,aaaab,...] which we can also write with a regular expression as a*b. We can also see e.g. English as being a formal language equivalent to a set of all texts over the English alphabet (along with symbols like space, dot, comma etc.) that we would consider to be in English as we speak it.

What is this all good for? This mathematical formalization allows us to classify languages and understand their structure, which is necessary e.g. for creating efficient compilers, but also to understand computers as such, their power and limits, as computers can be viewed as machines for processing formal languages. With these tools researchers are able to come up with proofs of different properties of languages, which we can exploit. For example, within formal languages it has been proven that certain languages are uncomputable, i.e. there are some problems which a computer cannot ever solve (a typical example is the halting problem), so we don't have to waste time on trying to create such algorithms as we will never find any. The knowledge of formal languages can also guide us in designing computer languages: e.g. we know that regular languages are extremely simple to implement, and so, if we can, we should prefer our languages to be regular.
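To get a feel for it, here is a possible C sketch of a finite state automaton deciding the regular language a*b from the example above (the function name is made up):

#include <stdio.h>

int matchesAStarB(const char *s)
{
  int state = 0; // 0: reading 'a's, 1: read the final 'b', 2: reject

  for (; *s; ++s)
    switch (state)
    {
      case 0: state = (*s == 'a') ? 0 : ((*s == 'b') ? 1 : 2); break;
      case 1: state = 2; break; // anything after 'b' means reject
      default: break;           // stay in the reject state
    }

  return state == 1;
}

int main(void)
{
  // prints: 1 1 0
  printf("%d %d %d\n",matchesAStarB("aaab"),matchesAStarB("b"),matchesAStarB("ba"));
  return 0;
}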

Classification

We usually classify formal languages according to the Chomsky hierarchy, by their computational "difficulty". Each level of the hierarchy has associated models of computation (grammars, automatons, ...) that are able to compute all languages of that level (remember that a level of the hierarchy is a superset of the levels below it and so also includes all the "simpler" languages). The hierarchy is more or less as follows:

  type 3 (regular languages): the simplest class, computed e.g. by regular expressions, regular grammars and finite state automata.
  type 2 (context-free languages): computed e.g. by context-free grammars and pushdown automata.
  type 1 (context-sensitive languages): computed e.g. by context-sensitive grammars and linear bounded automata.
  type 0 (recursively enumerable languages): the most general class, computed e.g. by unrestricted grammars and Turing machines.

Note that here we are basically always examining infinite languages as finite languages are trivial. If a language is finite (i.e. the set of all strings of the language is finite), it can automatically be computed by any type 3 computational model. In real life computers are actually always equivalent to a finite state automaton, i.e. the weakest computational type (because a computer memory is always finite and so there is always a finite number of states a computer can be in). However this doesn't mean there is no point in studying infinite languages, of course, as we're still interested in the structure, computational methods and approximating the infinite models of computation.

NOTE: When trying to classify a programming language, we have to be careful about what we classify: one thing is what a program written in given language can compute, and another thing is the language's syntax. To the former all strict general-purpose programming languages such as C or JavaScript are type 0 (Turing complete). From the syntax point of view it's a bit more complicated and we need to further define what exactly a syntax is (where is the line between syntax and semantic errors): it may be (and often is) that syntactically the class will be lower. There is actually a famous meme about Perl syntax being undecidable.


forth

Forth

Forth is a based minimalist stack-based untyped programming language with postfix (reverse Polish) notation.

{ It's kinda like usable brainfuck. ~drummyfish }

It is usually presented as an interpreted language but may as well be compiled; in fact it maps pretty nicely to assembly.

There are several Forth standards, most notably ANSI Forth from 1994.

A free interpreter is e.g. GNU Forth (gforth).

Language

The language is case-insensitive.

The language operates on an evaluation stack: e.g. the operation + takes the two values at the top of the stack, adds them together and pushes the result back to the stack. Besides this there are also some "advanced" features like variables living outside the stack, if you want to use them.

The stack is composed of cells: the size and internal representation of the cell is implementation defined. There are no data types, or rather everything is just of type signed int.

Basic abstraction of Forth is so called word: a word is simply a string without spaces like abc or 1mm#3. A word represents some operation on the stack (and possibly other effects such as printing to the console), for example the word 1 pushes the number 1 on top of the stack, the word + performs the addition on top of the stack etc. The programmer can define his own words which can be seen as "functions" or rather procedures or macros (words don't return anything or take any arguments, they all just invoke some operations on the stack). A word is defined like this:

: myword operation1 operation2 ... ;

For example a word that computes the average of the two values on top of the stack can be defined as:

: average + 2 / ;
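The new word then behaves just like the built-in ones, so e.g. the following should print 7:

4 10 average .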

Built-in words include:

GENERAL:

+           add                 a b -> (a + b)
-           subtract            a b -> (a - b)
*           multiply            a b -> (a * b)
/           divide              a b -> (a / b)
=           equals              a b -> (-1 if a = b else 0)
<           less than           a b -> (-1 if a < b else 0)
>           greater than        a b -> (-1 if a > b else 0)
mod         modulo              a b -> (a % b)
dup         duplicate             a -> a a
drop        pop stack top         a ->
swap        swap items          a b -> b a
rot         rotate 3          a b c -> b c a
.           print top & pop
key         read input char & push it
.s          print stack
emit        print char & pop
cr          print newline
cells       times cell width      a -> (a * cell width in bytes)
depth       push stack size   a ... -> a ... (stack size)
bye         quit

VARIABLES/CONSTS:

variable X      creates var named X (X is a word that pushes its addr)
N X !           stores value N to variable X
N X +!          adds value N to variable X
X @             pushes value of variable X to stack
N constant C    creates constant C with value N
C               pushes the value of constant C

SPECIAL:

( )                   comment (inline)
\                     comment (until newline)
." S "                print string S
X if C then           if X, execute C // only in word def.
X if C1 else C2 then  if X, execute C1 else C2 // only in word def.
do C loop             loops from stack top value to the value second
                      from top, special word "i" holds the iteration val.
begin C until         like do/loop but keeps looping as long as top = 0
begin C1 while C2 repeat  like begin/until but keeps looping as long
                      as the flag left on top by C1 is != 0
allot                 allocates memory, can be used for arrays

EXAMPLE PROGRAMS:

100 1 2 + 7 * / . \ computes and prints 100 / ((1 + 2) * 7)
cr ." hey bitch " cr \ prints: hey bitch
: myloop 5 0 do i . loop ; myloop \ prints 0 1 2 3 4

foss

FOSS

FOSS (Free and Open Source Software, sometimes also FLOSS, adding Libre), is a kind of neutral term for software that is both free as in freedom and open source. It's just another term for this kind of software, as if there weren't enough of them :) People normally use this to stay neutral, to appeal to both free and open source camps or if they simply need a short term not requiring much typing.


fqa

Frequently Questioned Answers

TODO: figure out what to write here


fractal

Fractal

Informally speaking fractal is a shape that's geometrically "infinitely complex" while being described in an extremely simple way, e.g. with a very simple formula or algorithm. Shapes found in the nature, such as trees, mountains or clouds, are often fractals. Fractals show self-similarity, i.e. when "zooming" into an ideal fractal we keep seeing it is composed, down to an infinitely small scale, of shapes that are similar to the shape of the whole fractal; e.g. the branches of a tree look like smaller versions of the whole tree etc.

Fractals are the beauty of mathematics, they can impress even complete non-mathematicians and so are probably good as a motivational example in math education.

Fractal is formed by iteratively or recursively (repeatedly) applying its defining rule -- once we repeat the rule infinitely many times, we've got a perfect fractal. In the real world, of course, both in nature and in computing, the rule is just repeated many times, as we can't repeat it literally infinitely. The following is an example of how iteration of a rule creates a simple tree fractal; the rule being: from each branch grow two smaller branches.

                                                    V   V V   V
                                \ /   \ /         V  \ /   \ /  V
               |     |      _|   |     |   |_   >_|   |     |   |_<
            '-.|     |.-'     '-.|     |.-'        '-.|     |.-'
   \   /        \   /             \   /                \   /
    \ /          \ /               \ /                  \ /
     |            |                 |                    |
     |            |                 |                    |
     |            |                 |                    |

iteration 0  iteration 1       iteration 2          iteration 3

Mathematically fractal is a shape whose Hausdorff dimension (the "scaling factor of the shape's mass") is non-integer. For example the Sierpinski triangle can normally be seen as a 1D or 2D shape, but its Hausdorff dimension is approx. 1.585 as if we scale it down twice, it decreases its "weight" three times (it becomes one of the three parts it is composed of); Hausdorff dimension is then calculated as log(3)/log(2) ~= 1.585.
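More generally, if an ideal fractal consists of N copies of itself, each scaled down by a factor of s, its Hausdorff dimension comes out as log(N)/log(s) -- the Sierpinski triangle has N = 3, s = 2, giving the value above, and e.g. the classic Koch curve has N = 4, s = 3, giving log(4)/log(3) ~= 1.26.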

L-systems are one possible way of creating fractals. They describe rules in form of a formal grammar which is used to generate a string of symbols that are subsequently interpreted as drawing commands (e.g. with turtle graphics) that render the fractal. The above shown tree can be described by an L-system. Among similar famous fractals are the Koch snowflake and Sierpinski Triangle.

              /\
             /\/\
            /\  /\
           /\/\/\/\
          /\      /\
         /\/\    /\/\
        /\  /\  /\  /\
       /\/\/\/\/\/\/\/\
       
     Sierpinski Triangle
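To demonstrate just how simple the principle is, below is a minimal sketch (our own illustration, not any canonical implementation) of an L-system in C: it iterates the quadratic Koch curve rule F -> F+F-F-F+F by plain string rewriting; the resulting symbols would then be interpreted e.g. with turtle graphics (F = step forward while drawing, + and - = turn by 90 degrees) to actually draw the fractal:

#include <stdio.h>
#include <string.h>

#define MAX_LEN 1024 // has to be enough for the iterated string

char str[MAX_LEN] = "F"; // the initial string (axiom)
char tmp[MAX_LEN];

// one rewriting iteration: replaces each F by F+F-F-F+F
void iterate(void)
{
  tmp[0] = 0;

  for (int i = 0; str[i] != 0; ++i)
    if (str[i] == 'F')
      strcat(tmp, "F+F-F-F+F");
    else
      strncat(tmp, str + i, 1);

  strcpy(str, tmp);
}

int main(void)
{
  for (int i = 0; i < 3; ++i) // 3 iterations keep the string short
    iterate();

  puts(str);
  return 0;
}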

Fractals don't have to be deterministic, sometimes there can be randomness in the rules which will make the shape not perfectly self-similar (e.g. in the above shown tree fractal we might modify the rule to "from each branch grow 2 or 3 new branches").

Another way of describing fractals is by iterative mathematical formulas that work with points in space. One of the most famous fractals formed this way is the Mandelbrot set. It is the set of complex numbers c such that the sequence z_next = (z_previous)^2 + c, z0 = 0 does not diverge to infinity. The Mandelbrot set can nicely be rendered by assigning each point a color according to the number of iterations after which its sequence diverges; this produces a nice colorful fractal. Julia sets are very similar and there are infinitely many of them (each Julia set is formed like the Mandelbrot set but c is fixed for the specific set and z0 is the tested point in the complex plane).
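Rendering the set this way takes just a few lines; the following is a minimal sketch in C (the view rectangle, iteration count and "color" characters are arbitrary choices of ours) that prints the set as ASCII art:

#include <stdio.h>

int main(void)
{
  for (int y = 0; y < 24; ++y)
  {
    for (int x = 0; x < 60; ++x)
    {
      double cr = -2.0 + 2.6 * x / 60.0, // the point c we are testing
             ci = -1.2 + 2.4 * y / 24.0,
             zr = 0, zi = 0;
      int i = 0;

      while (i < 32 && zr * zr + zi * zi <= 4) // not diverged yet?
      {
        double t = zr * zr - zi * zi + cr; // z = z^2 + c
        zi = 2 * zr * zi + ci;
        zr = t;
        i++;
      }

      putchar(i == 32 ? '#' : (i >= 8 ? '.' : ' ')); // color by iter. count
    }

    putchar('\n');
  }

  return 0;
}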

Fractals can of course also exist in 3 and more dimensions so we can also have e.g. animated 3D fractals etc.

Fractals In Tech

Computers are good for exploring and rendering fractals as they can repeat given rule millions of times in a very short time. Programming fractals is quite easy thanks to their simple rules, yet this can highly impress noobs.

However, as shown by Code Parade (https://yewtu.be/watch?v=Pv26QAOcb6Q), complex fractals could be rendered even before the computer era using just a camera and a projector in a feedback loop (the camera films the projected picture which is fed back into the projector). This is pretty neat, though it seems no one actually did it back then.

3D fractals can be rendered with ray marching and so called distance estimation. This works similarly to classic ray tracing but the rays are traced iteratively: we step along the ray and at each step use an estimate of the distance from the current point to the surface of the fractal; once we are "close enough" (below some specified threshold), we declare a hit and proceed as in normal ray tracing (we can render shadows, apply materials etc.). The distance estimate is done by some clever math.
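The following is a tiny sketch of the idea in C; for brevity the distance estimate here is just that of a plain sphere (rendering an actual fractal would only mean swapping the DE function for the clever math mentioned above) and the rays are simple parallel (orthographic) ones:

#include <math.h>
#include <stdio.h>

// distance estimate: how far at most we can safely step from (x,y,z);
// here simply a unit sphere at the origin, a fractal's DE would go here
double DE(double x, double y, double z)
{
  return sqrt(x * x + y * y + z * z) - 1.0;
}

int main(void)
{
  for (int py = 0; py < 20; ++py)
  {
    for (int px = 0; px < 40; ++px)
    {
      double ox = (px - 20) / 10.0, // orthographic ray, direction +z
             oy = (py - 10) / 10.0,
             t = 0;
      int hit = 0;

      for (int i = 0; i < 64; ++i) // march along the ray
      {
        double d = DE(ox, oy, -2.0 + t);

        if (d < 0.001) // close enough to the surface: declare a hit
        {
          hit = 1;
          break;
        }

        t += d; // step as far as the estimate guarantees is safe
      }

      putchar(hit ? '#' : '.');
    }

    putchar('\n');
  }

  return 0;
}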

Mandelbulber is a free, advanced software for exploring and rendering 3D fractals using the mentioned method.

Marble Marcher is a FOSS game in which the player races a glass ball through levels that are animated 3D fractals. It also uses the distance estimation method implemented as a GPU shader and runs in real time.

Fractals are also immensely useful in procedural generation, they can help generate complex art much faster than human artists, and such art can only take a very small amount of storage.

There also exist such things as fractal antennas and fractal transistors.


frameless

Frameless Rendering

Frameless rendering is a technique of rendering animation by continuously updating an image on the screen by updating single "randomly" selected pixels rather than by showing a quick sequence of discrete frames. This is an alternative to the mainstream double buffered frame-based rendering traditionally used nowadays.

Typically this is done with image order rendering methods, i.e. methods that can immediately and independently compute the final color of any pixel on the screen -- for example with raytracing.

The main advantage of frameless rendering is of course saving a huge amount of memory usually needed for double buffering, and usually also increased performance (fewer pixels need to be computed per second). The animation may also seem more smooth and responsive -- reaction to input is seen faster. Another advantage, and possibly a disadvantage as well, is a motion blur effect that arises as a side effect of updating by individual pixels spread over the screen: some pixels show the scene at a newer time than others, so the previous images kind of blend with the newer ones. This may add realism and also prevent temporal aliasing, but blur may sometimes be undesirable, and also the kind of blur we get is "pixelated" and noisy.

Selecting the pixels to update can be done in many ways, usually with some pseudorandom selection (jittered sampling, Halton sequence, Poisson Disk sampling, ...), but regular patterns may also be used. There have been papers that implemented adaptive frameless rendering that detected where it is best to update pixels to achieve low noise.
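For illustration, here is a minimal sketch in C (the "renderer" here is just a trivial animated pattern standing in for e.g. a ray tracer): instead of drawing whole frames it keeps updating a few randomly picked pixels, so the final picture blends pixels computed at different times:

#include <stdio.h>
#include <stdlib.h>

#define W 32
#define H 16

char screen[H][W];

// computes the "color" (character) of a pixel at given time, stands
// in for a real per-pixel renderer such as a ray tracer
char computePixel(int x, int y, int t)
{
  return ((x + y + t) / 4) % 2 ? '#' : '.';
}

int main(void)
{
  for (int y = 0; y < H; ++y) // initial picture, time 0
    for (int x = 0; x < W; ++x)
      screen[y][x] = computePixel(x, y, 0);

  for (int t = 1; t <= 100; ++t) // "ticks" instead of frames
    for (int i = 0; i < 32; ++i) // update only 32 random pixels per tick
    {
      int x = rand() % W, y = rand() % H;
      screen[y][x] = computePixel(x, y, t);
    }

  for (int y = 0; y < H; ++y) // print the result: a mix of times
  {
    fwrite(screen[y], 1, W, stdout);
    putchar('\n');
  }

  return 0;
}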

Historically similar (though different) techniques were used on computers that didn't have enough memory for a double buffer or redrawing the whole screen each frame was too intensive on the CPU; programmers had to identify which pixels had to be redrawn and only update those. This resulted in techniques like adaptive tile refresh used in scrolling games such as Commander Keen.


framework

Framework

Software framework is a collection of tools such as environments, libraries, compilers and editors, that together allow fast and comfortable implementation of other software by plugging in relatively small pieces of code. While a simple library is something that's plugged as a helper into programmer's code, framework is a bigger system into which programmer plugs his code. Frameworks are generally bloated and harmful, LRS doesn't recommend relying on them.


free_body

Free Body

Free (as in freedom) body, also libre body, is a body of a human who allows (legally and otherwise) everyone some basic rights to it, such as the right to touch any of its part. This is a concept inspired by that of free software and free culture, just applied to one's body.

TODO: waiver like CC0 but for a body?

{ Made my waiver here https://codeberg.org/drummyfish/my_text_data/src/branch/master/body_waiver.txt. ~drummyfish }

See Also


free_culture

Free Culture

Free (as in freedom) culture is a movement aiming for the relaxation of intellectual property restrictions, mainly that of copyright, to allow free usage, reusing and sharing of artworks and other kinds of information. Free culture argues that our society has gone too far in forcefully restricting the natural freedom of information by very strict laws (e.g. by authors holding copyright even 100 years after their death) and that we're hurting art, creativity, education and progress by continuing to strengthen restrictions on using, modifying (remixing) and sharing things like books, music and scientific papers. The word "free" in free culture refers to freedom, not just price -- free cultural works have to be more than just available gratis, they must also give their users some specific legal rights. Nevertheless free culture itself isn't against commercialization of art, it rather argues for doing so by other means than selling legal rights to it. The opposite of free culture is permission culture (culture requiring permission for reuse of intellectual works).

The promoters of free culture want to relax intellectual property laws (copyright, patents, trademarks etc.) but also promote an ethic of sharing and remixing being good (as opposed to the demonizing anti-"piracy" propaganda of today), they sometimes mark their works with words "some rights reserved" or even "no rights reserved", as opposed to the traditional "all rights reserved".

Free culture is kind of a younger sister movement to the free software movement, in fact it has been inspired by it (we could call it its fork). While free software movement, established in 1983, was only concerned with freedoms relating to computer program source code, free culture later (around 2000) took its ideas and extended them to all information including e.g. artworks and scientific data. There are clearly defined criteria for a work to be considered free (as in freedom) work, i.e. part of the body of free cultural works. The criteria are very similar to those of free software (the definition is at https://freedomdefined.org/Definition) and can be summed up as follows:

A free cultural work must allow anyone to (legally and practically):

  1. Use it in any way and for any purpose, even commercially.
  2. Study it.
  3. Share it, i.e. redistribute copies, even commercially.
  4. Modify it and redistribute the modified copies, even commercially.

Some of these conditions may e.g. further require a source code of the work to be made available (e.g. sheet music, to allow studying and modification). Some conditions may however still be imposed, as long as they don't violate the above -- e.g. if a work allows all the above but requires crediting the author, it is still considered free (as in freedom). Copyleft (also share-alike, requirement of keeping the license for derivative works) is another condition that may be required. This means that many (probably most) free culture promoters actually rely and even support the concept of e.g. copyright, they just want to make it much less strict.

It was in 2001 when Lawrence Lessig, an American lawyer who can be seen as the movement's founder, created the Creative Commons, a non-profit organization which stands among the foundations of the movement and is very much connected to it. By this time he was already educating people about the twisted intellectual property laws and had a few followers. Creative Commons would create and publish a set of licenses that anyone could use to release their works under much less restrictive conditions than those that lawfully arise by default. For example if someone creates a song and releases it under the CC-BY license, he allows anyone to freely use, modify and share the song as long as proper attribution is given to him. It has to be noted that NOT all Creative Commons licenses are free culture (those with NC and ND conditions break the above given rules)! It is also possible to use other, non Creative Commons licenses in free culture, as long as the above given criteria are respected.

In 2004 Lessig published his book called Free Culture that summarized the topic as well as proposed solutions -- the book itself is shared under a Creative Commons license and can be downloaded for free (however the license is among the non-free CC licenses so the book itself is not part of free culture lmao, big fail by Lessig).

{ I'd recommend reading the Free Culture book to anyone whose interests lie close to free culture/software, it's definitely one of the essential works. ~drummyfish }

In the book Lessig gives an overview of the history of copyright -- it has been around since about the time of the invention of the printing press to give some publishers exclusive rights (an artificial monopoly) for printing and publishing certain books. The laws evolved but at first were not so restrictive, they only applied to very specific uses (printing) and for limited time, plus the copyright had to be registered. Over time corporations pressured to make it more and more restrictive -- nowadays copyright applies to basically everything and lasts for 70 years AFTER the death of the author (!!!). This is combined with the fact that in the age of computers any use of information requires making a copy (to read something you need to download it), i.e. copyright basically applies to ANY use now. I.e. both scope and term of copyright have been extended to the extreme, and this was done even AGAINST the US constitution -- Lessig himself tried to fight against it in court but lost. This form of copyright now restricts culture and basically only serves corporations who want to e.g. kill the public domain (works that run out of copyright and are now "free for everyone") by repeatedly prolonging the copyright term so that people don't have any pool of free works that would compete (and often win simply by being gratis) with the corporate created "content". In the book Lessig also mentions many harsh punishments for breaking copyright laws and a lot of other examples of corruption of the system. He then goes on to propose solutions, mainly his Creative Commons licenses.

Free culture has become a relative success, the free Creative Commons licenses are now widely used -- Wikipedia is one of the most famous examples of free culture as it is licensed under the CC-BY-SA and its sister project Wikimedia Commons hosts over 80 million free cultural works! Openstreetmap is a free cultural collaborative project offering maps of the whole world, libregamewiki and opengameart are sites focused on creation of free cultural video games and game assets and there are many more. There are famous promoters of free culture such as Nina Paley, there exist webcomics, books, songs etc. In development of libre games free cultural licenses are used (alongside free software licenses) to liberate the game assets -- e.g. the Freedoom project creates free culture content replacement for the game Doom. Many scientists release their data to public domain under CC0. And of course, LRS highly advocates free culture, specifically public domain under CC0.

BEWARE of fake free culture: there are many resources that look like or even call themselves "free culture" despite not adhering to its rules. This may be by intention or not, some people just don't know too much about the topic -- a common mistake is to think that all Creative Commons licenses are free culture -- again, this is NOT the case (the NC and ND ones are not). Some think that "free" just means "gratis" -- this is not the case (free means freedom, i.e. respecting the above mentioned criteria of free cultural works). Many people don't know the rules of copyright and think that they can e.g. create a remix of some non-free pop song and license it under CC-BY-SA -- they CANNOT, they are making a derivative work of a non-free work and so cannot license it. Some people use licenses without knowing what they mean, e.g. many use CC0 and then ask for their work to not be used commercially -- this can't be done, CC0 specifically allows any commercial use. Some try to make their own "licenses" by e.g. stating "do whatever you want with my work" instead of using a proper waiver like CC0 -- this is with high probability legally unsafe and invalid, it is unfortunately not so easy to waive one's copyright -- DO use the existing licenses. Educate yourself and if you're unsure, ask away in the community, people are glad to give advice.

See Also


freedom

Freedom

Only when man loses everything he becomes actually free. Freedom is about letting go.

TODO


free_hardware

Free/Freedom-Friendly Hardware

Free (as in freedom) hardware is a form of ethical hardware aligned with the philosophy of free (as in freedom) software, i.e. having a free licensed design that allows anyone to study, use, modify and share such designs for any purpose and so prevent abuse of users by technology. Let us note the word free refers to user freedom, not price! Sometimes the term may be more broadly and not completely correctly used even for hardware that's just highly compatible with purely free software systems -- let us rather call these freedom friendly hardware -- and sometimes people misunderstand the term free as meaning "gratis hardware"; to avoid misunderstandings GNU recommends using the term free design hardware or libre hardware for free hardware in the strict sense, i.e. hardware with free licensed design. Sometimes -- nowadays maybe even more often -- the term "open source" hardware or open hardware with very similar meaning is encountered, but that is of course a harmful terminology as open source is an inherently harmful capitalist movement ignoring the ethical question of freedom -- hence it is recommended to prefer using the term free hardware. Sometimes the acronym FOSH (free and open source hardware) is used neutrally, similarly to FOSS.

GNU, just like us, highly advocates for free hardware, though, unlike with software, they don't completely reject using non-free hardware nowadays, not just for practical reasons (purely free hardware basically doesn't exist), but also because hardware is fundamentally different from software and it is possible to use some non-free hardware (usually the older one) relatively safely, without sacrificing freedom. The FSF issues so called Respects Your Freedom (RYF) certification for non-malicious hardware products, both free and non-free, that can be used with 100% free software (even though RYF has also been a target of some criticism from free software activists).

We, LRS, advocate for more strict criteria than just a free-licensed hardware design, for example we prefer complete public domain and advocate high simplicity which is a prerequisite of true freedom -- see less retarded hardware for more.

The topic of free hardware is a bit messy, the definition of free hardware is not as straightforward as that of free software because hardware, a physical thing, has some inherently different properties than software and it is also not as easy to design and create so it evolves more slowly than software and it is much more difficult to create hardware completely from the ground up. Now consider the very question "what even is hardware"? There is a grey area between hardware and software, sometimes we see firmware as hardware, sometimes as software, sometimes pure software can be hardwired into a circuit so it basically behaves like hardware etc. Hardware design also has different levels, a higher level design may be free-licensed but its physical implementation may require existing lower level components that are non-free -- does such hardware count as free or not? How far down does freedom go -- do peripherals have to be free? Do the chips have to be free? Do the transistors themselves have to be free? We have to keep these things in mind. While in the software world it is usually quite easy to label a piece of software as free or not (at least legally), with hardware we rather tend to speak of different levels of freedom, at least for now.

Existing Free And Freedom-Friendly Hardware And Firmware

{ I'm not so much into hardware, this may be incomplete or have some huge errors, as always double check and please forgive :) Report any errors you find, also send me suggestions, thanks. ~drummyfish }

TODO, WORK IN PROGRESS, UNDER CONSTRUCTION

The following is a list of hardware whose design is at least to some degree free/open (i.e. for example free designs that however may be using a non-free CPU, this is an issue discussed above):

The following is a list of some "freedom friendly" hardware, i.e. hardware that though partly or fully proprietary is not or can be made non-malicious to the user (has documented behavior, allows fully free software, librebooting, battery replacement, repairs etc.):

The following is a list of firmware, operating systems and software tools that can be used to liberate freedom-friendly proprietary devices:

See Also


free

Free

In our community, as well as in the wider tech and some non-tech communities, the word free is normally used in the sense of free as in freedom, i.e. implying freedom, not price. The word for "free of cost" is gratis (also free as in beer). To prevent this confusion the word libre is sometimes used in place of free, or we say free as in freedom, free as in speech etc.


free_software

Free Software

Not to be confused with open $ource.

Free (as in freedom) software is a type of ethical software that's respecting its users' freedom and preventing their abuse, generally by availability of its source code AND by a license that allows anyone to use, study, modify and share the software without restricting conditions (such as having to pay or get explicit permission from the author). Free software is NOT equal to software whose source code is just available publicly or software that is offered for zero price, the basic legal rights to the software are the key attribute that has to be present. Free software stands opposed to proprietary software -- the kind of abusive, closed software that capitalism produces by default. Free software is not to be confused with freeware ("gratis", software available for free); although free software is always available for free thanks to its definition, zero price is not its goal. The goal is freedom.

Free software is also known as free as in freedom, free as in speech software or libre software. It is sometimes equated with open source, even though open source is fundamentally different (evil), or neutrally labelled FOSS or FLOSS (free/libre and open-source software); sadly free software has lost to open source in mainstream popularity. In contrast to free software, software that is merely gratis (freeware) is sometimes called free as in beer.

Examples of free software include the GNU operating system (also known as "Linux"), GIMP (image editor), Stockfish chess engine, or games such as Xonotic and Anarch. Free software is actually what runs the world, it is a standard among experts and it is possible to do computing with exclusively free software, even though most normal people don't even know the term free software exists because they only ever come in contact with abusive proprietary consumer software such as Windows and capitalist games. There also exists a lot of big and successful software, such as Firefox, Linux (the kernel) or Blender, that's often spoken of as free software which may however be only technically true or true only to a big (but not full) degree: for example even though Linux is 99% free, in its vanilla version it comes with proprietary binary blobs which breaks the rules of free software. Blender is technically free but it is also capitalist software which doesn't really care about freedom and may de-facto limit some freedoms required by free software, even if they are granted legally by Blender's license. Such software is better called "open source" or FOSS because it doesn't meet the high standards of free software. This issue of technically-but-not-really free software is addressed by some newer movements and philosophies such as suckless and our less retarded software, which usually also aim for unbloating technology so as to make it more free in practice.

Though unknown to common people, the invention and adoption of free software has been one of the most important events in the history of computers -- mere technology consumers nowadays don't even realize (and aren't told) that what they're using consists of and has been enabled mostly by software written non-commercially, by volunteers for free, basically on communist principles. Even though consumer technology is unethical, as the underlying free technology has been modified by corporations to abuse the users, the situation would have been incomparably worse had Richard Stallman not achieved the small miracle of establishing the free software movement. Without it there would probably be practically no alternative to abusive technology nowadays, everything would be much more closed, there would probably be no "open source", "open hardware" such as Arduino and things such as Wikipedia. If the danger of intellectual property in software wasn't foreseen and countered by Richard Stallman right in the beginning, the corporations' push of legislation would probably have continued and copyright laws might have been many times worse today, to the point of not even being able to legally write free software nowadays. We have to be very grateful that this happened and continue to support free software.

Richard Stallman, the inventor of the concept and the term "free software", says free software is about ensuring the freedom of computer users, i.e. people truly owning their tools -- he points out that unless people have complete control over their tools, they don't truly own them and will instead become controlled and abused by the makers (true owners) of those tools, which in capitalism are corporations. Richard Stallman stressed that there is no such thing as partially free software -- it takes only a single line of code to take away the user's freedom and therefore if software is to be free, it has to be free as a whole. This is in direct contrast with open source (a term discouraged by Stallman himself) which happily tolerates for example Windows-only programs and accepts them as "open source", even though such a program cannot be run without the underlying proprietary code of the platform. It is therefore important to support free software rather than the business spoiled open source.

Free software is not about privacy! That is a retarded simplification spread by cryptofascists. Free software, as its name suggests, is about freedom in wide sense, which of course may include the freedom to stay anonymous, but there are many more freedoms which free software stands for, e.g. the freedom of customization of one's tools or the general freedom of art -- being able to utilize or remix someone else's creation for creating something new or better. Software focused on privacy is called simply privacy respecting software.

Is free software communism? This is a question often debated by Americans who have a panic phobia of anything resembling ideas of sharing and giving away for free. The answer is: yes and no. No as in it's not Marxism, the kind of evil pseudocommunism that plagued the world not so long ago -- that was a hugely complex, twisted violent ideology encompassing whole society which furthermore betrayed many basic ideas of equality and so on. Compared to this free software is just a simple idea of not applying intellectual property to software, and this idea may well function under some form of early capitalism. But on the other hand yes, free software is communism in its general form that simply states that sharing is good; it is communism as much as e.g. teaching a kid to share toys with its siblings is.

Definition

Free software was originally defined by Richard Stallman for his GNU project. The definition was subsequently adopted and adjusted by other groups such as Debian and so nowadays there isn't just one definition, even though the GNU definition is usually implicitly supposed. However, all of these definitions are very similar and are basically variations and subsets of the original one. The GNU definition of free software is paraphrased as follows:

Software is considered free if all its users have the legal and de facto rights to:

  0. Use the software for any purpose (even commercial or that somehow deemed unethical by someone).
  1. Study the software. For this source code of the program has to be available.
  2. Share the software with anyone.
  3. Modify the software. For this source code of the program has to be available. This modified version can also be shared with anyone.

Note that as free software cares about real freedom, the word "right" here is seen as meaning a de facto right, i.e. NOT just a legal right -- legal rights (a free license) are required but if there appears a non-legal obstacle to those freedoms, free software communities will address them. Again, open source differs here by just focusing on legality.

To make it clear, freedom 0 (use for any purpose) covers ANY use, even commercial use or use deemed unethical by the society of the software creator. Some people try to restrict this freedom, e.g. by prohibiting use for military purposes or prohibiting use by "fascists", which makes the software NOT free anymore. NEVER DO THIS. The reasoning behind freedom 0 is the same as that behind free speech: allowing any use doesn't imply endorsing or supporting any use, it simply means that we refuse to engage in certain kinds of oppression out of principle. Trying to mess with freedom 0 would be similar to e.g. prohibiting science on the grounds that scientific results can be used in unethical ways -- we simply don't do this. We try to prevent unethical behavior in other ways than prohibiting basic rights.

Source code here means the preferred form in which software is modified, i.e. things such as obfuscated source code don't count as true source code.

The developers of the Debian operating system have created their own guidelines (Debian Free Software Guidelines) which respect these points but are worded in more complex terms and further require e.g. non-functional data to be available under free terms as well (source) which GNU doesn't (source). The definition of open source is yet more complex even though in practice legally free software is eventually also open source and vice versa.

History

Free software was invented by Richard Stallman in the 1980s. His free software movement inspired later movements such as the free culture movement and the evil open-source movement.

See Also


free_speech

Free Speech

Freedom of speech means there are no arbitrary punishments (imposed by the government or anyone else) for, or obstacles (such as censorship) to, merely talking about anything, making any public statement or publishing any information. Free speech has to be by definition absolute and have no limit, otherwise it's not free speech but controlled speech -- trying to add exceptions to free speech is like trying to limit to whom a free software license is granted; doing so immediately makes such software non-free. Free speech also comes with zero responsibility exactly by definition, as responsibility implies some forms of punishment; free speech means exactly that one can say anything without fearing any burden of responsibility. Freedom of speech is an essential attribute of a mature society, sadly it hasn't been widely implemented yet and with the SJW cancer the latest trend in society is towards eliminating free speech rather than supporting it (see e.g. political correctness). Speech is being widely censored by extremist groups (e.g. LGBT and corporations, see also cancel culture) and states -- depending on country there exist laws against so called "hate speech", questioning official versions of history (see e.g. Holocaust denial laws present in many EU states), criticizing powerful people (for example it is illegal to criticize or insult that huge inbred dick Thai king), sharing of useful information such as books (copyright censorship) etc. Free speech nowadays is being eliminated by the strategy of creating an exception to free speech, usually called "hate speech", and then classifying any undesired speech under such a label and silencing it.

The basic principle of free speech says that if you don't support freedom of speech which you dislike, you don't support free speech. I.e. speech that you hate does not equal hate speech.

Some idiots (like that xkcd #1357) say that free speech is only about legality, i.e. about what's merely allowed to be said by the law or what speech the law "protects". Of course, this is completely wrong and just reflects this society's obsession with law; true free speech mustn't be limited by anything -- if you're not allowed to say something, it doesn't matter too much what it is that's preventing you, your speech is not free. If for example it is theoretically legal to be politically incorrect and criticize the LGBT gospel but you de-facto can't do it because the LGBT fascist SJWs would cancel you and maybe even physically lynch you, your speech is not free. It is important to realize we mustn't tie free speech to legal definition, i.e. it isn't enough to make speech free only in legal sense -- keep in mind that a good society aims to eliminate law itself. Our goal is to make speech free culturally, i.e. teach people that we should let others speak freely, even those -- and especially those -- who we disagree with.

Free speech extends even to such actions as shouting "fire" in a crowded theatre. In a good society with free speech people don't behave like monkeys, they will not trust a mere shout without having a further proof of there actually being fire and even if they know there is fire, they will not panic as that's a retarded thing to do.

Despite what the propaganda says there is no free speech in our society, the only kind of speech that is allowed is that which either has no effect or which the system desires for its benefit. Illusion of free speech is sustained by letting people speak until they actually start making a change -- once someone's speech leads to e.g. revealing state secrets or historical truths (e.g. about Holocaust, human races or government crimes -- see wikileaks) or to destabilizing economy or state, such speech is labeled "harmful" in some way (hate speech, intellectual property violation, revealing of confidential information, instigating crime, defamation etc.), censored and punished. Even though nowadays pure censorship laws are being passed on a daily basis, even in times when there are seemingly no specific censorship laws and so it seems that "we have free speech" there always exist generic laws that can be fit to any speech, such as those against "inciting violence", "terrorism", "undermining state interests", "hate speech" or any other fancy issue, which can be used to censor absolutely any speech the government pleases, even if such speech has nothing to do with said causes -- it is enough that some state lawyer can find a however unlikely possible indirect link to such cause: this could of course be well seen e.g. in the cases of the Covid flu or the Russia-Ukraine war. Even though there were e.g. no specific laws in European countries against supporting Russia immediately after the war started, governments immediately started censoring and locking up people who supported Russia on the Internet, based on the above mentioned generic laws. These laws work on the same principle as a backdoor in software: they are advocated as a "safety" "feature" and allow complete takeover of the system, but are mostly unused until the right time comes, to give the users a sense of being safe ("I've been using this backdoored CPU for years and nothing happened, so it's safe"); unlike with a software backdoor though, the law backdoor isn't usually removed after it has been exploited, people are just too stupid to notice this and governments can get away with keeping the laws in place, so they do.

See Also


free_universe

Free Universe

Free universe (also "open" universe) is a free culture ("free as in freedom") fictional universe that serves as a basis/platform for creating art works such as stories in forms of books, movies or video games. Such a universe provides a consistent description of a fictional world which may include its history and lore, geography, characters, laws of physics, languages, themes and art directions, and possibly also assets such as concept art, maps, music, even ready-to-use 3D video game models etc. A free universe is essentially the same kind of framework which is provided by proprietary universes such as those of Star Wars or Pokemon, with the exception that free universe is free/"open", i.e. it comes with a free license and so allows anyone to use it in any way without needing explicit permission; i.e. anyone can set own stories in the universe, expand on it, fork it, use its characters etc. (possibly under conditions that don't break the rules of free culture). The best kind of free universe is a completely public domain one which imposes absolutely no conditions on its use. The act of creating fictional universes is called world building.

But if anyone is allowed to do anything with the universe and so possibly incompatible works may be created, then what is canon?! Well, anything you want -- it's the same as with proprietary universes, regardless of official canon there may be different groups of fans that disagree about what is canon and there may be works that contradict someone's canon, there is no issue here.

Existing free universes: existence of a serious project aiming for the creation of a free universe as its main goal is unknown to us, however free universes may be spawned as a byproduct of other free works -- for example old public domain books of fiction, such as Flatland, or libre games such as FLARE, Anarch or FreeDink create a free universe. The MMORPG game Ryzom releases its lore and content under a free license, spawning a huge, rich and high quality fantasy universe (though the game's server isn't libre and hence the game as such isn't altogether free). Libre comics such as Pepper and Carrot (https://www.peppercarrot.com/en/wiki), Wuffle Comics (https://web.archive.org/web/20200117033753/http://www.wufflecomics.com/) and Phil from GCHQ (https://phillfromgchq.co.uk/) also give life to their own free universes. If you want to start a free universe project, go for it, it would be highly valued!

See Also


free_will

Free Will

You can do what you want, but you can't want what you want.

Free will is a logically erroneous egocentric belief that humans (and possibly other living beings) are special in the universe by possessing some kind of soul which may disobey laws of physics and somehow make spontaneous, unpredictable decisions according to its "independent" desires. Actually that's the definition of absolute indeterminate free will; weaker definitions are also possible, e.g. volitional free will, which means just that one's actions are determined internally; for the purposes of law, definitions based on one's sanity may also be made. But here we'll focus on the philosophical definition as that's what most autism revolves around. The Internet (and even academic) debates of free will are notoriously retarded to unbelievable levels, similarly to e.g. debates of consciousness.

{ Sabine nicely explains it here https://yewtu.be/watch?v=zpU_e3jh_FY. ~drummyfish }

Free will is usually discussed in relation to determinism, an idea of everything (including human thought and behavior) being completely predetermined from the start of the universe. Determinism is the most natural and most likely explanation for the working of our universe; it states that laws of nature dictate precisely which state will follow from the current state and therefore everything that will ever happen is only determined by the initial conditions (start of the universe). As human brain is just matter like any other, it is no exception to the laws of nature. Determinism doesn't imply we'll be able to make precise predictions (see e.g. chaos or undecidability), just that everything is basically already set in stone as a kind of unavoidable fate. Basically the only other possible option is that there would be some kind of true randomness, i.e. that laws of nature don't specify an exact state to follow from the current state but rather multiple states out of which one is "taken" at random -- this is proposed by some quantum physicists as quantum physics seems to be showing the existence of inherent randomness. Nevertheless quantum physics may still be deterministic, see the theory of hidden variables and superdeterminism (no, Bell test didn't disprove determinism). But EVEN IF the universe is non-deterministic, free will still CANNOT exist. Therefore this whole debate is meaningless.

Why is there no free will? Because it isn't logically possible, just like e.g. the famous omnipotent God (could he make a toast so hot he wouldn't be able to eat it?). Either the universe is deterministic and your decisions are already predetermined, or there exists an inherent randomness and your decisions are determined by a mere dice roll (which no one can call a free will more than just making every decision in life based on a coin toss). In either case your decisions are made for you by something "external". Even if you follow a basic definition of free will as "acting according to one's desires", you find that your decisions are DETERMINED by your desires, i.e. something you did not choose (your desires) makes decisions for you. There is no way out of this unless you reject logic itself.

For some reason retards (basically everyone) don't want to accept this, as if accepting it changed anything, stupid capitalists think that it would somehow belittle their "achievements" or what? Basically just like the people who used to let go of geocentrism. This is ridiculous, they hold on to the idea of their "PRECIOOOOUUUSS FREE WILL" to the death, then they go and consume whatever a TV tells them to consume. Indeed one of the most retarded things in the universe.


fsf

FSF

FSF stands for Free Software Foundation, a non-profit organization established by Richard Stallman with the goal of promoting and supporting free as in freedom software, software that respects its users' freedom.

History

TODO

In September 2019 Richard Stallman, the founder and president of the FSF, was cyberbullied and cancelled by SJW fascists for simply stating a rational but unpopular opinion on child sexuality and was forced to resign as a president. This might have been the last nail in the coffin for the FSF. The new president would come to be Geoffrey Knauth, an idiot who spent his life writing proprietary software in such shit as C# and helped build military software for killing people (just read his cv online). What's next, a porn actor becoming the next Pope? Would be less surprising.

After this the FSF definitely died.


function

Function

Function is a very basic term in mathematics and programming with slightly different meanings in each: a mathematical function maps numbers to other numbers, a function in programming is a subprogram into which we divide a bigger program. Well, that's pretty simplified but those are the basic ideas. A more detailed explanation will follow.

Mathematical Functions

In mathematics functions can be defined and viewed from different angles, but it is essentially anything that assigns each member of some set A (so called domain) exactly one member of a potentially different set B (so called codomain). A typical example of a function is an equation that from one "input number" computes another number, for example:

f(x) = x / 2

Here we call the function f and say it takes one parameter (the "input number") called x. The "output number" is defined by the right side of the equation, x / 2, i.e. the number output by the function will be half of the parameter (x). The domain of this function (the set of all possible numbers that can be taken as input) is the set of real numbers and the codomain is also the set of real numbers. This equation assigns each real number x another real number x / 2, therefore it is a function.

{ I always imagined functions as kind of little boxes into which we throw a number and another number falls out. ~drummyfish }

Now consider a function f2(x) = 1 - 1 / x. Note that in this case the domain is the set of real numbers minus zero; the function can't take zero as an input because we can't divide by zero. The codomain is the set of real numbers minus one because we can't ever get one as a result.

Another common example of a function is the sine function that we write as sin(x). It can be defined in several ways, commonly e.g. as follows: considering a right triangle with one of its angles equal to x radians, sin(x) is equal to the ratio of the side opposing this angle to the triangle hypotenuse. For example sin(pi / 4) = sin(45 degrees) = 1 / sqrt(2) ~= 0.71. The domain of the sine function is again the set of real numbers but its codomain is only the set of real numbers between -1 and 1 because the ratio of said triangle sides can never be negative or greater than 1, i.e. the sine function will never yield a number outside the interval <-1,1>.

Note that these functions have to satisfy a few conditions to really be functions. Firstly each number from the domain must be assigned exactly one number (although this can be "cheated" by e.g. using a set of couples as a codomain), even though multiple input numbers can give the same result number. Also importantly the function result must only depend on the function's parameter, i.e. the function mustn't have any memory or inside state and it mustn't depend on any external factors (such as current time) or use any randomness (such as a dice roll) in its calculation. For a certain argument (input number) a function must give the same result every time. For this reason not everything that transforms numbers to other numbers can be considered a function.

Functions can have multiple parameters, for example:

g(x,y) = (x + y) / 2

The function g computes the average of its two parameters, x and y. Formally we can see this as a function that maps elements from a set of couples of real numbers to the set of real numbers.

Of course functions may also work with just whole numbers, or with complex numbers, quaternions and theoretically just about anything crazy like e.g. the set of animals :) However in these "weird" cases we generally no longer use the word function but rather something like a map. In mathematical terminology we may hear things such as a real function of a complex parameter which means a function that takes a complex number as an input and gives a real number result.

To get better overview of a certain function we may try to represent it graphically, most commonly we make function plots also called graphs. For a function of a single parameter we draw graphs onto a grid where the horizontal axis represents number line of the parameter (input) and the vertical axis represents the result. For example plotting a function f(x) = ((x - 1) / 4)^2 + 0.8 may look like this:


         |f(x)      
        2+     
'.._     |          
    ''--1+.____...--'
___,__,__|__,__,_____x
  -2 -1  |0 1  2
       -1+
         |
       -2+
         |


This is of course done by plotting various points [x,f(x)] and connecting them by a line.

Plotting functions of multiple parameters is more difficult because we need more axes and get to higher dimensions. For functions of 2 parameters we can draw e.g. a heightmap or create a 3D model of the surface which the function defines. 3D functions may in theory be displayed like 2D functions with added time dimension (animated) or as 3D density clouds. For higher dimensions we usually resort to some kind of cross-section or projection to lower dimensions.

Functions can have certain properties such as being injective (different inputs always give different results), surjective (every value of the codomain can be obtained as a result), continuous, even or odd, monotonic, periodic and so on.

In context of functions we may encounter the term composition which simply means chaining the functions. E.g. the composition of functions f(x) and g(x) is written as (f o g)(x) which is the same as f(g(x)).
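For example for f(x) = x / 2 and g(x) = x + 1 we get (f o g)(x) = f(g(x)) = (x + 1) / 2, while (g o f)(x) = g(f(x)) = x / 2 + 1, which also shows that composition is generally not commutative.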

Calculus is an important mathematical field that studies changes of continuous functions. It can tell us how quickly functions grow, where they have maximum and minimum values, what's the area under the line in their plot and many other things.

Notable Mathematical Functions

Functions commonly used in mathematics range from the trivial ones (such as the constant functions, f(x) = constant) to things like trigonometric functions (sine, cosine, tangent, ...), factorial, logarithm, logistic sigmoid function, Gaussian function etc. Furthermore some more complex and/or interesting functions are (the term function may be applied liberally here):

Programming Functions

In programming the definition of a function is less strict, even though some languages, namely functional ones, are built around purely mathematical functions -- for distinction we call these strictly mathematical functions pure. In traditional languages functions may or may not be pure, a function here normally means a subprogram which can take parameters and return a value, just as a mathematical function, but it can further break some of the rules of mathematical functions -- for example it may have so called side effects, i.e. performing additional actions besides just returning a number (such as modifying data in memory which can be read by others, printing something to the screen etc.), or use randomness and internal states, i.e. potentially returning different numbers when invoked (called) multiple times with exactly the same arguments. These functions are called impure; in programming a function without an adjective is implicitly expected to be impure. Thanks to allowing side effects these functions don't have to actually return any value, their purpose may be to just invoke some behavior such as writing something to the screen, initializing some hardware etc. The following piece of code demonstrates this in C:

int max(int a, int b, int c) // pure function
{
  return (a > b) ? (a > c ? a : c) : (b > c ? b : c);
}

unsigned int lastPseudorandomValue = 0;

unsigned int pseudoRandom(unsigned int maxValue) // impure function
{
  lastPseudorandomValue = // side effect: working with global variable
    lastPseudorandomValue * 7907 + 7;

  return (lastPseudorandomValue >> 2) % (maxValue + 1);
}

In older languages functions were also called procedures or routines. Sometimes there was some distinction between them, e.g. in Pascal functions returned a value while procedures didn't.


fun

Fun

See also lmao.

Fun is a rewarding lighthearted satisfying feeling you get as a result of doing or witnessing something playful.

Things That Are Fun


furry

Furry

UwU

Furriness is a weird mental disorder (dolphi will forgive :D) and fetish that makes people dig and/or identify as human-like furry animals, e.g. cats, foxes or completely made up species. To a big degree it's a sexual identity but these people just pretend they're animals everywhere, they have furry conventions, you see their weird furry talk in issue trackers on programming websites etc. You cannot NOT meet a furry on the Internet.

In the past we might have been wondering whether by 2020 we'd already have cured cancer, whether we'd have cities on Mars and flying cars. Well no, but you can sexually identify as a fox now.

Furries seem to have a very harmful obsession with copyrighting their art, many create their own "fursonas" or "species" and then prohibit others from using them. Stay away.

TODO: links to those furry porn social media etc.

See Also


future_proof

Future-Proof Technology

Future-proof technology is technology that is very likely to stay functional for a very long time with minimal to no maintenance, even considering significant changes in the state of technology in society. In a world of relatively complex technology, such as that of computers, this feature is generally pretty hard to achieve; today's consumerist society makes the situation much worse by focusing on immediate profit without long-term planning and by implementing things such as bloat, intentional introduction of complexity, obscurity, dependencies and planned obsolescence. But with a good approach, such as that of LRS, it is very much possible to achieve.

Truly good technology strives to be future-proof, because this saves us the great cost of maintenance and of reinventing wheels, and it gives its users comfort and safety: users of future-proof technology know they can build upon it without fearing it will suddenly break.

Despite the extremely bad situation not all hope is lost. At least in the world of software future-proofing can be achieved by:

See Also


game_engine

Game Engine

Game engine is software, usually a framework or a library, that serves as a base code for games. Such an engine may be seen as a platform allowing portability and offering preprogrammed functionality often needed in games (3D rendering, physics engine, I/O, networking, AI, audio, scripting, ...) as well as tools used in game development (level editor, shader editor, 3D editor, ...).

A game engine differs from a general multimedia engine/library, such as SDL, by its specific focus on games. It is also different from generic rendering engines such as 3D engines like OpenSceneGraph, because games require more than just rendering (audio, AI, physics, ...). While one may use some general purpose technology such as C or SDL for creating a game, using a game engine should make the process easier. However, beware of the bloat that plagues most mainstream game engines. LRS advises against the use of any frameworks, so at worst use a game library. Many game programmers such as Jonathan Blow advocate and practice writing one's own engine for one's games.

Existing Engines

The following are some notable game engines.


game

Game

Most generally game is a form of play which is restricted by certain rules, the goal of which is mostly fun, challenge and/or competition. A game may have various combinations of mental elements (e.g. competitive mental calculations, ...) and physical elements (based in real life, e.g. football, marble racing, ...); nowadays very popular games are computer games, or video games (also gaymes or vidya, e.g. Anarch, minesweeper, Doom, ...), which are played with the help of a computer. Game is also a mathematical term in game theory which studies games and competition rigorously.

A fun take on the very concept of a game is Nomic, a game in which changing the game rules is part of the game. It leads to all kinds of mindfucks.

Computer Games

Computer game is software whose main purpose is to be played and interactively entertain the user. Sadly most computer games are proprietary and toxic.

Among suckless software proponents there is a disagreement about whether games are legit software or just a meme and a harmful kind of entertainment. The proponents of the latter argue something along the lines that technology is only for getting work done, that games are for losers, that they hurt MUH PRODUCTIVITY, are an unhealthy addiction, wasted time and effort etc. Those who like games see them as a legitimate form of relaxation, a form of art that's pleasant both to make and to enjoy as a finished piece, and also a way of advancing technology (note we are NOT talking about consumerist games here; any consumerist art is bad). Developing games has historically led to improvements of other kinds of software, e.g. 3D rendering, physics simulation and virtual reality. If games are done well, in a non-capitalist way, then we, LRS, fully accept games as legitimate software; of course as long as their purpose is to help all people, i.e. while we don't reject games as such, we reject most games the industry produces nowadays. We further argue that in games it is acceptable to do what in real life is unethical (even to characters controlled by other live players) and that this is in fact one of their greatest aspects, as they allow satisfying natural needs that were crucial in the jungle but became harmful in advanced society, such as those for competition, violence, fascism, egoistic behavior and others -- provided the player can tell the difference between a game and real life, of course. As such, games help us build a better society in which people can satisfy even harmful needs without doing harm; in a game it is acceptable to torture people, roleplay as a capitalist or even verbally bully other players in chat (who joined the server willingly, knowing it is just a simulation, a roleplay), even though these things would be unacceptable to do in real life.

Despite arguments about the usefulness of games, most people agree on one thing: that the mainstream AAA games produced by big corporations are harmful, bloated, toxic, badly made and designed to be highly malicious, consumerist products. They are one of the worst cases of capitalist software. Such games are never going to be considered good from our perspective (and even the mainstream is turning towards classifying modern games as shit).

PC games are mostly made for and played on MS Windows which is still the "gaming OS", even though in recent years we've seen a boom of "Linux gaming", possibly thanks to Windows getting shittier and shittier every year. Many normies nowadays are practicing "mobile" or console gayming which may be even worse. However, most games, even when played on GNU/Linux, are still proprietary, capitalist and bloated as hell.

We might call this the great tragedy of games: the industry has become similar to the industry of drug abuse. Games feel great and can become very addictive, especially to people not aware of the dangers (children). Today not playing latest games makes you left out socially, out of the loop, a weirdo. Therefore contrary to the original purpose of a game -- that of making life better and bringing joy -- an individual "on games" from the capitalist industry will crave to constantly consume more and more "experiences" that get progressively more expensive to satisfy. This situation is purposefully engineered by the big game producers who exploit psychological and sociological phenomena to enslave gamers and make them addicted. Games become more and more predatory and abusive and of course, there are no moral limits for corporations of how far they can go: games with microthefts and lootboxes, for example, are similar to gambling, and are often targeted at very young children. The game industry cooperates with the hardware and software industry to together produce a consumerist hell in which one is required to constantly update his hardware and software and to keep spending money just to stay in. The gaming addiction is so strong that even the FOSS people somehow create a mental exception for games and somehow do not mind e.g. proprietary games even though they otherwise reject proprietary software. Even most of the developers of free software games can't mentally separate themselves from the concepts set in place by capitalist games, they try to subconsciously mimic the toxic attributes of such games (bloat, unreasonably realistic graphics and hardware demands, content consumerism, cheating "protection", language filters, ...).

Therefore it is crucial to stress that games are technology like any other, they can be exploiting and abusive, and so indeed all the high standards we hold for other technology we must also hold for games. Too many people judge games solely by their gameplay. For us at LRS gameplay is but one attribute, and not even the one standing at the top; factors such as software freedom, cultural freedom, sucklessness, good internal design and being future proof are even more important.

A small number of games nowadays come with a free engine, which is either official (often retroactively freed by its developer in case of older games) or developed by volunteers. Examples of the former are the engines of ID games (Doom, Quake); examples of the latter are OpenMW (a free engine for TES: Morrowind) and Mangos (a free server for World of Warcraft). Console emulators (such as of Playstation or Gameboy) can also be considered a free engine for playing proprietary games.

A yet smaller number of games are completely free (in the sense of Debian's free software definition), including both the engine and game assets. These games are called free games or libre games and many of them are clones of famous proprietary games. Examples of these probably (one can rarely ever be sure about legal status) include SuperTuxKart, Minetest, Xonotic, FLARE or Anarch. There exists a wiki for libre games at https://libregamewiki.org and a developer forum at https://forum.freegamedev.net/. Libre games can also be found in Debian software repositories. However WATCH OUT, all mentioned repositories may be unreliable!

{ NOTE: Do not blindly trust libregamewiki and freegamedev forum, non-free games occasionally DO appear there by accident, negligence or even by intention. I've actually found that most of the big games like SuperTuxKart have some licensing issues (they removed one proprietary mascot from STK after my report). Ryzom has been removed after I brought up the fact that the whole server content is proprietary and secret. So if you're a purist, focus on the simpler games and confirm their freeness yourself. Anyway, LGW is a good place to start looking for libre games. It is much easier to be sure about freedom of suckless/LRS games, e.g. Anarch is legally safe practically with 100% certainty. ~drummyfish }

Some games are pretty based as they don't even require a GUI and are only played in the text shell (either using TUI or purely textual I/O) -- these are called TTY games or command line games. This kind of game may be particularly interesting to minimalists, hobbyists and developers with a low (zero) budget, little spare time and/or no artistic skills. Roguelike games are especially popular here; for some there even exist GUI frontends, which is pretty neat -- it demonstrates how the Unix philosophy can be applied to games.

Another kind of cool games are computer implementations of pre-computer games, for example chess, backgammon, go or various card games. Such games are very often well tested and fine-tuned gameplay-wise, popular with active communities and therefore fun, yet simple to program with many existing free implementations and good AIs (e.g. GNU chess, GNU go or Stockfish).

Games As LRS

Games can be suckless and just as any other software should try to adhere to the Unix philosophy. A LRS game should follow all the principles that apply to any other kind of such software, for example being completely public domain or aiming for high portability. This is important to mention because, sadly, many people see games as some kind of exception among software and think that different technological or moral rules apply -- this is wrong.

If you want to make a simple LRS game, there is an official LRS C library for this: SAF.

A LRS game will be similar to any other suckless program, one example of a design choice it should take is the following: while mainstream games are built around the idea of having a precompiled engine that runs scripts written in some interpreted language, a LRS/suckless game wouldn't use run-time scripts but would rather have such "scripts" written as a part of the whole game's source code (e.g. in a file scripts.h), in the same language as the engine (typically C) and they would be compiled into the binary program. This is the same principle by which suckless programs such as dwm don't use config files but rather have the configuration be part of the source code (in a file config.h). Doing this in a suckless program doesn't really have any disadvantages as such a program is extremely easy and fast to recompile, and it brings in many advantages such as only using a single language, reducing complexity by not needing any interpreter, not having to open and read script files from the file system and also being faster.
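
For illustration, such a compiled-in "script" file might look something like the following sketch (all names here are made up for the example, this is not code of any real engine):

/* scripts.h: "scripts" compiled right into the game, analogous to config.h */

void scriptOpenDoor(void)   { /* move the door, play a sound, ... */ }
void scriptSpawnEnemy(void) { /* add an enemy to the world, ... */ }

struct // maps level trigger IDs to script functions
{
  int triggerID;
  void (*run)(void);
} scripts[] =
{
  {0, scriptOpenDoor},
  {1, scriptSpawnEnemy}
};

The engine then simply calls scripts[i].run() when trigger i fires; changing the game's behavior means editing this file and recompiling, which with a suckless code base takes a second.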

Compared to mainstream games, a LRS game shouldn't be a consumerist product, it should be a tool to help people entertain themselves and relieve their stress. From the user perspective, the game should be focused on the fun and relaxation aspect rather than impressive visuals (i.e. photorealism etc.), i.e. it will likely utilize simple graphics and audio. Another aspect of an LRS game is that the technological part is just as important as how the game behaves on the outside (unlike mainstream games that have ugly, badly designed internals and mostly focus on rapid development and impressing the consumer with visuals).

The paradigm of LRS gamedev differs from the mainstream gamedev just as the Unix philosophy differs from the Window philosophy. While a mainstream game is a monolithic piece of software, designed to allow at best some simple, controlled and limited user modifications, a LRS game is designed with forking, wild hacking, unpredictable abuse and code reuse in mind.

Let's take an example. A LRS game of a real-time 3D RPG genre may for example consist of several independent modules: the RPG library, the game code, the content and the frontend. Yes, a mainstream game will consist of similar modules, however those modules will probably only exist for the internal organization of work and better testing, they won't be intended for real reuse or wild hacking. With the LRS RPG game it is implicitly assumed that someone else may take the 3D game and make it into a purely non-real-time command line game just by replacing the frontend, in which case the rest of the code shouldn't be burdened by anything 3D-rendering related. The paradigm here should be similar to that existing in the world of computer chess where there exist separate engines, graphical frontends, communication protocols, formats, software for running engine tournaments, analyzing games etc. Roguelikes and the world of quake engines show some of this modularity, though not in such a degree we would like to see -- LRS game modules may be completely separate projects and different processes communicating via text interfaces through pipes, just as basic Unix tools do. We have to think about someone possibly taking our singleplayer RPG and make it into an MMORPG. Someone may even take the game and use it as a research tool for machine learning or as a VFX tool for making movies, and the game should be designed so as to make this as easy as possible -- the user interface should be very simple to be replaced by an API for computers. The game should allow easy creation of tool assisted speedruns, to record demos, to allow scripting (i.e. manipulation by external programs, traditional in-game interpreted scripting may be absent, as mentioned previously), modifying ingame variables, even creating cheats etc. And, importantly, the game content is a module as well, i.e. the whole RPG world, its lore and storyline is something that can be modified, forked, remixed, and the game creator should bear this in mind (see also free universe).

Of course, LRS games must NOT contain such shit as "anti-cheating technology", DRM etc. For our stance on cheating, see the article about it.

Types Of Games

Besides dividing games as any other software (free vs proprietary, suckless vs bloat, ...) we may further divide them by the following:

Legal Matters

Thankfully gameplay mechanisms cannot (yet) be copyrighted (however some can sadly be patented), so we can mostly happily clone proprietary games and so free them. However this must be done carefully as there is a possibility of stepping on other mines, for example violating a trade dress (looking too similar visually) or a trademark (for example you cannot use the word tetris as it's owned by some shitty company) and also said patents (for example the concept of minigames on loading screens used to be patented).

Trademarks have been known to cause problems in the realm of libre games, for example in the case of Nexuiz which had to rename to Xonotic after its original creator trademarked the name and started to make trouble.

Advice on cloning games: copy only the gameplay mechanics, otherwise make it original and very different from the cloned game, or else you're treading fine legal lines. See this as an opportunity to add something new, something that's yours, and potentially to apply and exploit minimalism, i.e. if you're going to clone Doom, do not make a game about shooting demons from hell that's called Gnoom -- just take the gameplay and do something new, e.g. why not try to make it a mix of sci-fi and fantasy with procedurally generated levels, which will additionally save you a lot of time on level design?

Some Nice Gaymes

Anarch and microTD are examples of games trying to strictly follow the less retarded principles. SAF is a less retarded game library/fantasy console which comes with some less retarded games such as microTD.

Chess is pretty nice, if we can count it as a computer game.

{ I recommend checking out Xonotic, it's completely libre and one of the best games I've ever played, though it's bloat. ~drummyfish }

Non-Computer Games

TODO: board games (table comparing them), ...

See Also


gay

Gay

Homosexuality is a sexual orientation and disorder (of course, not necessarily a BAD one) which makes individuals sexually attracted primarily to the same sex, i.e. males to males and females to females. A homosexual individual is called gay (stylized as gaaaaaaaaay), homo or even faggot, females are called lesbians. About 4% of people suffer from homosexuality. The opposite of homosexuality, i.e. the normal, natural sexual orientation primarily towards the opposite sex, is called heterosexuality or being straight. Homosexuality is not to be confused with bisexuality -- here we are talking about pure homosexuality, i.e. a greatly prevailing attraction mostly towards the same sex.

For an unenlightened reader coming from the brainwashland: this article is not "offensive", it is just telling uncensored truth. Be reminded that LRS is not advocating any discrimination, on the contrary we advocate absolute social equality and love of all living beings. Your indoctrination has made you equate political incorrectness with oppression and hate; to see the truth, you have to unlearn this -- see for example our FAQ.

Unlike e.g. pedophilia and probably also bisexuality, pure homosexuality is NOT normal, it could be called a disorder (WE REPEAT, we love all people, even those with a disorder) -- of course the meaning of the word disorder is highly debatable, but pure homosexuality is firstly pretty rare, and secondly from the nature's point of view gay people wouldn't naturally reproduce, their condition is therefore equivalent to any other kind of sterility, which we most definitely would call a defect -- not necessarily a defect harmful to society (there are enough people already), but nonetheless a defect from biological point of view. Is it okay to have a defect? Of course it is. In this case society may even benefit because gayness prevents overpopulation.

You can usually tell someone's gay from appearance and/or his body language. Gay people are more inclined towards art and other sex's activities, for example gay guys are often hair dressers or even ballet dancers.

There is a terrorist fascist organization called LGBT aiming to make gay people superior to others, but more importantly to gain political power -- e.g. the power over language.

Is being gay a choice? Even though homosexuality is largely genetically determined, it may also be to some extent a choice, sometimes a choice that's not of the individual in question, a choice made at young age and irreversible at older age. Most people are actually bisexual to a considerable degree, with a preference of certain sex. When horny, you'd fuck pretty much anything. Still there is a certain probability in each individual of choosing one or the other sex for a sexual/life partner. However culture and social pressure can push these probabilities in either way. If a child grows up in a major influence of YouTubers and other celebrities that openly are gay, or promote gayness as something extremely cool and fashionable, you see ads with gays and if all your role models are gay and your culture constantly paints being homosexual as being more interesting and somehow "brave" and if the competition of sexes fueled e.g. by the feminist propaganda paints the opposite sex as literal Hitler, the child has a greater probability of (maybe involuntarily) choosing the gay side of his sexual personality.

{ I even observed this effect on myself a bit. I've always been completely straight, perhaps mildly bisexual when very horny. Without going into detail, after spending some time in a specific group of people, I found my sexual preference and what I found "hot" shifting towards the preference prevailing in that group. Take from that whatever you will. ~drummyfish }

Of course, we have nothing against gay people as we don't have anything against people with any other disorder -- we love all people equally. But we do have an issue with any kind of terrorist organization, so while we are okay with homosexuals, we are not okay with LGBT.

Are you gay? How can you tell? In doing so you should actually NOT be guided by your sexual desires -- as has been said, most people are bisexual and in sex it many times holds that what disgusts you normally turns you on when you're horny, i.e. if you're a guy and would enjoy sucking a dick, you're not necessarily gay, it may be pure curiosity or just the desire of "forbidden fruit"; this is quite normal. Whether you're gay is probably determined by what kind of LIFE partner you'd choose, i.e. what sex you can fall in a ROMANTIC relationship with. If you're a guy and fall in love with another guy -- i.e. you're passionate just about being with that guy (even in case you couldn't have sex with him) -- you're probably gay. (Of course this holds the other way around too: if you're a guy and enjoy playing with tits, you may not necessarily be straight.)


gaywashing

Gaywashing

TODO


geek

Geek

Geek is a wannabe nerd, it's someone who wants to identify with being smart rather than actually being smart. Geeks are basically what used to be called a smartass in the old days -- overly confident conformists occupying mount stupid who think soyence is actual science, they watch shows like Rick and Morty and Big Bang Theory, they browse Rational Wiki and reddit -- especially r/atheism, and they make appearances on r/iamverysmart -- they wear T-shirts with cheap references to 101 programming concepts and uncontrollably laugh at any reference to number 42, they think they're computer experts because they know the word Linux, managed to install Ubuntu or drag and drop programmed a "game" in Godot. Geeks don't really have their own opinions, they just adopt opinions presented on 9gag, they are extremely weak and don't have extreme views. They usually live the normal conformist life, they have friends, normal day job, wife and kids, but they like to say they "never fit in" -- a true nerd is living in a basement and doesn't meet any real life people, he lives on the edge of suicide and doesn't nearly complain as much as the "geek".


gemini

Gemini

Gemini is a shitty pseudominimalist network protocol for publishing, browsing and downloading files, a simpler alternative to the World Wide Web and a more complex alternative to gopher (by which it was inspired). It is a part of so called Smol Internet. Gemini aims to be a "modern take on gopher", adding some new "features" and a bit of bloat. The project states it wants to be something in the middle between Web and gopher but doesn't want to replace either.

On one hand Gemini is kind of cool but on the other hand it's pretty shit, especially by REQUIRING the use of TLS encryption for "muh security", because the project was made by privacy freaks that advocate the ENCRYPT ABSOLUTELY EVERYTHIIIIIING philosophy. This is firstly mostly unnecessary (it's not like you do Internet banking over Gemini) and secondly adds a shitton of bloat and prevents simple implementations of clients and servers. Some members of the community called for creating a non-encrypted Gemini version, but that would basically be just gopher. Not even the Web goes as far as REQUIRING encryption, so it may be better and easier to just create a simple web 1.0 website rather than a Gemini capsule. And if you want ultra simplicity, we highly advocate instead using gopher, which doesn't suffer from the mentioned issue.


gender_studies

Gender Studies

what the actual fuck


gigachad

Gigachad

Gigachad is like chad, only more so. He has an ideal physique and makes women orgasm merely by looking at them.


girl

Girl

See femoid.


githopping

Githopping

Githopping is a disease similar to distrohopping but applied to git hosting websites. The disease became an epidemic after Micro$oft's takeover of GitHub, when people started protest-migrating to GitLab; however GitLab became shit as well, so people started hopping to other services like Codeberg etc., and now they are addicted to just copying their code from one site to another instead of doing actual programming.

Cure: free yourself of any git hosting, don't centralize your repos on one hosting, use multiple git hostings as mirrors for your code, i.e. add multiple push remotes to your local git and with every push update your repos all over the internet. Just spray the internet with your code and let it sink in, let it be captured in caches and archive sites and let it be preserved. DO NOT tie yourself to any specific git hosting by using any non-git features such as issue trackers or specialized CLI tools such as github cli. DO NOT use git hosting sites as a social network, just stop attention whoring for stars and likes, leave this kind of shit to tiktokers.


git

Git

Git (a name without any actual meaning) is a FOSS (GPL) version control system (a system for maintaining and collaboratively developing source code of programs), currently the most mainstream-popular one (surveys say over 90% of developers use it over other systems). Git is basically a command line program allowing one to submit and track changes of text files (usually source code in some programming language), offering the possibility to switch between versions and branches, detect and resolve conflicts in collaborative editing etc.

Git was created by Linus Torvalds in 2005 to host the development of Linux and to improve on systems such as svn. Since then it has become extremely popular, mainly thanks to GitHub, a website that offered hosting of git projects along with "social network" features for the developers; after this similar hosting sites such as GitLab and Codeberg appeared, skyrocketing the popularity of git even more.

It is generally considered quite good software, many praise its distributed nature, ability to work offline etc.; however diehard software idealists still criticize its mildly bloated nature and its usage complexity -- it is non-trivial to learn to work with git and many errors are infamously resolved in a "trial/error + google" style, so some still try to improve on it by creating new systems.

Is git literally hitler? Well, by suckless standards git IS bloated and yes, git IS complicated as fuck, however let's try to go deeper and ask the important questions, namely "does this matter so much?" and "should I use git or avoid it like the devil?". The answer is actually this: it doesn't matter too much that git is bloated and you don't have to avoid using it. Why? Well, git is basically just a way of hosting, spreading and mirroring your source onto many git-hosting servers (i.e. you can't avoid using git if you want to spread your code to e.g. Codeberg and GitLab) AND at the same time git doesn't create a dependency for your project, i.e. its shittiness doesn't "infect" your project -- if git dies or if you simply want to start using something else, you just copy-paste your source code elsewhere, you put it on FTP or anything else, no problem. It's similar to how e.g. Anarch uses SDL (which is bloated as hell) to run on specific platforms -- if it doesn't hard-depend on SDL and doesn't get tied to it, it's OK (and actually unavoidable) to make use of it. You also don't even have to get into the complicated stuff about git (like merging branches and resolving conflicts) when you're simply committing to a simple one-man project.

Which git hosting to use? All of them (except for GitHub which is a proprietary terrorist site)! Do not fall into the trap of githopping, just make tons of accounts, one on each git hosting site, add multiple push remotes and just keep pushing to all of them -- EZ. Remember, git hosting sites are just free file storage servers, not social platforms or brands to identify with. Do not use their non-git "features" such as issue trackers, CI and shit. They want you to use them as "facebook for programmers" and become dependent on their exclusive "features", so that's exactly what you want to avoid, just abuse their platform for free file storage.
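
For example (the repository URLs here are just placeholders), the multiple push remotes mentioned above may be set up like this:

git remote add gitlab https://gitlab.com/you/repo.git
git remote add codeberg https://codeberg.org/you/repo.git

git push gitlab master    # push to one hosting...
git push codeberg master  # ...and to another

# alternatively make plain "git push" update everything by adding push URLs
# to origin (note: once a push URL is set, the fetch URL is no longer pushed
# to, so also re-add it if you want to keep pushing there):
git remote set-url --add --push origin https://gitlab.com/you/repo.git
git remote set-url --add --push origin https://codeberg.org/you/repo.git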

Alternatives

Here are some alternatives to git:

How To Use Git For Solo/Small Projects (Cheatsheet)

TODO
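
For now here is at least a minimal sketch of the typical solo workflow (the URL is a placeholder; the default branch may be called master or main depending on configuration):

git init                     # create a new local repo in current directory
git add -A                   # stage all new/changed files
git commit -m "do stuff"     # record a new version of the project
git remote add origin <URL>  # link a hosting server (done just once)
git push origin master       # upload your commits to the server
git pull origin master       # download commits made elsewhere
git log                      # browse the history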


global_discussion

Global Discussion

This is a place for general discussion about anything related to our thing. To comment just edit-add your comment. I suggest we use a tree-like structure as this example shows:

If the tree gets too big we can create a new tree under a new heading.

General Discussion


gnu

GNU

GNU ("GNU is Not Unix", a recursive acronym) is a large project started by Richard Stallman, the inventor of free (as in freedom) software, running since 1983 with the goal of creating a completely free (as in freedom) operating system, along with other free software that computer users might need. The project doesn't tolerate any proprietary software (though it sadly tolerates proprietary data). The project achieved its goal of creating a complete operating system when a kernel named Linux became part of it in the 90s as the last piece of the puzzle -- the system is now known as GNU/Linux. However, the GNU project didn't end and continues to further develop the operating system as well as a myriad of other software projects it hosts. GNU gave rise to the Free Software Foundation and is one of the most important software projects in history of computing.

The mascot of GNU is literally gnu (wildebeest), it is available under a copyleft license. WARNING: ironically GNU is extremely protective of their brand's "intellectual property" and will rape you if you use the name GNU without permission (see the case of GNU boot). It's quite funny and undermines the whole project a bit.

The GNU/Linux operating system has several variants in the form of a few GNU-approved "Linux" distributions such as Guix, Trisquel or Parabola. Most other "Linux" distros don't meet the strict standards of GNU, such as not including any proprietary software. In fact the approved distros can't even use the standard version of Linux because that contains proprietary blobs; a modified variant called Linux-libre has to be used.

GNU greatly prefers GPL licenses, i.e. it strives for copyleft, even though it accepts even projects under permissive licenses. GNU also helps with enforcing these licenses legally and advises developers to transfer their copyright to GNU so that they can "defend" the software for them.

Although GNU is great and has been one of the best things to happen in software ever, it has many flaws. For example their programs are known to be kind of bloated, at least from the strictly suckless perspective. It also doesn't mind proprietary non-functional data (e.g. assets in video games) and its obsession with copyleft also isn't completely aligned with LRS. GNU is also generally NOT supportive of free culture and uses copyright to prohibit modifications of its propaganda texts: the GFDL license they use for texts may contain sections that are prohibited from being modified and so are non-free by definition. This sucks big time and shows some of the fascist corruption within the movement.

History

The project officially started on September 27, 1983 by Richard Stallman's announcement titled Free Unix!. In it he expresses the intent to create a free as in freedom clone of the operating system Unix, and calls for people to join his effort (he also uses the term free software here). Unix was a good, successful de-facto standard operating system, but it was proprietary, owned by AT&T, and as such restricted by licensing terms. GNU was to be a similar system, compatible with the original Unix, but free as in freedom, i.e. freely available and allowing anyone to use it, improve it and share it.

In 1985 Richard Stallman wrote the GNU Manifesto, similar to the original project announcement, which further promoted the project and asked people for help in development. At this point the GNU team already had a lot of software for the new system: a text editor Emacs, a debugger, a number of utility programs and a nearly finished shell and C compiler (gcc).

At this point each program of the project still had its own custom license that legally made the software free as in freedom. The differences in details of these licenses however caused issues such as legal incompatibilities. This was addressed in 1989 by Richard Stallman's creation of a universal free software license: GNU General Public License (GPL) version 1. This license can be used for any free software project and makes these projects legally compatible, while also utilizing so called copyleft: a requirement for derived works to keep the same license, i.e. a legal mechanism for preventing people from making copies of a free project non-free. Since then GPL has become the primary license of the GNU project as well as of other unrelated projects.

GNU Projects

GNU has developed an almost unbelievable amount of software, it has software for all basic and some advanced needs. As of writing this there are 373 software packages in the official GNU repository (at https://directory.fsf.org/wiki/Main_Page). Below are just a few notable projects under the GNU umbrella.

See Also


golang

Go (Programming Language)

Go (also golang) is a compiled programming language advertised as the "modern" C, co-authored by one of C's authors, Ken Thompson. Nevertheless Go is actually shit compared to C. Some reasons for this are:

Anyway, it at least tries to stay somewhat simple in some areas and as such is probably better than other modern languages like Rust. It purposefully omitted features such as generics (which was good, though they sadly got added in version 1.18 anyway) and implicit type conversions.


go

Go

WIP

This article is about the game of go, for programming language see golang.

{ I am still learning the beautiful game of go, please excuse potential unintentional errors here. ~drummyfish }

Go (from Japanese Igo, "surrounding board game", also Baduk or Wei-qi) is possibly the world's oldest two-player board game still played in its original form, coming from Asia, and is one of the most beautiful, elegant, deep and popular games of this type in history, whose cultural significance and popularity can be compared to that of chess, despite it remaining widely popular mostly in Asia (along with other games like shogi, or "Japanese chess"). There, especially in Japan, go is pretty big: it appears a lot in anime, there are TV channels exclusively dedicated to go etc., though in Japan shogi is probably a bit more popular; nevertheless go is likely the most intellectually challenging of all the big board games. Go is a bit difficult to get into (kind of like vim?) -- though the rules can be learned quite quickly, it is hard to make big-picture sense of their implications, and it may take weeks to months before one can even call himself a beginner player. To become a master takes a lifetime.

{ There is a nice non-bloated site hosting everything related to go: Sensei's Library at https://senseis.xmp.net/. ~drummyfish }

Compared to chess (some purists dislike this comparison, see https://senseis.xmp.net/?CompareGoToChess) the rules of go are much simpler -- which is part of the game's beauty (see easy to learn, hard to master) -- though the emergent complexity of those few rules is grandiose; so much so that playing the game well is usually considered more challenging than playing chess well, as there are many more possibilities and mere calculation is not enough to be strong, one needs to develop a strong intuition; this is also the reason why it took computers 20 more years to beat the best humans in go than in chess. Many say that go is yet deeper than chess and that it offers a unique experience that can't be found anywhere else; go is more mathematical, something that just exists naturally as a side effect of logic itself, while chess is a bit of an arbitrary set of more complex rules fine-tuned so that the game plays well. The spirit of go is also more zen-like and peaceful: while chess simulates war (something more aligned with western mentality), go is more about dividing territory; one could even see it not as a battle but rather as a creation of art, of beautiful patterns (something better aligned with eastern mentality).

From LRS point of view go is one of the best games ever, for similar reasons to chess (it's highly free, suckless, cheap, not owned by anyone, fun, mathematically deep, nice for programming while the game itself doesn't even require a computer etc.) plus yet greater simplicity and beauty.

Solving go: similarly to chess the full game of go seems unlikely to be solved -- the 19x19 board makes the game state tree yet larger than that of chess, but the much simpler rules possibly give a bigger hope for mathematical proofs. Smaller boards however have been solved: Erik van der Werf made a program that confirmed a win for black on boards up to (and including) 5x5 (the best first move in all cases being in the middle of the board). Bigger boards are being researched, but a lot of information about them is in undecipherable Japanese/Korean gibberish, so we leave that for the future.

A famous proverb about go goes like this: what is the most perfect game man ever invented? Chess! But what about go? Go existed long before man...

TODO: rating, programming, stats, programs and sites for playing, ...

Rules

The rules of go vary a bit more than those of chess, they are not as unified, but usually the details don't play that big of a role because e.g. different scoring systems still mostly result in the same outcome of games. Here we'll describe possibly the most common rule set.

The game's goal is basically to surround a bigger territory than the enemy player. The formal rules are pretty simple, though their implications are very complex.

Go is played by a black and a white player; black plays first (unlike in chess) and then both players take turns placing stones of their own color on squares -- a square is the INTERSECTION of the lines on the board, NOT the place between them (consider the lines to be carved in stone, the intersection is where the stone stands with stability). The stones are all the same (there are no different types of stones like in chess) and they cannot move; once a stone is placed, it stays on its position until the end of the game, or until it is captured by the enemy player. The board size is 19x19, but for students and quick games 13x13 and 9x9 boards are also used. As black plays first, he has a slight advantage; for this white gets bonus points at the end of the game, so called komi, which is usually set to 6.5 points (the half point eliminates the possibility of a draw). Komi may differ depending on board size or a specific scoring system.

Any player can pass on his move, i.e. making a move isn't mandatory. However you basically always want to make a move, one only passes when he feels there is nothing more to be gained and the game should end. If both players pass consecutively, the game ends.

The game considers 4-neighborhoods, NOT 8-neighborhoods, i.e. squares that don't lie on board edges have 4 neighbors: up, right, bottom and left; diagonal squares are NOT neighbors.

Capturing: a player can capture a group of connected (through 4-neighborhoods) enemy player's stones by completely surrounding them, or more precisely by taking away all so called liberties of that group -- liberty is an empty square that's immediately neighboring with the group (note that liberties may lie even inside the group). If a player places his stone so that it removes the enemy group's last liberty, then the group is removed from the board and all its stones are taken as captured. It is possible to capture stones by a move that would otherwise be forbidden as suicide, if after the removal of the captured group the placed stone gains a liberty.

Suicide is forbidden: it is not allowed to place a stone so that it would immediately result in that stone (or a group it would join) being captured by enemy. I.e. if there is an enemy group with one empty square in the middle of it, you cannot put a stone there as that stone would simply have no liberties and would immediately die. Exception to this is the above mentioned taking of a group, i.e. if a suicidal move results in immediately taking enemy's group, it is allowed.

The ko rule states that one mustn't make a move that returns the board to the immediately previous state; this basically applies just to the situation in which the enemy takes your stone and you would just place it back, retaking his capturing stone. By the ko rule you cannot do this IMMEDIATELY, but you can still do it in any later round. Some rulesets extend this rule to so called superko, which prohibits repetition of ANY previously seen position (this covers some rare cases that can happen).

Territory: at any time any EMPTY square on the board belongs either to white (no black stone can be reached from it by traveling over neighbors), black (no white stone can be reached from it) or none (belongs to neither). Squares that have stone on them aren't normally considered to belong to anyone (though some scoring systems do), i.e. if you surround a territory as white, only the VACANT surrounded squares count as your territory. The size of territory plays a role in final scoring. An alternative to territory is area, which is territory plus the squares occupied by player's stones and which is used under some rulesets.

Prisoners are enemy's stones that are OBVIOUSLY in your territory and so are practically dead. I.e. they are inside what's clearly not their territory and with further play would clearly be captured. Obvious here is a matter of agreement between players -- if players disagree whether some stones are obvious prisoners, they simply keep playing and resolve the situation.

Scoring: scoring assigns points to each player when the game is over, the one with more points wins. There are multiple scoring systems, most common are these two:

Handicaps: TODO.

Implications of rules and basics of strategy/tactics: life and death, eyes, atari, TODO.

Example: the following is an example of the end state of a beginner game on a 9x9 board:

   _________________
9 |. # . . # # # O .|
8 |# . # . # O O . O|
7 |. . . # # O . O .|
6 |# . . # O . O . O|
5 |# . # O O O . . .|
4 |. # # # O O . . .|
3 |. . . # # O O . .|
2 |. . . . # # O # .|
1 |. . . . # O O O .|
  '-----------------'
   A B C D E F G H I

Here black's (#) territory is 23, and black made 9 captures during the game, together giving 32 points. White's (O) territory is 16 and he holds one black prisoner (H2), giving 17 points; furthermore white made 6 captures during the game and gets 5.5 bonus points as komi (a smaller value due to the 9x9 board size), totalling 28.5 points. Therefore black wins.

TODO

Play Tips

TODO

Go And Computers, Programming

See also https://senseis.xmp.net/?ComputerGoProgramming and https://www.chessprogramming.org/Go.

Board representation: a straightforward representation of the go board is as a simple array of squares; each square can be either empty, white or black -- 3 values that can be stored with 2 bits, which allows storing 4 values, leaving one extra value to be used for some other purpose (e.g. marking illegal ko squares, estimated dead stones, marking the last move etc.). 1 byte this way holds 4 squares, so we need only 91 bytes to represent the whole 19x19 board. On computers with enough RAM it may be considered to store each square in a single byte or int, making the board take more space but gaining speed thanks to data alignment (we don't need extra instructions for squeezing bits from/to a single byte). Of course we furthermore have to keep track of extra things such as the numbers of captured stones.
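
The following is a sketch of this packed representation in C:

#include <stdint.h>

#define BOARD_SIZE   19
#define SQUARE_EMPTY 0
#define SQUARE_WHITE 1
#define SQUARE_BLACK 2
#define SQUARE_EXTRA 3 // the one extra value, use it for whatever you like

// 4 squares per byte: (19 * 19 + 3) / 4 = 91 bytes
uint8_t board[(BOARD_SIZE * BOARD_SIZE + 3) / 4];

int boardGetSquare(int x, int y)
{
  int i = y * BOARD_SIZE + x;
  return (board[i / 4] >> (2 * (i % 4))) & 0x03;
}

void boardSetSquare(int x, int y, int value)
{
  int i = y * BOARD_SIZE + x,
      shift = 2 * (i % 4);

  board[i / 4] = (board[i / 4] & ~(0x03 << shift)) | (value << shift);
}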

TODO
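
As another sketch building on the above board representation, the liberties of a group may be counted with a simple flood fill (here recursive for brevity; a real program might prefer an explicit stack):

#include <string.h>

uint8_t visited[BOARD_SIZE * BOARD_SIZE]; // squares already processed

int _libertyFill(int x, int y, int color)
{
  if (x < 0 || y < 0 || x >= BOARD_SIZE || y >= BOARD_SIZE ||
    visited[y * BOARD_SIZE + x])
    return 0;

  visited[y * BOARD_SIZE + x] = 1;

  int square = boardGetSquare(x,y);

  if (square == SQUARE_EMPTY) // an empty neighbor is a liberty
    return 1;

  if (square != color)        // enemy stone, not a liberty
    return 0;

  return // same-colored stone: continue to the 4-neighbors
    _libertyFill(x + 1,y,color) + _libertyFill(x - 1,y,color) +
    _libertyFill(x,y + 1,color) + _libertyFill(x,y - 1,color);
}

int countLiberties(int x, int y) // liberties of the group at [x,y]
{
  memset(visited,0,sizeof(visited));
  return _libertyFill(x,y,boardGetSquare(x,y));
}

A capture check then amounts to asking whether countLiberties of a group returns 0 after a stone is placed.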

Stats

Some interesting stats about go follow.

The longest possible game without passes has 4110473354993164457447863592014545992782310277120 moves. The longest recorded professional game seems to be a mere 411 moves long (Hoshino Toshi vs Yamabe Toshiro, 1950). There are 2.08168199382 * 10^170 legal positions on a 19x19 board, 3.72497923077 * 10^79 for 13x13 and 1.03919148791 * 10^38 for 9x9. The number of possible games is estimated at between 10^10^100 and 10^10^171. An average high-level game lasts about 150 moves. The average branching factor is 250 (compare to 35 in chess).

See Also


goodbye_world

Goodbye World

Goodbye world is a program that is in some sense an opposite of the traditional hello world program. What exactly this means is not strictly given, but some possibilities are:


good_enough

Good Enough

A good enough solution to a problem is a solution that solves the problem satisfyingly (not necessarily precisely or completely) while achieving minimal cost (effort, implementation time etc.). This is in contrast to looking for a better solution at a higher cost, which we might call an overkill. For example a word-for-word translation of a text is a primitive way of translation, but it may be good enough to understand the meaning of the text; in many climates a tent is a good enough accommodation solution while a luxury house is a solution of better quality (more comfortable, safe, ...) for a higher cost.

To give an example from the world of programming, bubble sort is in many cases better than quick sort thanks to its simplicity, even though it's much slower than the more advanced sorts.
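
For illustration, a complete bubble sort in C takes just a few self-explanatory lines:

#include <stdio.h>

void bubbleSort(int *array, int n)
{
  for (int i = 0; i < n - 1; ++i)       // at most n - 1 passes
    for (int j = 0; j < n - 1 - i; ++j) // bubble the largest item rightwards
      if (array[j] > array[j + 1])
      {
        int tmp = array[j];             // swap neighbors in wrong order
        array[j] = array[j + 1];
        array[j + 1] = tmp;
      }
}

int main(void)
{
  int values[] = {31, 5, 19, 2, 8};

  bubbleSort(values,5);

  for (int i = 0; i < 5; ++i)
    printf("%d ",values[i]);            // prints 2 5 8 19 31

  return 0;
}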

In technology we are often looking for a good enough solution to achieve minimalism and save valuable resources (computational resources, programmer time etc.). It rarely makes sense to look for solutions more expensive than they need to be, however in the context of capitalist software we see this happen all the time as part of the killer feature battle and also as driving prices artificially up for economic reasons (e.g. increasing the cost of maintenance of a software eliminates any competition that can't afford such cost). An example of this is the trend in smartphones of having 4 or more physical cameras. This is only natural in capitalism, we see the tendency to waste resources everywhere. This of course needs to be stopped.


google

Google

Google (also Goolag) is one of the very top big tech corporations, as well as one of the worst corporations in history (if not THE worst), comparable only to Micro$oft and Facebook. Google is gigantically evil and largely controls the Internet, pushes mass surveillance, personal data collection and abuse, ads, bloat, fascism and censorship.

Google's motto used to be "Don't be evil", but in 2018 they ditched it lol xD

Google rose to the top thanks to its search engine launched in the 90s. It soon gained a monopoly on Internet search and started pushing ads. Nowadays Google's search engine basically just promotes "content" on Google's own platforms such as YouTube and of course censors sites deemed politically incorrect.

Besides heavily biasing web search results towards Google's own and friendly platforms, Google also heavily censors the search results and won't show links to prohibited sites unless you very specifically show that you want to find a prohibited site you already know of. For example you won't find results leading to Metapedia or Encyclopedia Dramatica unless you literally search for the URL of those sites or long verbatim phrases they contain -- this is a trick played on those who "test" Google, meant to make it look as if Google isn't censored; of course it is censored, because the only people who will ever find the prohibited sites and their content are people who already know about them and are specifically searching for them just to test Google's censorship. { EDIT: tho Google also seems to refuse to give some URLs no matter what, e.g. https://infogalactic.com. Just tested it. ~drummyfish } If you intend to truly search the Internet, don't rely on Google's results but search with multiple engines (that have their own index) such as Mojeek, Yandex, Right Dao, wiby, YaCy, Qwant etc. (and of course search the darknet).

Google has created a malicious capitalist mobile "operating system" called Android, which they based on Linux with which they managed to bypass its copyleft by making Android de-facto dependent on their proprietary Play Store and other programs. I.e. they managed to take a free project and make a de-facto proprietary malware out of it -- a system that typically doesn't allow users to modify its internals and turn off its malicious features. With Android they invaded a huge number of devices from cells phones to TVs and have the ability to spy on the users of these devices.

Google also tries to steal the public domain: they scan and digitize old books whose copyright has expired and put them on the Internet Archive, however to these scans they attach a condition that they not be used for commercial purposes, i.e. they try to keep an exclusive commercial right to public domain works, something they have no right to do at all.


gopher

Gopher

Gopher is a network protocol for publishing, browsing and downloading files and is known as a much simpler alternative to the World Wide Web (i.e. to HTTP and HTML). In fact it competed with the Web in its early days and even though the Web won in the mainstream, gopher still remains used by a small community. Gopher is like the Web but well designed, it is the suckless/KISS way of doing what the Web does, it contains practically no bloat and so we highly advocate its use. Gopher inspired creation of Gemini, a similar but bit more complex and "modern" protocol, and the two together have recently become the main part of so called Smol Internet.

As of 2022 the Veronica search engine reported 343 gopher servers in the world with 5+ million indexed selectors.

Gopher doesn't use any encryption. This is good, encryption is bloat. Gopher also only uses ASCII, i.e. there's no Unicode. That's also good, Unicode is bloat (and mostly serves trannies to insert emojis of pregnant men into readmes, we don't need that). Gopher simple design is intentional, the authors deemed simplicity a good feature. Gopher is so simple that you may very well write your own client and server and comfortably use them (it is also practically possible to browse gopher without a specialized client, just with standard Unix CLI tools).

From the user's perspective the most important distinction from the Web is that gopher is based on menus instead of "webpages"; a menu is simply a column of items of different predefined types, most importantly e.g. a text file (which clients can directly display), directory (link to another menu), text label (just shows some text), binary file etc. A menu can't be formatted or visually changed, there are no colors, images, scripts or hypertext -- a menu is not a presentation tool, it is simply a navigation node towards files users are searching for (but the mentioned ASCII art and label items allow for somewhat mimicking "websites" anyway). Addressing works with URLs just as on the Web, the URLs just differ by the protocol part (gopher:// instead of http://), e.g.: gopher://gopher.floodgap.com:70/1/gstats. What on the Web is called a "website" on gopher we call a gopherhole (i.e. a collection of resources usually under a single domain) and the whole gopher network is called a gopherspace. Blogs are common on gopher and are called phlogs (collectively a phlogosphere). As menus can refer to one another, gopher creates something akin to a global file system, so browsing gopher is like browsing folders and can comfortably be handled with just 4 arrow keys. Note that as menus can link to any other menu freely, the structure of the "file system" is not a tree but rather a general graph. Another difference from the Web is gopher's great emphasis on plaintext and ASCII art, as it cannot embed images and other media in the menus (even though of course the menus can link to them). There is also support for sending text to a server, so it is possible to implement search engines, guest books etc.

Gopher is just an application layer protocol (officially running on port 70 assigned by IANA), i.e. it sits above lower layer protocols like TCP and takes the same role as HTTP on the Web and so only defines how clients and servers talk to each other -- the gopher protocol doesn't say how menus are written or stored on servers. Nevertheless for the creation of menus so called gophermaps have been established, which is a simple format for writing menus and are the gopher equivalent of the Web's HTML files (just much simpler, basically just menu items on separate lines, the exact syntax is ultimately defined by server implementation). A server doesn't have to use gophermaps, it may be e.g. configured to create menus automatically from directories and files stored on the server, however gophermaps allow users to write custom menus manually. Typically in someone's gopherhole you'll be served a welcoming intro menu similar to a personal webpage that's been written as a gophermap, which may then link to directories storing personal files or other hand written menus. Some gopher servers also allow creating dynamic content with scripts called moles.

Gopher software: sadly "modern" browsers are so modern they have millions of lines of code but can't be bothered to support such a trivial protocol as gopher, however there are Web proxies you can use to explore gopherspace. Better browsers such as lynx (terminal) or forg (GUI) can be used for browsing gopherspace natively. As a server you may use e.g. Gophernicus (used by SDF) or search for another one, there are dozens. For the creation of gophermaps you simply use a plaintext editor. Where to host gopher? Pubnixes such as SDF, tilde.town and the Circumlunar community offer gopher hosting, but many people simply self-host servers e.g. on Raspberry Pis, it's pretty simple.

Example

TODO


graphics

Computer Graphics

Computer graphics (CG or just graphics) is a field of computer science that focuses on visual information. The field doesn't have strict boundaries and can blend and overlap with other possibly separate topics such as physics simulations, multimedia and machine learning. It usually deals with creating or analyzing 2D and 3D images and as such CG is used in data visualization, game development, virtual reality, optical character recognition and even astrophysics or medicine.

We can divide computer graphics in different ways, traditionally e.g.:

Since the 90s computers have started using dedicated hardware to accelerate graphics: so called graphics processing units (GPUs). These have allowed rendering of high quality images at high FPS, and due to the entertainment and media industry (especially gaming), GPUs have been pushed towards greater performance each year. Nowadays they are among the most consumerist pieces of hardware, also due to general purpose computations being moved to GPUs (GPGPU), lately especially mining of cryptocurrencies and training of AI. Most lazy programs dealing with graphics nowadays simply expect and require a GPU, which creates a bad dependency and bloat. At LRS we prefer the suckless way of software rendering, i.e. rendering on the CPU without a GPU, or at least offering this as an option in case a GPU isn't available. This many times leads us towards the adventure of using old and forgotten algorithms from the times before GPUs.

3D Graphics

This is a general overview of 3D graphics, for more technical overview of 3D rendering see its own article.

3D graphics is a big part of CG but is a lot more complicated than 2D. It tries to achieve realism through the use of perspective, i.e. looking at least a bit like what we see in the real world. 3D graphics can very often be seen as simulating the behavior of light; there exists a so called rendering equation that describes how light ideally behaves, and 3D computer graphics tries to approximate the solutions of this equation, i.e. the idea is to use math and physics to describe the real-life behavior of light and then simulate this model to literally create "virtual photos". The theory of realistic rendering is centered around the rendering equation and achieving global illumination (accurately computing the interaction of light not just in small parts of space but in the scene as a whole) -- studying this requires basic knowledge of radiometry and photometry (fields that define various measures and units related to light such as radiance, radiant intensity etc.).

In the 2010s mainstream 3D graphics started to employ so called physically based rendering (PBR) that tries to use yet more physically correct models of materials (e.g. physically measured BRDFs of various materials) to achieve higher photorealism. This is in contrast to the simpler (both mathematically and computationally), more empirical models (such as a single texture plus Phong lighting) used in earlier 3D graphics.

Because 3D is not very easy (for example rotations are pretty complicated), there exist many 3D engines and libraries that you'll probably want to use. These engines/libraries work on different levels of abstraction: the lowest ones, such as OpenGL and Vulkan, offer a portable API for communicating with the GPU that lets you quickly draw triangles and write small programs that run in parallel on the GPU -- so called shaders. The higher level ones, such as OpenSceneGraph, work with abstractions such as a virtual camera and a virtual scene into which we place specific 3D objects such as models and lights (the scene is many times represented as a hierarchical graph of objects that can be "attached" to other objects, so called scene graph).

There is a tiny suckless/LRS library for real-time 3D: small3dlib. It uses software rendering (no GPU) and can be used for simple 3D programs that can run even on low-spec embedded devices. TinyGL is a similar software-rendering library that implements a subset of OpenGL.

Real-time 3D typically uses object-order rendering, i.e. iterating over objects in the scene and drawing them onto the screen (i.e. we draw object by object). This is a fast approach but has disadvantages such as (usually) needing a memory inefficient z-buffer so that closer objects don't get overwritten by more distant ones. It is also pretty difficult to implement effects such as shadows or reflections in object-order rendering. The 3D models used in real-time 3D are practically always made of triangles (or other polygons) because the established GPU pipelines work on the principle of drawing polygons.
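
A minimal sketch of the z-buffer principle in C (a made-up fragment, not code of any specific library): each screen pixel remembers the depth of the closest thing drawn to it so far, and drawing a pixel only succeeds if it's closer than that.

#include <stdint.h>

#define SCREEN_W 320
#define SCREEN_H 240

uint32_t screen[SCREEN_W * SCREEN_H]; // pixel colors
float zBuffer[SCREEN_W * SCREEN_H];   // depth of closest pixel drawn so far,
                                      // initialized to "infinity" each frame

// called for every pixel of every rendered object, in any order
void drawPixel(int x, int y, float depth, uint32_t color)
{
  int i = y * SCREEN_W + x;

  if (depth < zBuffer[i]) // only draw if closer than what's already there
  {
    zBuffer[i] = depth;
    screen[i] = color;
  }
}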

Offline rendering (non-real-time, e.g. 3D movies) on the other hand mostly uses image-order algorithms which go pixel by pixel and for each one determine what color the pixel should have. This is basically done by casting a ray from the camera's position through the "pixel" position and calculating which objects in the scene get hit by the ray; this then determines the color of the pixel. This more accurately models how rays of light behave in real life (even though in real life the rays go the opposite way: from lights to the camera, but this is extremely inefficient to simulate). The advantage of this process is a much higher realism and the implementation simplicity of many effects like shadows, reflections and refractions, and also the possibility of having other than polygonal 3D models (in fact smooth, mathematically described shapes are normally much easier to check ray intersections with). Algorithms in this category include ray tracing or path tracing. In recent years we've seen these methods brought, in a limited way, to real-time graphics on the high end GPUs.
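
The principle can be shown with a tiny C program (a made-up toy example, not a serious renderer): for every pixel we construct a ray from the camera and analytically check whether it hits a sphere, printing the result as ASCII.

#include <math.h>
#include <stdio.h>

#define RES 32 // image resolution (RES x RES)

int main(void)
{
  // the scene: a unit-radius sphere 4 units in front of the camera
  float cx = 0, cy = 0, cz = 4, r = 1;

  for (int y = 0; y < RES; ++y)
  {
    for (int x = 0; x < RES; ++x)
    {
      // ray from the camera (at origin) through the "pixel" on a virtual plane
      float dx = (x - RES / 2) / (float) RES,
            dy = (y - RES / 2) / (float) RES,
            dz = 1;

      float len = sqrtf(dx * dx + dy * dy + dz * dz);
      dx /= len; dy /= len; dz /= len; // normalize the direction

      // analytic ray-sphere intersection: does the quadratic have a solution?
      float dc = dx * cx + dy * cy + dz * cz; // dot(direction,sphere center)
      float disc = dc * dc - (cx * cx + cy * cy + cz * cz - r * r);

      putchar(disc >= 0 ? '#' : '.'); // hit or miss
    }

    putchar('\n');
  }

  return 0;
}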


greenwashing

Greenwashing

"For every car you consume we plant a tree." --corporations

TODO


gui

Graphical User Interface

Graphical user interface (GUI) is a visual user interface that uses graphics such as images and geometrical shapes. This stands in contrast with text user interface (TUI) which is also visual but only uses text for communication.

Expert computer users normally frown upon GUIs because they are the "noobish", inefficient, limiting, cumbersome, hard to automate way of interacting with a computer. GUIs bring complexity and bloat, they are slow, inefficient and distracting. We try not to use them and prefer the command line.

"Modern" GUIs mostly use callback-based programming, which again is more complicated than standard polling non-interactive I/O. If you need to do GUI, just use a normal infinite loop FFS.

When And How To Do GUI

GUI is not forbidden, it has its place, but today it's way too overused -- it should be used only if completely necessary (e.g. in a painting program) or as a completely optional thing built upon a more suckless text interface or API. So remember: first create a program and/or a library working without GUI and only then consider creating an optional GUI frontend. GUI must never be tied to whatever functionality can be implemented without it.

Still, when making a GUI, you can make it suckless and lightweight. Do your buttons need to have reflections, soft shadows and rounded anti-aliased borders? No. Do your windows need to be transparent with light-refraction simulation? No. Do you need to introduce many MB of dependencies and pain such as Qt? No.

The ergonomics and aesthetic design of GUIs is a field of its own and can't be covered here, but just keep in mind some basic things:

The million dollar question is: which GUI framework to use? Ideally none. GUI is just pixels, buttons are just rectangles; make your GUI simple enough so that you don't need any shitty abstraction such as widget hierarchies etc. If you absolutely need some framework, look for a suckless one; e.g. nuklear is worth checking out. The suckless community sometimes uses pure X11, however that's not ideal, X11 itself is kind of bloated and it's also getting obsoleted by Wayland. The ideal solution is to make your GUI backend agnostic, i.e. create your own very thin abstraction layer above the backend (e.g. X11) so that any other backend can be plugged in if needed just by rewriting a few simple functions of your abstraction layer (see how e.g. Anarch does rendering).
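
Such an abstraction layer can be tiny, for example something like the following sketch (a hypothetical header, similar in spirit to what Anarch does, all names made up):

/* gui_backend.h: thin abstraction layer, the only place that knows about
   X11/Wayland/SDL/whatever; porting the program to another backend means
   reimplementing just these few functions */

#include <stdint.h>

void backendInit(int width, int height, const char *title);
void backendDrawPixel(int x, int y, uint32_t color);
int  backendKeyPressed(int key); /* is given key currently pressed? */
void backendPresentFrame(void);  /* show the frame, handle window events */
void backendQuit(void);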

State Of Mainstream GUI

Nowadays there are a great many GUI libraries, frameworks, standards and paradigms, and it may be a bit hard to digest them at once.

TODO: some general shit bout graphical windows vs the "single window" mobile and web UI, analysis of the "GUI stack" (Linux framebuffer, X window, widget toolkits etc.), basic widgets etc.


hacker_culture

Hacker Culture

See hacking.


hacking

Hacking

Not to be confused with cracking.

Hacking (also hackerdom) in the widest sense means exploiting usually (but not necessarily) a computer system in a clever, "thinking outside the box" way. In the context of computers the word hacker was originally -- that is in the 1960s -- used for very good programmers and people who were simply good with computers, and the word hacking had a completely positive meaning; hacker could almost be synonymous with computer genius (at the time people handling computers were usually physicists, engineers or mathematicians), someone who enjoyed handling and programming computers and could playfully look for very clever ways of making them do what he wanted. Over time hackers evolved a whole hacker culture with its own slang, set of values, behavioral and ethical norms, in-jokes and rich lore. As time marched on, computer security started to become an important topic and some media started to use the word hacker for someone breaking into a computer system, and so the word gained a negative connotation in the mainstream -- though many refused to accept this new meaning and rather used the word cracker for a "malicious hacker", there appeared new variants such as white hat and black hat hacker, referring to ethical and malicious hackers. With the onset of online games the word hacking even became a synonym for cheating. The original positive meaning has recently seen some comeback with the popularity of sites such as hacker news or hackaday, and the word life hack has even found its way into the non-computer mainstream dictionary, however a "modern hacker" is a bit different from the oldschool hacker, usually for the worse (for example a modern self proclaimed "hacker" has no issue with wearing a suit, something that would be despised by an oldschool hacker). We, LRS, advocate for using the original, oldschool meaning of the word hacker.

Original Hacker Culture

The original hacker culture is a culture of the earliest computer programmers, usually smart but socially rather isolated nerds -- at the time mostly physicists, mathematicians and engineers -- who shared a deep love for programming and the pure joy of coming up with clever computer tricks, exploring computers and freely sharing their knowledge and computer programs with each other. The culture started to develop rapidly at MIT in about the second half of the 1960s, though other hacker communities existed earlier and in other places as well (still mostly at universities).

The word hack itself seems to have come from a model train club at MIT in whose slang the word referred to something like a project of passion without a specific goal; before this the word was used around MIT for a specific kind of clever but harmless prank. Members of the model train club came into contact with early computers at MIT and brought their slang along. These early punch-card computers were expensive and sacred, hackers treated them as almost supernatural entities; in the book Hackers it is mentioned that those who were allowed to operate the machines were called Priests -- Priests would often carry out a little prayer to please the machine so that it would bless them with computation. During the 60s and 70s so called phreaking -- hacking the phone network -- was popular among hackers.

Many ideas -- such as the beauty of minimalism -- that became part of hacker culture later came from the development of Unix and establishment of its programming philosophy. Many hackers came from the communities revolving around PDP 10 and ARPANET, and later around networks such as Usenet. At the time when computers started to be abused by corporations, Richard Stallman's definition of free software and his GNU project embodied the strong hacker belief in information freedom and their opposition of intellectual property.

The culture has a deep lore and its own literature consisting of books that hackers usually like (e.g. The Hitchhiker's Guide to the Galaxy) and books by hackers themselves. Bits of the lore exist in the form of short stories circulated as folklore, a very popular form being so called Koans. Perhaps the most iconic hacker story is the Story of Mel which tells a true story of a master hacker keeping to his personal ethical beliefs under the pressure of his corporate employers -- a conflict between manager employers ("suits") and hacker employees is a common theme in the stories. Other famous stories include the TV typewriter and the Magic Switch. One of the most famous hacker books is the Jargon File, a collectively written dictionary documenting hacker culture in detail. A 1987 book, The Tao of Programming, captures the hacker wisdom with Taoist-like texts that show how spiritual hacking can get -- this reflects the above mentioned sacred nature of the early computers. The textfiles website features many text files on hacking at https://textfiles.vistech.net/hacking/. See also Ten Commandments for C Programmers etc. A lot about hackers can be learned from books about them, e.g. the free book Free as in Freedom about Richard Stallman (available e.g. here). A prominent hacker writer is Eric S. Raymond who produced a very famous essay The Cathedral and the Bazaar, edited the Jargon File and has written a large guide called How To Become A Hacker -- these are all good resources on hackerdom, even though Raymond himself is kind of shitty, he for example prefers the "open source" movement to free software.

As a symbol of hackerdom the glider symbol from game of life is sometimes used, it looks like this:

 _____
|_|0|_|
|_|_|0|
|0|0|0|

Let us now attempt to briefly summarize what it means to be a hacker:

Let's mention a few people who were at their time regarded by at least some as true hackers, however note that many of them betrayed some of the hacker ways either later in life or even in their young years -- people aren't perfect and no single individual is a perfect example of a whole culture. With that said, those regarded as hackers included Melvin Kaye aka Mel, Richard Stallman, Linus Torvalds, Eric S. Raymond, Ken Thompson, Dennis Ritchie, Richard Greenblatt, Bill Gosper, Steve Wozniak or Larry Wall.

"Modern" "Hackers"

Many modern zoomer soydevs call themselves "hackers" but there are basically none that would stay true to the original ethics and culture and be worthy of being called a true hacker, they just abuse the word as a cool term or a brand (see e.g. "hacker" news). It's pretty sad the word has become a laughable parody of its original meaning by being associated with groups such as Anonymous who are just a bunch of 14 year old children trying to look like "movie hackers". The hacker culture has been spoiled basically in the same way as the rest of society, and the difference between the classic hacker culture and the "modern" one is similar to the difference between free software and open source, though perhaps more amplified -- the original culture of strong ethics has become twisted by capitalist trends such as self-interest, commercialization, fashion and mainstreamization, even shitty movie adaptations etc. The modern "hackers" are idiots who have never seen assembly, can't do math, they're turds in suits who make startups and work as influencers, they are tech consumers who use and even create bloat, and possibly even proprietary software. For the love of god, do NOT mimic such caricatures or give them attention -- they are not real hackers, they are simply attention whores.

Security "Hackers"

Hacker nowadays very often refers to someone involved in computer security, either as those who "protect" (mostly by looking for vulnerabilities and reporting them), so called white hats, or those who attack, so called black hats. Those are not hackers in the original sense, they are hackers in the mainstream adopted meaning of someone breaking into a system. This kind of "hacker" betrays the original culture by supporting secrecy and censorship, i.e. "protection" of "sensitive information" mostly justified by so called "privacy" -- this violates the original hacker's pursuit of absolute information freedom (note that e.g. Richard Stallman boycotted even the use of passwords at MIT, and Raymond discourages using anonymous handles, rather recommending going by one's real name). These people are obsessed with anonymity, encryption, cryptocurrencies and cryptofascism and are also more often than not egoists with shitty personalities. In addition they don't generally adhere to the original hacker culture in any way either, they are simply people breaking into systems for some kind of self benefit (yes, even the white hats), nothing more than that. Again, do NOT try to mimic these abominations.

Examples Of Hacks

{ As a redditfag I used to follow the r/devtricks subreddit, it contained some nice examples of hacks. ~drummyfish }

A great many commonly used tricks in programming could be regarded as hacks even though many are not called so because they are already well known and no longer innovative, a true hack is something new that impresses fellow hackers. And of course hacks may appear outside the area of technology as well. The following is a list of things that were once considered new hacks or that are good examples demonstrating the concept:

See Also


hack

Hack

See hacking.


hard_to_learn_easy_to_master

Hard To Learn, Easy To Master

"Hard to learn, easy to master" is the opposite of "easy to learn, hard to master".

Example: drinking coffee while flying a plane.


hardware

Hardware

The article is here!


harry_potter

Harry Potter

Harry Potter is a franchise and universe by an English female writer, J. K. Rowling, about wizards and magic { like ACTUAL wizards and magic. ~drummyfish } that started in 1997 as an immensely successful series of seven children's and young adult books, was followed by movies and later on by many other spinoff media such as video games. It made J. K. Rowling a billionaire and has become the most famous and successful book series of the modern age. At first the books sparked controversies and opposition in religious communities for "promoting witchcraft"; in recent years the universe and stories have become a subject of wider political analysis and fights, as with most other things.

{ The books are actually good -- not the best in the world, I've read many better ones that would better deserve this kind of attention, but still the work is admirable. There is of course tons of money in the franchise so it's getting raped and milked like any other IP capital -- this is of course spoiling and killing the work, so be careful. ~drummyfish }

Plot summary: sorry, we're not writing a plot summary here, thank copyright laws -- yes, fair use allows us to do it but it would make us non free :) Let's just say the story revolves around a boy named Harry Potter who goes to a wizard school with two friends and they're together saving the world from Lord Voldemort, the wizard equivalent of Hitler. Overall the books start on a very light note and get progressively darker and more adult, turning into a story about "World War II but with wizards'n'magic". It's pretty readable, with great, unique atmosphere, pleasant coziness and elements of many literary genres, there's nice humor and good ideas. Also the lore is very deep.


hash

Hash

Hash is a number that's computed from some data in a chaotic way and which is used for many different purposes, e.g. for quick comparisons (instead of comparing big data structures we just compare their hashes) or mapping data structures to table indices.

Hash is computed by a hash function, a function that takes some data and turns it into a number (the hash) that's in terms of bit width much smaller than the data itself, has a fixed size (number of bits), and which has additional properties such as being completely different for very similar (but slightly different) input data. Thanks to these properties hashes have a very wide use in computer science -- they are often used to quickly compare whether two pieces of bigger data, such as documents, are the same, they are used in indexing structures such as hash tables which allow for quick search of data, and they find great use in cryptocurrencies and security, e.g. for digital signatures or storing passwords (for security reasons databases of users store just the hashes of their passwords, never the passwords themselves). Hashing is extremely important and as a programmer you won't be able to avoid encountering hashes somewhere in the wild.

{ Talking about wilderness, hyenas have their specific smells that are determined by bacteria in them and are unique to each individual depending on the exact mix of the bacteria. They use these smells to quickly identify each other. The smell is kind of like the animal's hash. But of course the analogy isn't perfect, for example similar mixes of bacteria may produce similar smells, which is not how hashes should behave. ~drummyfish }

It is good to know that we distinguish between "normal" hashes used for things such as indexing data and cryptographic hashes that are used in computer security and have to satisfy some stricter mathematical criteria. For the sake of simplicity we will sometimes ignore this distinction here. Just know it exists.

It is generally given that a hash (or hash function) should satisfy the following criteria:

Hashes are similar to checksums but are different: checksums are simpler because their only purpose is for checking data integrity, they don't have to have a chaotic behavior, uniform mapping and they are often easy to reverse. Hashes are also different from database IDs: IDs are just sequentially assigned numbers that aren't derived from the data itself, they don't satisfy the hash properties and they have to be absolutely unique. The term pseudohash may also be encountered, it seems to be used for values similar to true hashes which however don't quite satisfy the definition.

{ I wasn't able to find an exact definition of pseudohash, but I've used the term myself e.g. when I needed a function to make a string into a corresponding fixed length string ID: I took the first N characters of the string and appended M characters representing some characteristic of the original string such as its length or checksum -- this is what I called the string's pseudohash. ~drummyfish }

Some common uses of hashes are:

Example

Let's say we want a hash function for strings, one which for any ASCII string will output a 32 bit hash. How to do this? We need to make sure that every character of the string will affect the resulting hash.

The first thought that may come to mind could be for example to multiply the ASCII values of all the characters in the string. However there are at least two mistakes in this: firstly short strings will result in small values as we'll get a product of fewer numbers (so similar strings such as "A" and "B" will give similar hashes, which we don't want). Secondly reordering the characters in a string (i.e. its permutations) will not change the hash at all (as order is insignificant in multiplication)! These violate the properties we want in a hash function. If we used this function to implement a hash table and then tried to store strings such as "abc", "bca" and "cab", all would map to the same hash and cause collisions that would negate the benefits of a hash table.
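
For illustration, such a naive (bad) hash might look like this:

#include <stdint.h>

uint32_t badStrHash(const char *s) // DON'T do this
{
  uint32_t r = 1;

  while (*s)
  {
    r *= *s; // just multiply the ASCII values of all characters
    s++;
  }

  return r; // "abc", "bca" and "cab" all return 97 * 98 * 99 = 941094
}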

A better hash function for strings is shown in the section below.

Nice Hashes

{ Reminder: I make sure everything on this Wiki is pretty copy-paste safe, from the code I find on the Internet I only copy extremely short (probably uncopyrightable) snippets of public domain (or at least free) code and additionally also reformat and change them a bit, so don't be afraid of the snippets. ~drummyfish }

Here is a simple and pretty nice 8bit hash, it outputs all possible values and all its bits look quite random: { Made by me. ~drummyfish }

uint8_t hash(uint8_t n)
{
  n *= 23;                        // multiply by a prime to scramble the bits
  n = ((n >> 4) | (n << 4)) * 11; // swap the nibbles, multiply again
  n = ((n >> 1) | (n << 7)) * 9;  // rotate right by 1 bit, multiply again

  return n;
}

The hash prospector project (unlicense) created a way for automatic generation of integer hash functions with nice statistical properties which work by XORing the input value with a bit-shift of itself, then multiplying it by a constant and repeating this a few times. The functions are of the format:

uint32_t hash(uint32_t n)
{
  n = A * (n ^ (n >> S1));
  n = B * (n ^ (n >> S2));
  return n ^ (n >> S3);
}

Where A, B, S1, S2 and S3 are constants specific to each function. Some nice constants found by the project are:

A          B          S1 S2 S3
303484085  985455785  15 15 15
88290731   342730379  16 15 16
2626628917 1561544373 16 15 17
3699747495 1717085643 16 15 15
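
For example plugging the constants from the first row of the table into the format above gives this concrete function:

uint32_t hash32(uint32_t n)
{
  n = 303484085 * (n ^ (n >> 15));
  n = 985455785 * (n ^ (n >> 15));
  return n ^ (n >> 15);
}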

The project also explores 16 bit hashes, here is a nice hash that doesn't even use multiplication!

uint16_t hash(uint16_t n)
{
  n = n + (n << 7); // adding a shift of self: effectively n * 129, no multiply
  n = n ^ (n >> 8); // XORing with own shift mixes high bits into low ones
  n = n + (n << 3);
  n = n ^ (n >> 2);
  n = n + (n << 4);
  return n ^ (n >> 8);
}

Here is a nice string hash, works even for short strings, all bits look pretty random: { Made by me. ~drummyfish }

uint32_t strHash(const char *s)
{
  uint32_t r = 21;

  while (*s)
  {
    r = (r * 31) + *s; // classic multiply (by a prime) and add a character
    s++;
  }

  r = r * 4451;
  r = ((r << 19) | (r >> 13)) * 5059; // extra mixing, helps short strings

  return r;
}

TODO: more


hero_culture

Hero Culture

Hero culture is a harmful culture of creating and worshiping heroes and "leaders" (and other kinds of celebrities) which leads e.g. to the creation of cults of personality, strengthening of fight culture and establishment of a hierarchical, anti-anarchist society of "winners" and "losers". The concept of a hero arose in the context of wars and other, many times violent, conflicts; a hero is different from a mere authority or a well known individual in some area, it is someone who creates fear of disagreement and whose image is distorted into a much more positive, sometimes godlike state, by which he distorts truth and is given a certain power over others. Therefore we highly warn about falling into the trap of hero culture, though this is very difficult in the current highly hierarchical society. To us the word hero has a pejorative meaning. Our advice is always this:

Do NOT create heroes. Follow ideas, not people. And similarly: hate ideas, not people.

Smart people know this and those being named heroes themselves many times protest it, e.g. Marie Curie has famously stated: "be less curious about people and more curious about ideas." Anarchists purposefully don't name theories after their inventors but rather by their principles, knowing the danger of hero culture leading to social hierarchy and also that people are imperfect -- people are like packages, a mixture of both good and bad that's inadvertently inseparable, they carry distorting associations, they make mistakes and their images are twisted by history and politics -- even the character of Jesus, a "theoretically perfect human", has been many times twisted in ways that are hard to believe. Worshiping an individual always comes with the tendency to embrace and support everything he does, all his opinions and actions, including the extremely bad ones. Abusive regimes are the ones who use heroes and their names for propaganda -- Stalinism, Leninism, corporations such as Ford named after their founder etc. Heroes become brands whose stamp of approval is used to push bad ideas... especially popular are heroes who are already dead and can't protest their image being abused -- see for example how Einstein's image has been raped by capitalists for their own propaganda, e.g. by Apple's marketing, while in fact Einstein was a pacifist socialist highly critical of capitalism. This is not to say an idea's name cannot be abused, the word communism has for example become something akin to a swear word after being abused by regimes that had little to do with real communism. Nevertheless it is still much better to focus on ideas, as ideas always carry their own principle embedded within them, visible to anyone willing to look, and can be separated from other ideas very easily. Focusing on ideas allows us to discuss them critically, it allows us to reject a bad concept without "attacking" the human who came up with it.

Mainstream US mentality of strong hero culture is now infecting the whole world and reaches unbelievably stupid levels, which is further not helped by shit like the stupid superhero movies. Besides calling murderers (soldiers) heroes, it is now for example standard to call handicapped people heroes, literally only because they are handicapped and it makes them feel better, even if they do nothing special and even if they actually live more comfortable lives than poor healthy peasants who have to live miserably and slave at work every day without getting anyone's attention. Or -- and this is yet another level of stupidity -- anyone who just happens to not behave like a dick in case of some emergency is guaranteed to be called a hero; for example someone who by chance walks by a pool in which a baby is drowning and saves the baby from dying will with 100% probability be called a hero in the media. But WHY the fuck would that be? Is the guy a hero because he didn't just sit down and watch the baby drown? It is absolutely normal behavior to save a drowning baby if one sees it, especially when there is very little risk to one's own life in doing so (such as just jumping into the pool); calling someone a hero for doing so is like calling a gun owner a hero for not going to the streets to randomly shoot at people. So in this fucked up society the title of hero is basically won like a lottery -- you just have to be lucky enough to be present at some emergency and then just do the normal thing.


hero

Hero

HEROES ARE HARMFUL. See hero culture.


hexadecimal

Hexadecimal

TODO

Some hexadecimal values that are also English words at the same time and which you may include in your programs for fun include: ace, add, babe, bad, be, bee, beef, cab, cafe, dad, dead, deaf, decade, facade, face, fee, feed. You may also utilize digits here (recall the famous number 80085 that looks like BOOBS); 0 = O, 1 = I, 2 = Z, 5 = S, 6 = G, 8 = B (already available though).
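
For example in C such constants may look like this (the values and names are arbitrary, purely for fun):

#define MAGIC       0xDEADBEEF // "dead beef"
#define COFFEE      0xC0FFEE   // "coffee", using 0 in place of O
#define NICE_FACADE 0xFACADE   // "facade"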


history

History

{ Though history is usually written by the winners, this one was written by a loser :) Keep in mind there may appear errors, you can send me an email if you find some. ~drummyfish }

This is a brief summary of history of technology and computers.

The earliest known appearance of technology related to humans is the use of stone tools by hominids in Africa some two and a half million years ago. Learning to start and control fire was one of the most important advances of the earliest humans; this probably happened hundreds of thousands to millions of years ago, even before modern humans. Around 8000 BC the Agricultural Revolution happened: this was a disaster -- as humans domesticated animals and plants, they had to abandon the comfortable life of hunters and gatherers and started to suffer greatly from the extremely hard work on their fields (this can be seen e.g. from their bones). This led to the establishment of the first cities. Primitive writing can be traced to China around 7000 BC. The wheel was another extremely useful technology humans invented; it is not known exactly when or where it appeared, but it might have been some time after 5000 BC -- in Ancient Egypt the Great Pyramid was built around 2570 BC still without the knowledge of the wheel. Around 4000 BC history starts with the first written records. Humans learned to smelt and use metals approximately 3300 BC (Bronze Age) and 1200 BC (Iron Age). The abacus, one of the simplest devices aiding with computation, was invented roughly around 2500 BC. However people had used primitive computation helping tools, such as bone ribs, probably almost from the time they started trading. Babylonians around 2000 BC were already able to solve some forms of quadratic equations.

After 600 BC the Ancient Greek philosophy starts to develop which would lead to strengthening of rational, scientific thinking and advancement of logic and mathematics. Around 300 BC Euclid wrote his famous Elements, a mathematical work that proves theorems from basic axioms. Around 400 BC the camera obscura was already described in a written text in China, where gears also seem to have been invented soon after. Ancient Greeks could communicate over great distances using Phryctoria, chains of fire towers placed on mountains that forwarded messages to one another using light. In 234 BC Archimedes described the famous Archimedes screw and created an algorithm for computing the number pi. In the 2nd century BC the Antikythera mechanism, the first known analog computer, was made to predict movement of heavenly bodies. Romans are known to have been great builders, they built many roads and such structures as the Pantheon (126 AD) and aqueducts with the use of their own type of concrete and advanced understanding of physics.

Around 50 AD Heron of Alexandria, an Egyptian mathematician, created a number of highly sophisticated inventions such as a vending machine that accepted coins and gave out holy water, and a cart that could be "programmed" with strings to drive on its own.

In the 3rd century the Chinese mathematician Liu Hui described operations with negative numbers, even though negative numbers had already appeared before. In the 600s AD the Indian astronomer Brahmagupta first used the number zero in a systematic way, even though hints of the number zero without a deeper understanding of it had appeared much earlier. In the 9th century the Mayan empire was collapsing, though it would somewhat recover and reshape.

Around the year of our Lord 1450 a major technological leap known as the Printing Revolution occurred. Johannes Gutenberg, a German goldsmith, perfected the process of producing books in large quantities with the movable type press. This made books cheap to publish and buy and contributed to the fast spread of information and better education. Around this time the Great Wall of China was being built.

The year 1492 marks the discovery of America by Christopher Columbus who sailed over the Atlantic Ocean, though he probably wasn't the first in history to do so, and he himself never realized before his death that he had sailed to a new continent.

During the 1700s a major shift in civilization occurred, called the Industrial Revolution -- this was another disaster that would lead to the transformation of common people into factory slaves and the loss of their self sufficiency. The revolution spanned roughly from 1750 to 1850. It was a process of rapid change in the whole society due to new technological inventions that also led to big changes in how people lived their everyday lives. It started in Great Britain but quickly spread over the whole world. One of the main changes was the transition from manual manufacturing to factory manufacturing using machines and sources of energy such as coal. The steam engine played a key role. Work became a form of a highly organized slavery system, society became industrialized. This revolution became highly criticized as it unfortunately opened the door for capitalism and made people dependent on the system as everyone had to become a specialized cog in the society machine; at this time people started to measure time in minutes and lead very planned lives with less joy. But there was no way back.

In 1712 Thomas Newcomen invented the first widely used steam engine, used mostly for pumping water, even though steam powered machines had already been invented long before. The engine was significantly improved by James Watt in 1776. Around 1770 Nicolas-Joseph Cugnot created the first somewhat working steam-powered car. In 1784 William Murdoch built a small prototype of a steam locomotive which would be perfected over the following decades, leading to a transportation revolution; people would be able to travel far away for work, the world would become smaller, which would be the start of globalization. The railway system would make common people measure time with minute precision.

In 1792 Claude Chappe invented the optical telegraph, also called the semaphore. The system consisted of towers spaced up to 32 km apart which forwarded textual messages by arranging big arms on top of the towers to signal specific letters. With this, messages between Paris and Strasbourg, i.e. almost 500 km, could be transferred in under half an hour. The system was reserved for the government, however in 1834 it was hacked by two bankers who bribed the tower operators to transmit information about the stock market along with the main message (by setting specific positions of arms that otherwise didn't carry any meaning), so that they could get an advantage on the market.

By 1800 Alessandro Volta had invented the electric battery. In 1821 Michael Faraday invented the electromotor and in 1827 André-Marie Ampère published further work shedding light on electromagnetism. After this the electric telegraph would be worked on and improved by several people and eventually made to work in practice. Georg Ohm and especially James Maxwell would subsequently push the knowledge of electricity even further.

In 1822 Charles Babbage, a great English mathematician, completed the first version of a manually powered digital mechanical computer called the Difference Engine, built to tabulate polynomials (by the method of finite differences) for creating mathematical tables used e.g. in navigation. It was met with success and further development was funded by the government, however difficulties of the construction led to the whole project never being finished. In 1837 Babbage designed a new machine, the Analytical Engine, this time a Turing complete general purpose computer, i.e. allowing programming with branches and loops, a true marvel of technology. It also ended up not being built completely, but it showed a lot about what computers would be, e.g. it had an assembly-like programming language, memory etc. For this computer Ada Lovelace would famously write the Bernoulli number algorithm.

In 1826 or 1827 the French inventor Nicéphore Niépce captured the first photograph that survived until today -- a view from his estate named Le Gras. An exposure of about 8 hours was used (some say it may have taken several days). He used a camera obscura and an asphalt plate that hardened where light was shining on it. Earlier cases of photography may have existed, maybe as early as 1717, but they were only short lived.

Sound recording with the phonautograph was invented in 1857 in Paris, however it could not be played back at the time -- the first recording of human voice made with this technology can nowadays be reconstructed and played back. It wouldn't be until 1878 that people could both record and play back sounds, with Edison's phonograph. A year later, in 1879, Edison also patented the light bulb, even though he didn't invent it -- there were at least 20 people who had created a light bulb before him.

Around 1888 the so called war of the currents was taking place; it was a heated battle between companies and inventors over whether alternating or direct current would become the standard for distribution of electric energy. The main actors were Thomas Edison, a famous inventor and huge capitalist dick rooting for DC, and George Westinghouse, the promoter of AC. Edison and his friends used false claims and even the killing of animals to show that AC was wrong and dangerous, however AC was objectively better, e.g. in its efficiency thanks to using high voltage, and so it ended up winning the war. AC was also supported by the famous genius inventor Nikola Tesla who during these times contributed hugely to electric engineering; he e.g. invented an AC motor and the Tesla coil and created a system for wireless transmission of electric power.

Also in 1888 probably the oldest video that survives until today was recorded by Louis Le Prince in Northern England, with a single lens camera. It is a nearly 2 second silent black and white shot of people walking in a garden.

1895 can roughly be seen as the year of invention of radio, specifically wireless telegraph, by Italian engineer and inventor Guglielmo Marconi. He built on top of work of others such as Hertz and Tesla and created a device with which he was able to wirelessly ring a bell at a distance over 2 km.

On December 17 1903 the Wright brothers famously performed the first controlled flights of a motor airplane which they had built, in North Carolina. In four attempts that day they flew as far as about 260 meters and stayed in the air for up to about a minute.

Around 1915 Albert Einstein, a German physicist, completed his General Theory of Relativity, a groundbreaking physics theory that describes the fundamental nature of space and time and gives so far the best description of the Universe since Newton. This would shake the world of science as well as popular culture and would enable advanced technology including nuclear energy, space satellites, high speed computers and many others.

In 1907 Lee De Forest invented a practically usable vacuum tube, an extremely important component usable in electronic devices for example as an amplifier or a switch -- this would enable the construction of radios, telephones and later even primitive computers. The invention would lead to the electronic revolution.

In 1924 about 50% of US households owned a car.

October 22 1925 saw the invention of the transistor by Julius Lilienfeld (Austria-Hungary), a component that would replace vacuum tubes thanks to its better properties and which would become probably the most essential part of computers. At the time the invention didn't see much attention; it would only become relevant decades later.

In 1931 Kurt Gödel, a genius mathematician and logician from Austria-Hungary (in what is nowadays the Czech Republic), published revolutionary papers with his incompleteness theorems which proved that, simply put, mathematics has fundamental limits and "can't prove everything". This led to Alan Turing's publications in 1936 that nowadays stand as the foundations of computer science -- he introduced a theoretical computer called the Turing machine and with it he proved that computers, no matter how powerful, will never be able to "compute everything". Turing also predicted the importance of computers in the future and created several algorithms for future computers (such as a chess playing program).

In 1938 Konrad Zuse, a German engineer, constructed Z1, the first working electrically driven mechanical digital partially programmable computer, in his parents' house. It weighed about a ton and wasn't very reliable, but brought huge innovation nevertheless. It was programmed with punched film tapes, however programming was limited: it was NOT Turing complete and there were only 8 instructions. Z1 ran at a frequency of 1 to 4 Hz and most operations took several clock cycles. It had a 16 word memory and worked with floating point numbers. The original computer was destroyed during the war but it was rebuilt and nowadays can be seen in a Berlin museum.

In hacker culture the period between 1943 (start of building of the ENIAC computer) to about 1955-1960 is known as the Stone Age of computers -- as the Jargon File puts it, the age when electromechanical dinosaurs ruled the Earth.

In 1945 the construction of the first electronic digital fully programmable computer was completed at the University of Pennsylvania as a US Army project. It was named ENIAC (Electronic Numerical Integrator and Computer). It used 18000 vacuum tubes and 15000 relays, weighed 27 tons and ran at a frequency of 5 kHz. Punch cards were used to program the computer in its machine language; it was Turing complete, i.e. allowed using branches and loops. ENIAC worked with signed ten digit decimal numbers.

Among hackers the period between 1961 and 1971 is known as the Iron Age of computers. The period spans the time from the first minicomputer (PDP 1) to the first microprocessor (Intel 4004). This would be followed by the so called elder days.

On July 20 1969 first men landed on the Moon (Neil Armstrong and Edwin Aldrin) during the USA Apollo 11 mission. This tremendous achievement is very much attributed to the cold war in which USA and Soviet Union raced in space exploration. The landing was achieved with the help of a relatively simple on-board computer: Apollo Guidance Computer clocked at 2 MHz, had 4 KiB of RAM and about 70 KB ROM. The assembly source code of its software is nowadays available online.

Shortly after, on 29 October 1969, another historical event would happen that could be seen as the start of perhaps the greatest technological revolution yet, the start of the Internet. The first letter, "L", was sent over a long distance via ARPANET, a new experimental computer packet switching network without a central node developed by US defense department (they intended to send "LOGIN" but the system crashed). The network would start to grow and gain new nodes, at first mostly universities. The network would become the Internet.

1st January 1970 is nowadays set as the start of the Unix epoch. It is the date from which Unix time is counted. During this time the Unix operating system, one of the most influential operating systems was being developed at Bell Labs, mainly by Ken Thompson and Dennis Ritchie. Along the way they developed the famous Unix philosophy and also the C programming language, perhaps the most influential programming language in history. Unix and C would shape the technology far into the future, a whole family of operating systems called Unix-like would be developed and regarded as the best operating systems thanks to their minimalist design.

By 1977 ARPANET had about 60 nodes.

August 12 1981 would see the release of the IBM PC, a personal computer based on an open, modular architecture that would immediately be very successful and become the de facto standard of personal computers. The IBM PC was the first of the kind of desktop computers we have today. It had a 4.77 MHz Intel 8088 CPU, 16 kB of RAM and used 5.25" floppy disks.

In 1983 Richard Stallman announced his GNU project and invented free (as in freedom) software, a kind of software that is freely shared and developed by the people so as to respect the users' freedom. This kind of ethical software stands opposed to the proprietary corporate software, it would lead to creation of some of the most important software and to a whole revolution in software development and its licensing, it would spark the creation of other movements striving for keeping ethics in the information age.

1985: on November 20 the first version of the Windows operating system was sadly released by Microsoft. These systems would become the mainstream desktop operating systems despite their horrible design and they would unfortunately establish the so called Windows philosophy that would irreversibly corrupt other mainstream technology. Also in 1985 one of the deadliest software bugs appeared: that in Therac-25, a medical radiotherapy device which fatally overdosed several patients with radiation.

On April 26 1986 the Chernobyl nuclear disaster happened (the worst power plant accident in history) -- in northern Ukraine (at the time part of the USSR) a nuclear power plant exploded, contaminated a huge area with radioactivity and released a toxic radioactive cloud that would spread over Europe -- many would die either directly or indirectly (many years later due to radiation poisoning), with estimates in the many thousands. The Chernobyl area would be sealed off in a 30 km radius. It is estimated the area won't be habitable again for several thousands of years.

Around this time the Internet was not yet mainstream but it was, along with similar local networks, working and had active communities -- there was no World Wide Web yet but people were using Usenet and BBSes for "online" discussions with complete strangers and developing early "online cultures".

At the beginning of 1991 Tim Berners-Lee created the World Wide Web, a network of interlinked pages on the Internet. This marks another huge step in the Internet revolution, the Web would become the primary Internet service and the greatest software platform for publishing any kind of information faster and cheaper than ever before. It is what would popularize the Internet and bring it to the masses.

Shortly before the Soviet Union dissolved, on 25 August 1991, Linus Torvalds announced Linux, his project for a completely free as in freedom Unix-like operating system kernel. Linux would be combined with GNU and later become one of the biggest and most successful software projects in history. It would end up powering Internet servers and supercomputers as well as desktop computers of a great number of users. Linux proved that free software works and surpasses proprietary systems.

After this very recent history follows; it's hard to judge which recent events will be of historical significance much later. The 1990s saw a huge growth of computer power; video games such as Doom led to the development of GPUs and high quality computer graphics along with a wide adoption of computers by common people, which in turn helped the further growth of the Internet. During the 90s we also saw the rise of the open source movement. Shortly after 2000 Lawrence Lessig founded Creative Commons, an organization that came hand in hand with the free culture movement inspired by the free software movement. At this point over 50% of US households had a computer. Cell phones became a commonly owned item and after about 2005 so called "smart phones" and other "smart" devices replaced them as a universal communication device capable of connecting to the Internet. Before 2020 we saw a huge advancement in neural network artificial intelligence which will likely be the topic of the future. Quantum computers are being heavily researched, with primitive prototypes already existing; this will also likely be very important in the following years. Besides AI there has appeared great interest in and development of virtual reality, drones, electric cars, robotic Mars exploration and others. However society and technology have generally seen decadence after 2010: capitalism has pushed technology to become hostile and highly abusive to users, and extreme bloat makes technology highly inefficient, extremely expensive and unreliable. In addition society is dealing with a lot of serious issues such as global warming, and many people are foreseeing a collapse of society.

Recent History Of Technology

TODO: more detailed history since the start of Unix time


holy_war

Holy War

Holy war is a long passionate argument over a choice (often between two options) that touches an issue which is within a given community deemed controversial and "religious", and which to an outsider typically seems like a childish, hard to understand rant about an insignificant thing. In technology circles holy wars revolve e.g. around operating systems, programming languages, licenses, source code formatting etc. Such a war separates people into almost religious groups that sometimes argue to death about details such as what name something should be given, very much resembling traditional disagreements between religions and their churches. In holy wars people tend to defend whichever side they stand on to the death and can get emotional when discussing the topic. Some examples of holy wars are (the side taken by LRS is indicated in brackets):

Things like cats vs dogs or sci-fi vs fantasy may or may not be a holy war; the doubt comes from the fact that one can easily like both and/or not be such a diehard fan of one or the other. A subject of a holy war probably has to be something that doesn't allow too much of this.


how_to

How To

WELCOME TRAVELER

{ Don't hesitate to contact me. ~drummyfish }

Are you tired of bloat and can't stand shitty software like Windows anymore? Do you want to kill yourself? Do you hate capitalism? Do you also hate the fascist alternatives you're being offered? Do you just want to create a genuinely good bullshitless technology that would help all people? Do you just want to share knowledge freely without censorship? You have come to the right place.

Firstly let us welcome you, no matter who you are, no matter your political opinions, your past and your skills, color or shape of your genitalia, we are glad to have you here. Remember, you don't have to be a programmer to help and enjoy LRS. LRS is a lifestyle, a philosophy. Whether you are a programmer, artist, educator or just someone passing by, you are welcome, you may enjoy our culture and its fruit and if you want, you can help enrich it.

What This Article Is About

OK, let's say this is a set of general advice, life heuristics, pointers and basics of our philosophy, something to get you started, give you a point of view aligned with what we do, help you make a decision here and there, help you free yourself. Remember that by definition nothing we ever advise is a commandment or a rule you mustn't ever break, that would be wrong in itself. Some things may also yet be a "thought in progress" and subject to change.

Moderacy (middle way) Vs Extremism

An important issue of many ideologies/philosophies/religions/etc. has shown to be striking the right balance between moderacy and extremism. Let's sum up the two stances:

Where does the balance lie? TBH this is a very hard question and we don't know the correct answer so far, perhaps there is no simple answer. Figuring this out may be one of the most difficult parts of our philosophy. The first good step is definitely to realize the issue, become aware of it, and start considering it when making one's important decisions. Choosing one or the other should, as always, be done by ultimately aiming for our ideals, not for one's own benefit, though of course as a mere living being one will never be able to be completely objective and free himself from things such as fear and self-preservation instincts. If you make a bad decision, don't bash yourself, you are just a mere mortal; acknowledge your mistake, forgive yourself and move on, there is no use in torturing yourself. One should perhaps not try to stick to either extremism or moderacy as a rule, but rather try to apply a differently balanced mix of both to any important decision that appears before him -- when unsure about the balance, a middle way is probably safest, but when you strongly feel one way is morally more right, go for it.

Examples from LRS point of view:

Tech

Here are some extremely basic steps to take regarding technology and the technological aspect of LRS:

Would you like to create LRS but don't have enough spare time/money to make this possible? You can check out making living with LRS.

How To Make A Website

Making your own tiny independent website is pretty simple and a very good thing to do for being able to share opinions and files relatively freely -- using "social networks" for sharing non-mainstream stuff will not work as these get hardcore censored (yes, even the "FOSS" ones like Mastodon etc.). By making your own website you also help decentralize the web again, take a bit of control from the corporations, and you can greatly help others by sharing useful information with them.

Here we will quickly sum up how to make a static single page plain HTML website, which should suffice for most things (sharing opinions, contacts, files, multimedia, simple blogging, ...). Once you get more advanced you can do fancy stuff such as a multi-page wiki written in Markdown and compiled to HTML with a shell script, but that can wait for now.

Firstly do NOT follow mainstream tutorials on making websites -- these are absolute horseshit and just follow ugly capitalist ways, you will just get brain cancer. Also do NOT use any frameworks; do NOT even use static site generators -- these are not needed at all! All you really need for making a small website is:

For starters try to go the easiest way: use some free static site hosting without a domain name. Later, once you get comfortable, you may transition to self-hosting with your custom domain.

Now you have to make the actual website in HTML. For that create a new file and name it index.html (the name has to be exactly this as it's the default page name for websites). In it copy-paste the following:

<!DOCTYPE html>
<html>
<head>
<title> My Awesome Website </title>
</head>

<body>

<h1> My Awesome Website </h1>

</body>
</html>

This is really a bare-minimum testing website -- to expand it see the article on HTML.
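
To quickly check the result you can simply open the file in your web browser even before uploading it anywhere.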

Now you have to upload this HTML file to the hosting server -- check out the details of your hosting service on how to do this (you may e.g. need to use git or ftp to upload the file). And that's basically it, the rest is just expanding your site, making scripts to automate uploading etc.

How To Live, Dos and Don'ts

This is a summary of some main guidelines on how an LRS supporter should behave in general so as to stay consistent with LRS philosophy, however it is important that this shouldn't be taken as rules to be blindly followed -- the last thing we want is a religion of brainwashed NPCs who blindly follow orders. One has to understand why these principles are in place and even potentially modify them.


hw

Hardware

Hardware (HW), as opposed to software, are the physical parts of a computer, i.e. the circuits, the mouse, keyboard, the printer etc. Anything you can smash when the machine pisses you off.


hyperoperation

Hyperoperation

WARNING: brain exploding article

UNDER CONSTRUCTION

{ This article contains unoriginal research with errors and TODOs, read at own risk. Some really interesting and more in-depth information can be found at this nice site: http://mrob.com/pub/math/largenum.html. ~drummyfish }

Hyperoperations are mathematical operations that are generalizations/continuations of the basic arithmetic operations of addition, multiplication, exponentiation etc. Basically they're like the basic operations like plus but on steroids. When we realize that multiplication is just repeated addition and exponentiation is just repeated multiplication, it is possible to continue in the same spirit and keep inventing new operations by simply saying that a new operation means repeating the previously defined operation, so we define repeated exponentiation, which we call tetration, then we define repeated tetration, which we call pentation, etc.
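
For example tetration, i.e. repeated exponentiation (evaluated from the right, see below): 2 tetrated to 3 is 2^(2^2) = 2^4 = 16 and 2 tetrated to 4 is 2^(2^(2^2)) = 2^16 = 65536 -- we see these numbers explode insanely quickly.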

There are infinitely many hyperoperations as we can go on and on in defining new operations, however we start with what seems to be the simplest operation we can think of: the successor operation (we may call it succ, +1, ++, next, increment, zeration or similarly). In the context of hyperoperations we call this operation hyper0. Successor is a unary operator, i.e. it takes just one number and returns the number immediately after it (suppose we're working with natural numbers). In this respect successor is a bit special because all the higher operations we are going to define will be binary (taking two numbers). After successor we define the next operation, addition (hyper1), or a + b, as repeatedly applying the successor operation b times on number a. After this we define multiplication (hyper2), or a * b, as a chain of b copies of the number a which we add together. Similarly we then define exponentiation (hyper3, or raising a to the power of b). Next we define tetration (hyper4, building so called power towers), pentation (hyper5), hexation (hyper6) and so on (heptation, octation, ...).

Indeed the numbers obtained by high order hyperoperations grow quickly as fuck.

An important note is this: there are multiple ways to define the hyperoperations, the most common one seems to be by supposing the right associative evaluation, which is what we're going to implicitly consider from now on. This means that once associativity starts to matter, we will be evaluating the expression chains FROM RIGHT, which may give different results than evaluating them from left (consider e.g. 2^(2^3) != (2^2)^3). The names tetration, pentation etc. are reserved for right associativity operations.

The following is a sum-up of the basic hyperoperations as they are commonly defined (note that many different symbols are used for these operations throughout literature, often e.g. up arrows are used to denote them):

operation               symbol   meaning                                     commutative associative
successor (hyper0)      succ(a)  next after a
addition (hyper1)       a + b    succ(succ(succ(...a...))), b succs          yes         yes
multiplication (hyper2) a * b    0 + (a + a + a + ...), b as in brackets     yes         yes
exponentiation (hyper3) a ^ b    1 * (a * a * a * ...), b as in brackets     no          no
tetration (hyper4)      a ^^ b   1 * (a ^ (a ^ (a ^ (...), b as in brackets  no          no
pentation (hyper5)      a ^^^ b  1 * (a^^ (a^^ (a^^ (...), b as in brackets  no          no
hexation (hyper6)       a ^^^^ b 1 * (a^^^(a^^^(a^^^(...), b as in brackets  no          no
...                                                                          no more     no more

The following ASCII masterpiece shows the number 2 in the territory of these hyperoperations:

{ When performing these calculations, use some special calculator that allows extremely high numbers such as HyperCalc (http://mrob.com/pub/comp/hypercalc/hypercalc-javascript.html) or Wolfram Alpha. ~drummyfish }

 2    +1    +1    +1    +1    +1    +1    +1  ...     successor
 |        __/   ________/           /       9
 |       /     /     ______________/
 |      /     /     /
 2  +  2  +  2  +  2  +  2  +  2  +  2  +  2  ...     addition
 |     |4       __/                       / 16
 |     |       /     ____________________/
 |     |      /     /
 2  *  2  *  2  *  2  *  2  *  2  *  2  *  2  ...     multiplication
 |     |4     8 __/ 16    32    64    128   256           
 |     |       /     
 |     |      /                 ~10^(6 * 10^19728)
 2  ^ (2  ^ (2  ^ (2  ^ (2  ^ (2  ^ (2  ^ (2  ...     exponentiation
 |     |4     16__/ 65536 ~10^19728   ~10^(10^(10^19728))
 |     |       /             not sure about arrows here, numbers get too big, TODO
 |     |      /
 2  ^^(2  ^^(2  ^^(2  ^^(2  ^^(2  ^^(2  ^^(2  ...     tetration
 |     |4    |65536
 |     |     |         not sure about arrows here either
 |     |     |
 2 ^^^(2 ^^^(2 ^^^(2 ^^^(2 ^^^(2 ^^^(2 ^^^(2  ...     pentation
 ...    4     65536                         a lot

Some things generally hold about hyperoperations, for example for any operation f = hyperN where N >= 3 and any number x it is true that f(1,x) = 1 (just as raising 1 to anything gives 1).

Hyperroot is the generalization of square root, i.e. for example for tetration the nth hyperroot of number a is such number x that tetration(x,n) = a.

Left associativity hyperoperations: Alternatively left association can be considered for defining hyperoperations which gives different operations. However this is usually not considered because, as mentioned in the webpage above, e.g. left association tetration a ^^ b can be simplified to a ^ (a ^ (b - 1)) and so it isn't really a new operation. Anyway, here is the same picture as above, but for left associativity -- we see the numbers don't grow THAT quickly (but still pretty quickly).

 2    +1    +1    +1    +1    +1    +1    +1  ...     successor
 |        __/   ________/           /       9
 |       /     /     ______________/
 |      /     /     /
 2  +  2  +  2  +  2  +  2  +  2  +  2  +  2  ...     addition
 |     |4       __/                       / 16
 |     |       /     ____________________/
 |     |      /     /
 2  *  2  *  2  *  2  *  2  *  2  *  2  *  2  ...     multiplication
 |     |4       __/ 16    32    64    128 / 256           
 |     |       /     ____________________/
 |     |      /     /
(2  ^  2) ^  2) ^  2) ^  2) ^  2) ^  2) ^  2  ...     left exponentiation
 |     |4     16__/ 256   65536             ~3*10^38
 |     |       /     ____________________________
 |     |      /     /
(2  ^^ 2) ^^ 2) ^^ 2) ^^ 2) ^^ 2) ^^ 2) ^^ 2  ...     left tetration
 |     |4     256   2^1048576   
 |     |                        TODO: arrows?
 |     |
(2 ^^^ 2)^^^ 2)^^^ 2)^^^ 2)^^^ 2)^^^ 2)^^^ 2  ...     left pentation
 ...    4     ~3*10^38

In fact we may choose to randomly combine left and right associativity to get all kinds of weird hyperoperations. For example we may define tetration with right associativity but then use left associativity for the next operation above it (we could call it e.g. "right-left pentation"), so in fact we get a binary tree of hyperoperations here (as shown by M. Muller in his paper on this topic).

Of course, we can now go further and start inventing things such as hyperlogarithms, hyperfactorials etc.

Code

Here's a C implementation of some hyperoperations including a general hyperN operation and an option to set left or right associativity (however note that even with 64 bit ints numbers overflow very quickly here):

#include <stdio.h>
#include <inttypes.h>
#include <stdint.h>

#define ASSOC_R 1 // right associativity?

// hyper0
uint64_t succ(uint64_t a)
{
  return a + 1;
}

// hyper1
uint64_t add(uint64_t a, uint64_t b)
{
  for (uint64_t i = 0; i < b; ++i)
    a = succ(a);

  return a;
  // return a + b
}

// hyper2
uint64_t multiply(uint64_t a, uint64_t b)
{
  uint64_t result = 0;

  for (uint64_t i = 0; i < b; ++i)
    result += a;

  return result;
  // return a * b
}

// hyper(n + 1) for n > 2
uint64_t nextOperation(uint64_t a, uint64_t b, uint64_t (*operation)(uint64_t,uint64_t))
{
  if (b == 0)
    return 1;

  uint64_t result = a;

  for (uint64_t i = 0; i < b - 1; ++i)
    result = 
#if ASSOC_R
      operation(a,result);
#else
      operation(result,a);
#endif

  return result;
}

// hyper3
uint64_t exponentiate(uint64_t a, uint64_t b)
{
  return nextOperation(a,b,multiply);
}

// hyper4
uint64_t tetrate(uint64_t a, uint64_t b)
{
  return nextOperation(a,b,exponentiate);
}

// hyper5
uint64_t pentate(uint64_t a, uint64_t b)
{
  return nextOperation(a,b,tetrate);
}

// hyper6
uint64_t hexate(uint64_t a, uint64_t b)
{
  return nextOperation(a,b,pentate);
}

// hyper(n)
uint64_t hyperN(uint64_t a, uint64_t b, uint8_t n)
{
  switch (n)
  {
    case 0: return succ(a); break;
    case 1: return add(a,b); break;
    case 2: return multiply(a,b); break;
    case 3: return exponentiate(a,b); break;
    default: break;
  }

  if (b == 0)
    return 1;

  uint64_t result = a;

  for (uint64_t i = 0; i < b - 1; ++i)
    result = hyperN(
#if ASSOC_R
      a,result
#else
      result,a
#endif
      ,n - 1);

  return result;
}

int main(void)
{
  printf("\t0\t1\t2\t3\n");

  for (uint64_t b = 0; b < 4; ++b)
  {
    printf("%" PRIu64 "\t",b);

    for (uint64_t a = 0; a < 4; ++a)
      printf("%" PRIu64 "\t",tetrate(a,b));

    printf("\n");
  }

  return 0;
}

In this form the code prints a table for right associativity tetration:

        0       1       2       3
0       1       1       1       1
1       0       1       2       3
2       1       1       4       27
3       0       1       16      7625597484987
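
For example the bottom right value is tetrate(3,3) = 3^(3^3) = 3^27 = 7625597484987; going much higher (e.g. tetrate(3,4) or tetrate(4,3)) already overflows even the 64 bit integers.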

implicit

Implicit

Implicit means something that's assumed unless stated otherwise; it is the opposite of explicit. For example many floating point formats assume an implicit (not physically stored) bit with value 1 prepended to the explicitly stored mantissa values. Though not the same, the term implicit is similar to default; for example an implicit/default background color of some image format may be defined as white, meaning that unless background color is stated, we suppose the background to be white (though a default value may still be explicitly stored; default just means an initial, unchanged value). Implicit values may be important e.g. for saving space -- imagine we have some dataset in which 90% of values are zero; then it is convenient to state zero to be the implicit value and not store such values, by which we'll save 90% of space.
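
As a tiny demonstration in C (a mere sketch, all names here are made up), a sparse array may declare zero its implicit value and physically store only the exceptions:

#include <stdio.h>

// sparse array: zero is the implicit value, only non-zero items are stored
typedef struct
{
  unsigned int index;
  int value;
} NonZeroItem;

NonZeroItem sparseArray[] = {{3, 7}, {100, -2}}; // all other items are 0

int getItem(unsigned int index)
{
  for (unsigned int i = 0; i < sizeof(sparseArray) / sizeof(NonZeroItem); ++i)
    if (sparseArray[i].index == index)
      return sparseArray[i].value;

  return 0; // the implicit value
}

int main(void)
{
  printf("%d %d %d\n",getItem(3),getItem(50),getItem(100)); // prints 7 0 -2
  return 0;
}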


infinity

Infinity

Infinity (from Latin in and finis, without end) is a quantity so unimaginably large that it has no end. It plays a prominent role especially in mathematics and philosophy. As a "largest imaginable quantity" it is sometimes seen to be the opposite of the number zero, the "smallest possible quantity", though other "opposites" can be thought of too, such as minus infinity or an infinitely small non-zero number (infinitesimal). The symbol for infinity is the lemniscate, the symbol 8 turned 90 degrees (unicode U+221E). Keep in mind that mere lack of boundaries doesn't imply infinity -- a circle has no end but is not infinite; an infinity implies there is always more, no matter how much we get.

The concept of infinity first came to be explored by philosophers -- as an abstract concept (similar to those of e.g. zero or negative numbers) it took a while for it to evolve, be explored and accepted. We can't say who first "discovered" infinity, civilizations often had concepts similar to it that were connected for example to their gods. Zeno of Elea (5th century BC) was one of the earliest to tackle the issue of infinity mathematically by proposing paradoxes such as that of Achilles and the tortoise.

The term infinity has two slightly distinct meanings:

It could be argued that potential infinity is really the reason for the existence of true, high level mathematics as we know it, as that is concerned with constructing mathematical proofs -- such proofs are needed anywhere where there exist infinitely many possibilities, as if there was only a finite number of possibilities, we could simply enumerate and check them all without much thinking (e.g. with the help of a computer). For example to confirm Fermat's Last Theorem ("for positive whole numbers a, b, c and n > 2 the equation a^n + b^n = c^n doesn't have a solution") we need a logical proof because there are infinitely many numbers; if there were only finitely many numbers, we could simply check them all and see if the theorem holds. So infinity, in a sense, is really what forces mathematicians to think.

Is infinity a number? Usually no, but it depends on the context. Infinity is not a real number (which we usually understand by the term "number") because that would break the nice field structure of real numbers, so the safe implicit answer to the question is no, infinity is not a traditional number, it is rather a concept closely related to numbers. However infinity may many times behave like a number and we may want to treat it so -- for example the result of computing a limit may be a real number but also infinity; so ultimately everything depends on our definition of what a number is and we can declare infinity to be a number in some systems, for example there exists so called extended real number line which consists of real numbers and plus/minus infinity, which ARE treated as numbers.

An important term related to the term infinite is infinitesimal, or infinitely small, a concept very important e.g. for calculus. While the "traditional" concept of infinity looks beyond the greatest numbers imaginable, the concept of infinitely small is about being able to divide (or "zoom in", see also fractals) without end; for example in the realm of real numbers we may start at number 1 and keep moving closer and closer towards zero without ever reaching the "smallest nonzero number", as no matter how close to zero we are, we may always divide our distance by two. A term also related to this is limit, which helps us explore values "infinitely close", "infinitely far" etc.

When treated as cardinality (i.e. size of a set), we conclude that there are many infinities, some larger than others, for example there are infinitely many rational numbers and infinitely many real numbers, but in a sense there are more real numbers than rational ones -- this is very counter intuitive, but nevertheless was proven by Georg Cantor in 1874. He showed that it is possible to create a 1 to 1 pairing of natural numbers and rational numbers and so that these sets are of the same size -- he called this kind of infinity countable -- then he showed it is not possible to make such pairing with real numbers and so that there are more real numbers than rational ones -- he called this kind of infinity uncountable. Furthermore this hierarchy of "larger and larger infinities" goes on forever, as for any set we can always create a set with larger cardinality e.g. by taking its power set (a set of all subsets).

In regards to programming: programmers are often just engineers and so simplify the subject of infinity in a way which to a mathematician would seem unacceptable. For example it is often a good enough approximation of infinity to just use an extremely large number value, e.g. the largest one storable in given data type, which of course has its limitations, but in practice just werks (just watch out for overflows). Programmers also often resort to breaking the mathematical rules, e.g. they may accept that x / 0 = infinity, infinity + infinity = infinity etc. Systems based on symbolic computation may be able to handle infinity with exact mathematical precision. Advanced data types, such as floating point, often have a special value for infinity -- IEEE 754 floating point, for example, is capable of representing positive and negative infinity.
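
For instance in C (assuming C99 and its math.h) the IEEE 754 infinity can be played with like this -- a small demonstration, not an exhaustive treatment:

#include <stdio.h>
#include <math.h>

int main(void)
{
  double inf = INFINITY; // positive infinity from math.h (C99)

  printf("%f\n",inf);         // prints inf
  printf("%f\n",-1 * inf);    // prints -inf
  printf("%f\n",inf + 100);   // still inf
  printf("%d\n",10000 < inf); // 1, infinity is greater than any number
  printf("%f\n",1.0 / inf);   // 0.0
  printf("%f\n",inf - inf);   // nan, undefined even here

  return 0;
}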

WATCH OUT: infinite universe doesn't imply existence of everything -- it is a common fallacy to think it does. For example people tend to think that since the decimal expansion of the digits of pi is infinite and basically "random", any finite string of digits should appear somewhere in it; this doesn't follow from the mere fact that the series is infinite (though the conclusion MAY or may not be true, we don't actually know this about pi yet). Imagine for example the infinite series of even numbers -- there are infinitely many numbers in it, but you will never find any odd number there.


information

Information

Information wants to be free.

Information (from Latin informare: shape/describe/represent) is knowledge that can be used for making decisions. Information is interpreted data, i.e. while data itself may not give us any information, e.g. if they're encrypted and we don't know the key or if we simply don't know what the data signifies or implies, information emerges once we make sense of the data. Information is contained e.g. in books, on the Internet, in nature, and we access it through our senses. Computers can be seen as machines for processing information and since the computer revolution information has become the focus of our society; we often encounter terms such as information technology, informatics, information war, information age etc. Information theory is a scientific field studying information.

Information wants to be free, i.e. it is free naturally unless we decide to limit its spread with shit like intellectual property laws. What does "free" mean? It is the miraculous property of information that allows us to duplicate it basically without any cost. Once we have certain information, we may share it with others without having to give up our own knowledge of the information. A file on a computer can be copied to another computer without deleting the file on the original computer. This is unlike physical products, which we lose ourselves once we give them to someone. Imagine if you could make a piece of bread and then duplicate it infinitely for the whole world -- information works like this! We see it as a crime to want to restrict such a miracle. We may also very nicely store information in our heads. For all this information is beautiful. It is sometimes discussed whether information is created or discovered -- if a mathematician comes up with an equation, is it his creation or simply his discovery of something that belongs to the nature and that has always been there? This question isn't so important because whatever terms we use, we at LRS decide to create, spread and freely share information without limiting it in any way, i.e. neither discovery nor invention should give rise to any kind of property.

In computer science the basic unit of information amount is 1 bit (for binary digit), also known as shannon. It represents a choice of two possible options, for example an answer to a yes/no question (with each answer being equally likely), or one of two binary digits: 0 or 1. From this we derive higher units such as bytes (8 bits), kilobytes (1000 bytes) etc. Other units of information include nat or hart. With enough bits we can encode any information including text, sounds and images. For this we invent various formats and encodings with different properties: some encodings may for example contain redundancy to ensure the encoded information is preserved even if the data is partially lost. Some encodings may try to hide the contained information (see encryption, obfuscation, steganography). For processing information we create algorithms which we usually execute with computers. We may store information (contained in data) in physical media such as books, computer memory or computer storage media such as CDs, or even with traditional potentially analog media such as photographs.

Keep in mind that the amount of physically present bits doesn't have to equal the amount of information because, as mentioned above, data that takes N bits may e.g. utilize redundancy and so store less information than would theoretically be possible with N bits. It may happen that the stored bits are correlated for any reason or different binary values convey the same information (e.g. in some number encodings there are two values for number zero: positive and negative). All this means that the amount of information we receive in N bit data may be lower (but never higher) than N bits, i.e. if we e.g. store a file on a 1 GB flash drive, the actual theoretical information contained may be lower -- the exact size of such theoretical information depends on probabilities of what can really appear in the file and MAY CHANGE with the knowledge we possess, i.e. the amount of information stored on the flash drive may change by simply us coming to know that the file stored on the drive is a movie about cats which rules out many combinations of bits that can be stored there. Imagine a simplified case when there is a file which says whether there exist infinitely many prime numbers -- to a mathematician who already knows the answer the file gives zero information, while to someone who doesn't know the answer the file provides 1 bit of information. However in practice we often make the simplification of equating the amount of physically present bits to the contained "information".

Information is related to information entropy (also Shannon entropy, similar to but distinct from the concept of thermodynamic entropy in physics); they're both measured in the same units (usually bits) but entropy measures a kind of "uncertainty" or average information received from a certain event when we know its probability distribution -- in a sense information and entropy can be seen as opposites: before we receive information we lack the information but there exists entropy, once we receive the information there is information but no entropy.
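
As a small concrete example, the following C function (just a sketch; it supposes the probabilities sum to 1 and that you link the math library, e.g. with -lm) computes the Shannon entropy, in bits, of a given probability distribution:

#include <stdio.h>
#include <math.h>

// entropy (in bits) of a discrete probability distribution
double entropy(const double *probabilities, int count)
{
  double result = 0;

  for (int i = 0; i < count; ++i)
    if (probabilities[i] > 0) // we take 0 * log2(0) to be 0
      result -= probabilities[i] * log2(probabilities[i]);

  return result;
}

int main(void)
{
  double fairCoin[] = {0.5, 0.5},   // completely unpredictable
         biasedCoin[] = {0.9, 0.1}; // more predictable, less entropy

  printf("%f\n",entropy(fairCoin,2));   // prints 1.000000
  printf("%f\n",entropy(biasedCoin,2)); // prints about 0.469

  return 0;
}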

In signal theory information is also often used as a synonym for signal, however a distinction can be made: signal is the function that carries information. Here we also encounter the term noise which means an unwanted signal mixed in with the desired signal which may make it harder to extract the information carried by the signal, or even obscure some or all of the information so that it can't be retrieved.

According to the theory of relativity information can never travel faster than light -- even if some things may move faster than light, such as a shadow, so called "spooky action at a distance" (usually associated with quantum entanglement) or even matter due to the expansion of space, by our best knowledge we can never use this to transfer information faster than light. For this it seems our communication technology will always be burdened by lag, no matter how sophisticated.


intellectual_property

"Intellectual Property"

"Intellectual property" (IP, not to be confused with IP address) is a toxic capitalist idea that says that people should be able to own information (such as ideas, presentation style, songs or text) and that it should be treated in ways very similar to physical property. For example patents are one type of intellectual property which allow an inventor of some idea to own that idea and be able to limit its use and charge money to people using that idea, or prevent people from using that idea altogether. Copyright is probably the most harmful of IP today, and along with patents the most relevant one in the area of technology. However, IP encompasses many other subtypes of this kind of "property" such as trademarks, trade dress, plant varieties etc. IP is an arbitrarily invented grant of monopoly on information, i.e. something that is otherwise naturally free.

Most people with brain oppose this idea, see e.g. http://harmful.cat-v.org/economics/intellectual_property/.

IP exists to benefit corporations, it artificially limits the natural freedom of information (see artificial scarcity) and tries to eliminate freedom and competition, it fuels consumerism (for example a company can force deletion of old version of its program in order to force users to buy the new version), it helps keep malicious features in programs (by forbidding any study and modifications) and forces reinventing wheels which is extremely energy and resource wasting. Without IP, everyone would be able to study, share, improve and remix and combine existing technology and art.

Many people protest against the idea of IP -- either wanting to abandon the idea completely, as we do, or at least arguing for a great relaxation of the insanely strict and aggressive forms that destroy our society. Movements such as free software and free culture have come into existence in protest of IP laws. Of course, capitalists don't give a shit. It can be expected the IP cancer will be reaching even more extreme forms very soon, for example it will be perpetual and encompassing such things as mere thought (thoughts will be monitored and people will be charged for thinking about ideas owned by corporations).

It must be noted that as of 2020 it is not possible to avoid the IP shenanigans. Even though we can eliminate most of the harmful stuff (for now) with licenses and waivers, there are many things that may be impossible to address or that pose considerable dangers, e.g. trademark, personal rights or patent troll attacks. In some countries (US) it is illegal to make free programs that try to circumvent DRM. Some countries make it explicitly impossible to e.g. waive copyright. It is impossible to safely check whether your creation violates someone else's IP. There exists shit such as moral rights that may apply even if copyright doesn't.


interesting

Interesting

This is a great answer to anything, if someone tells you something you don't understand or something you think is shit and you don't know what to say, you just say "interesting".

All natural numbers are interesting: there is a fun proof by contradiction of this. Suppose there exists a non-empty set of uninteresting numbers which is a subset of natural numbers; then the smallest of these numbers is interesting precisely by being the smallest uninteresting number -- we've arrived at a contradiction, therefore a non-empty set of uninteresting numbers cannot exist.

TODO: just list some interesting shit here


internet

Internet

Internet is the grand, decentralized global network of interconnected computer networks that allows advanced, cheap, practically instantaneous intercommunication of people and computers and sharing of large amounts of data and information. Over just a few decades since its birth in the 1970s it changed society tremendously, shifted it to the information age and stands as possibly the greatest technological invention of our society. It is a platform for many services and applications such as the web, e-mail, internet of things, torrents, phone calls, video streaming, multiplayer games etc. Of course, once the Internet became accessible to normal people and became the largest public forum on the planet, it has also become the biggest dump of retards in history.

Internet is built on top of protocols (such as IP, HTTP or SMTP), standards, organizations (such as ICANN, IANA or W3C) and infrastructure (undersea cables, satellites, routers, ...) that all together work to create a great network based on packet switching, i.e. a method of transferring digital data by breaking them down into small packets which independently travel to their destination (contrast this to circuit switching). The key feature of the Internet is its decentralization, i.e. the attribute of having no central node or authority so that it cannot easily be destroyed or taken control of -- this is by design, the Internet evolved from ARPANET, a project of the US defense department. Nevertheless there are parties constantly trying to seize at least partial control of the Internet such as governments (e.g. China and its Great Firewall, EU with its "anti-pedophile" chat monitoring laws etc.) and corporations (by creating centralized services such as social networks). Some are warning of possible de-globalization of the Internet that some parties are trying to carry out, which would turn the Internet into so called splinternet.

Access to the Internet is offered by ISPs (internet service providers) but it's pretty easy to connect to the Internet even for free, e.g. via free wifis in public places, or in libraries. By 2020 more than half of the world's population had access to the Internet -- most people in the first world have practically constant, unlimited access to it via their smartphones, and even in poor countries capitalism makes these devices along with Internet access cheap as people constantly carrying around devices that display ads and spy on them is what allows their easy exploitation.

The following are some stats about the Internet as of 2022: there are over 5 billion users world-wide (more than half of them from Asia and mostly young people) and over 50 billion individual devices connected, about 2 billion websites (over 60% in English) on the web, hundreds of billions of emails are sent every day, average connection speed is 24 Mbps, there are over 370 million registered domain names (most popular TLD is .com), Google performs about 7 billion web searches daily (over 90% of all web searches).

History

see also history

TODO (see https://www.zakon.org/robert/internet/timeline/)

See Also


interplanetary_internet

Interplanetary Internet

Interplanetary Internet is at this time still a hypothetical extension of the Internet to multiple planets. As mankind is getting closer to starting living on other planets and bodies such as Mars and the Moon, we have to start thinking about the challenges of creating a communication network between all of them. The greatest challenge is posed by the vast distances that increase the communication delay (which arises due to the limited speed of light) and make errors such as packet loss much more painful. Two-way communication (i.e. request-response) to the Moon and Mars can take even 2 seconds and 40 minutes respectively. Also things like planet motions, eclipses etc. pose problems to solve.
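
(To check the numbers: the Moon is about 384000 km away, so light makes the round trip in roughly 2 * 384000 / 300000 ~= 2.6 seconds; the distance of Mars varies between about 55 and 400 million km depending on where the planets currently are, giving a one-way delay of very roughly 3 to 22 minutes and a round trip of up to about 44 minutes.)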

We can see that e.g. real time Earth-Mars communication (e.g. chat or videocalls) is physically impossible, so not only do we have to create new network protocols that minimize the there-and-back communication (things such as handshakes are out of the question) and implement great redundancy for reliable recovery from loss of data traveling through space, we also need to design new user interfaces and communication paradigms, i.e. we probably need to create new messaging software for "interplanetary chat" that will for example show the earliest time at which the sender can expect an answer etc. Interesting shit to think about.

{ TFW no Xonotic deathmatches with our Moon friends :( ~drummyfish }

For things like Web, each planet would likely want to have its own "subweb" (distinguished e.g. by TLDs) and caches of other planets' webs for quick access. This way a man on Mars wouldn't have to wait 40 minutes for downloading a webpage from the Earth web but could immediately access that webpage's slightly delayed version, which is of course much better.

Research into this has already been ongoing for some time. InterPlaNet is a protocol developed by NASA and others to be the basis for interplanetary Internet.

See Also


interpolation

Interpolation

Interpolation (inter = between, polire = to polish) means computing (usually a gradual) transition between some specified values, i.e. creating additional intermediate points between some already existing points. For example if we want to change a screen pixel from one color to another in a gradual manner, we use some interpolation method to compute a number of intermediate colors which we then display in rapid succession; we say we interpolate between the two colors. Interpolation is a very basic mathematical tool that's commonly encountered almost everywhere, not just in programming: some uses include drawing a graph between measured data points, estimating function values in unknown regions, creating smooth animations, drawing vector curves, digital to analog conversion, enlarging pictures, blending transitions in videos and so on. Interpolation can be used to generalize, e.g. if we have a mathematical function that's only defined for whole numbers (such as factorial or Fibonacci sequence), we may use interpolation to extend that function to all real numbers. Interpolation can also be used as a method of approximation (consider e.g. a game that runs at 60 FPS to look smooth but internally only computes its physics at 30 FPS and interpolates every other frame so as to increase performance). All in all interpolation is one of the most important things to learn.

The opposite of interpolation is extrapolation, an operation that extends, i.e. creates points OUTSIDE the given interval (while interpolation creates points INSIDE the interval). Both interpolation and extrapolation are similar to regression which tries to find a function of specified form that best fits given data (unlike interpolation it usually isn't required to hit the data points exactly but rather e.g. minimize some kind of distance to these points).

There are many methods of interpolation which differ in aspects such as complexity, number of dimensions, type and properties of the mathematical curve/surface (polynomial degree, continuity/smoothness of derivatives, ...) or number of points required for the computation (some methods require knowledge of more than two points).


      .----B           _B          _.B        _-'''B-.
      |              .'          .'         .'
      |           _-'           /          :
      |         .'            .'          /
 A----'       A'           A-'        _.A'

  nearest       linear       cosine         cubic

A few common 1D interpolation methods.

The base case of interpolation takes place in one dimension (imagine e.g. interpolating sound volume, a single number parameter). Here interpolation can be seen as a function that takes as its parameters the two values to interpolate between, A and B, and an interpolation parameter t, which takes a value from 0 to 1 -- this parameter says the percentage position between the two values, i.e. for t = 0 the function returns A, for t = 1 it returns B and for other values of t it returns some intermediate value (note that this value may in certain cases be outside the A-B interval, e.g. with cubic interpolation). The function can optionally take additional parameters, e.g. cubic interpolation requires also specifying the slopes at the points A and B. So the function signature in C may look e.g. as

float interpolate(float a, float b, float t);

Many times we apply our interpolation not just to two points but to many points, by segments, i.e. we apply the interpolation between each two neighboring points (a segment) in a series of many points to create a longer curve through all the points. Here we are usually interested in how the segments transition into each other, i.e. what the whole curve looks like at the locations of the points.

Nearest neighbor is probably the simplest interpolation (so simple that it's sometimes not even called an interpolation, even though it technically is). This method simply returns the closest value, i.e. either A (for t < 0.5) or B (otherwise). This creates kind of sharp steps between the points, the function is not continuous, i.e. the transition between the points is not gradual but simply jumps from one value to the other at one point.

Linear interpolation (so called lerp) is probably the second simplest interpolation which steps from the first point towards the second in a constant step, creating a straight line between them. This is simple and good enough for many things, the function is continuous but not smooth, i.e. there are no "jumps" but there may be "sharp turns" at the points, the curve may look like a "saw".

Cosine interpolation uses part of the cosine function to create a continuous and smooth line between the points. The advantage over linear interpolation is the smoothness, i.e. there aren't "sharp turns" at the points, just as with the more advanced cubic interpolation, over which cosine interpolation has the advantage of still requiring only the two interval points (A and B); the price is the disadvantage of always having the same horizontal slope at each point, which may look weird in some situations (e.g. multiple points lying on the same sloped line will result in a curve that looks like smooth steps).

Cubic interpolation can be considered a bit more advanced, it uses a polynomial of degree 3 and creates a nice smooth curve through multiple points but requires knowledge of one additional point on each side of the interpolated interval (this may create slight issues with the first and last point of the sequence of values). This is so as to know at what slope to approach an endpoint so as to continue in the direction of the point behind it.

The above mentioned methods can be generalized to more dimensions (the number of dimensions is equal to the number of interpolation parameters) -- we encounter this a lot e.g. in computer graphics when upscaling textures (sometimes called texture filtering). 2D nearest neighbor interpolation creates "blocky" images in which pixels simply "get bigger" but stay sharp squares if we upscale the texture. Linear interpolation in 2D is called bilinear interpolation and is visually much better than nearest neighbor, bicubic interpolation is a generalization of cubic interpolation to 2D and is yet smoother than bilinear interpolation.

Here is a simple example of linear interpolation in C, done in fixed point so as to avoid floats entirely -- just a small sketch (the function name etc. is made up), with the interpolation parameter t going from 0 to 255 instead of from 0 to 1:
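
#include <stdio.h>

// linear interpolation without floats: t goes from 0 (returns a)
// to 255 (returns b), i.e. 255 here plays the role of 1.0
int interpolateLinear(int a, int b, int t)
{
  return a + ((b - a) * t) / 255;
}

int main(void)
{
  for (int t = 0; t <= 255; t += 51) // interpolate from 100 to 200
    printf("%d ",interpolateLinear(100,200,t));

  putchar('\n'); // prints: 100 120 140 160 180 200

  return 0;
}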

See Also


ioccc

International Obfuscated C Code Contest

The International Obfuscated C Code Contest (IOCCC for short) is an annual online contest in making the most creatively obfuscated programs in C. It's kind of a "just for fun" thing but similarly to esoteric languages there's an element of art and clever hacking that carries a great value. While the productivity freaks will argue this is just a waste of time, the true programmer appreciates the depth of knowledge and creative thinking needed to develop a beautifully obfuscated program. The contest has been running since 1984 and was started by Landon Curt Noll and Larry Bassel.

Unfortunately some shit is flying around IOCCC too, for example confusing licensing -- having a CC-BY-SA license in website footer and explicitly prohibiting commercial use in the text, WTF? Also the team started to use Microshit's GitHub. They also allow latest capitalist C standards, but hey, this is a contest focused on ugly C, so perhaps it makes sense.

Hacking the rules of the contest is also encouraged and there is an extra award for "worst abuse of the rules".

Some common ideas employed in the programs include:

And let us also mention a few winning entries:


io

Input/Output

In programming input/output (I/O or just IO) refers to communication of a computer program with the outside environment, for example with the user in real world or with the operating system. Input is information the program gets from the outside, output is information the program sends to the outside. I/O is a basic and very important term as it separates any program to two distinct parts: the pure computational system (computation happening "inside") and I/O which interconnects this system with the real world and hence makes it useful -- without I/O a program would be practically useless as it couldn't get any information about the real world and it couldn't present computed results. In hardware there exists the term "I/O device", based on the same idea -- I/O devices serve to feed input into and/or get output from a physical computer, for example keyboard is an input device and monitor is an output device (a computer without I/O devices would be useless just as a program without I/O operations).

Note that I/O is not just about communication with a human user, it also means e.g. communication over network, reading/writing from/to files etc.

It is possible to have no input (e.g. a demo), but having no output at all probably makes no sense (see also write-only).

I/O presents a challenge for portability! While the "pure computation" part of a program may be written in a pure platform-independent language such as C (and can therefore easily be compiled on different computers), the I/O part of the program usually requires some platform specific library or a library with many dependencies; for example to display pictures on screen one may use SDL, OpenGL, Linux framebuffer, CSFML, X11, Wayland and many other libraries, each one handling I/O a bit differently. Whatever library you choose, it may be unavailable on some other platform, so the program won't run there. Some hardware platforms (e.g. many game consoles) even have their own exclusive I/O library, use of which will just tie the program to that single platform. There are programming languages and libraries that try to provide platform-independent I/O, but such approach is limited as it has to suppose some common features that will be available everywhere; for example C has a standard platform-independent I/O library stdio, but it only allows text input/output, for anything advanced such as graphics, sound and mouse one has to choose some 3rd party library. Unix philosophy also advises to only use text I/O if possible, so as to "standardize" and tame I/O a bit, but then again one has to choose what communication protocol/format to use etc. So generally I/O is a problem we have to deal with.

How to solve this? By separating I/O code from the "pure computation" code, and by minimizing and abstracting the I/O code so that it is easily replaceable. Inexperienced programmers often make the mistake of mixing the pure computation code with I/O code -- it is then very difficult to replace such I/O code with different I/O code on a different platform. See portability for more detail.

I/O also poses problems in some programming paradigms, e.g. in functional programming.

As a trivial example, here is a small C program that keeps the pure computation strictly separated from the I/O code as advised above (just a sketch, the function names are made up):
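
#include <stdio.h> // the only I/O dependency, easily replaced

// pure computation part: no I/O at all, trivially portable
int countOdd(const int *numbers, int count)
{
  int result = 0;

  for (int i = 0; i < count; ++i)
    result += numbers[i] % 2 != 0;

  return result;
}

// I/O part: only this has to be rewritten when porting elsewhere
int main(void)
{
  int numbers[] = {1, 2, 3, 4, 5, 6, 7};

  printf("odd numbers: %d\n",countOdd(numbers,7)); // prints 4

  return 0;
}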


island

Welcome to the Island!

This is the freedom island where we live! Feel free to build your house on any free spot. Planting trees and making landscape works are allowed too.

                              ____
                          __X/    '-X_
    '-~-.           ____./  i   X     '-__       
               __.-'   /'  XX   i         \_      '-~-.
        ___,--' x  x_/'    Xi     O         '-_
    ___/       __-''   X  X(      i    x       '-._
 _-'                   i  i          [T]  xX  x    ''-._
(                          O      :      ixx            \
 '-                                \_                    )
   ''-__                             '.  ____      ____-'
        ''--___    [D]      ; x        \/    ''---' 
               ''--__         ;xX       \__
                     \          iX         ''-__
   '-~-.             /           i  O           '--__
                    |               i                \
           '-~-.     \__                              )
'-~-.                    ''--___                  ____/
                               ''--__________--''

D: drummyfish's house

T: The Temple, it has nice view of the sea and we go meditate here, it's a nice walk.

jargon_file

Jargon File

Jargon File (also Hacker's Dictionary) is a computer hacker dictionary/compendium that's been written and updated by a number of prominent hackers, such as Richard Stallman and Eric S. Raymond, since 1975. It is a greatly important part of hacker culture and has also partly inspired this very wiki.

It informally states that it's in the public domain and some people have successfully published it commercially, however there is no standard waiver or license -- maybe because such waivers didn't really exist at the time it was started -- and so we have to suppose it is NOT formally free as in freedom. Nevertheless it is freely accessible e.g. at Project Gutenberg and no one will bother you if you share it around... we just wouldn't recommend treating it as true public domain.

It is pretty nicely written with great amount of humor and good old political incorrectness, you can e.g. find the definition of terms such as rape and clit mouse. Some other nice terms include smoke emitting diode (broken diode), notwork (non-functioning network), Internet Exploiter, binary four (giving a finger in binary), Kamikaze packet or Maggotbox (Macintosh). At the beginning the book gives some theory about how the hacker terms are formed (overgeneralization, comparatives etc.).


java

Java

Unfortunately 3 billion devices run Java.

Java (not to be confused with JavaScript) is a highly bloated and inefficient "programming language" that's sadly kind of popular. It is compiled to bytecode and therefore "platform independent" (as long as the platform has a lot of resources to waste on running the Java virtual machine). Some of the features of Java include bloat, slow loading, slow running, supporting capitalism, forced and unavoidable object obsession and the necessity to create a billion files to write even a simple program.

Avoid this shit.

{ I've met retards who seriously think Java is more portable than C lol. I wanna suicide myself. ~drummyfish }


javascript

JavaScript

JavaScript (not to be confused with the completely unrelated Java language) is a bloated programming language used mainly on the web.

TODO

"11" + 1  // "111"
"11" - 1  // 10

jesus

Jesus

Jesus Christ (also Jesus of Nazareth, about 4 BC to 33 AD) was a jewish carpenter and preacher that was said to be the son of God, whose life along with supposed miracles he performed is described by the Bible (specifically its New Testament), and who is the center figure of Christianity, the world's largest religion; as such he is probably the most famous of all men in history (probably followed by Hitler, kind of his opposite). In fact we count our years more or less from his birth. He gained many followers as he preached that God had decided to change his laws a bit and accept all "well behaved" people into his heaven kingdom (i.e. not just jews as was the case until then). For causing a great social disturbance by this he was later crucified, as he himself predicted -- according to the Bible he sacrificed himself by this to redeem the sins of all people, was resurrected after death and rose up to heaven to dwell by God's side. Without subscribing to any mass religion or even having to believe in god, our LRS is greatly aligned with much of the teaching of Jesus Christ, especially that of nonviolence, love of all people (even one's "enemies"), modesty, frugality etc.

As perhaps the most influential man in history whose image has been twisted, used and abused over the centuries, we have to nowadays distinguish two separate characters:

Jesus was anarchist, pacifist, communist and explicitly rejected capitalism, though stupid American capital and military worshippers tattoo shit such as "What would Jesus do?" on their asses, they somehow seem to masterfully apply selective blindness and completely ignore his quotes such as:

(Americans are stupid idiots who say they love Jesus but rather love to reference the Old Testament for their pragmatic life decisions, however the law of the Old Testament was explicitly cancelled by Jesus and updated to a new one, based on love and nonviolence rather than violence, punishment and revenge -- this is the whole point of why Jesus came to Earth in the first place. But as it's been said, Americans are stupid.)

fun facts about Jesus:

TODO: moar


john_carmack

John Carmack

John Carmack is a brilliant, legendary programmer who contributed mostly to computer graphics and stands behind the engines of such games as Doom, Wolfenstein and Quake. He helped pioneer real-time 3D graphics and created many hacks and algorithms (e.g. the reverse shadow volume algorithm). He is also a rocket engineer.

  ____
 /_____\
 |O-O''@
 | _  |
 \____/

ASCII art of John Carmack

He's kind of the ridiculously stereotypical nerd with glasses that just from the way he talks gives out the impression of someone with high functioning autism. You can just sense his IQ is over 9000. Some nice shit about him can be read in the (sadly proprietary) book Masters of Doom.

Carmack is a proponent of FOSS and has released his old game engines as such which gave rise to an enormous amount of modifications, forked engines and even new games (e.g. Freedoom and Xonotic). He's probably leaning more towards the dark side of the source: the open-source. In 2021 Carmack tweeted that he would have rather licensed his old Id engines under a permissive BSD license than the GPL, which is good.

In 2013 he sadly sold his soul to Facebook to work on VR (in a Facebook owned company Oculus).


jokes

Jokes

Here you can shitpost your jokes that are somehow related to this wiki's topic. Just watch out for copyright (no copy-pasting jokes from other sites)!

Please do NOT post lame "big-bang-theory"/9gag jokes like sudo make sandwich or there are 10 types of people.

{ Many of the jokes are original, some are shamelessly pulled from other sites and reworded. I don't believe copyright can apply if the expression of a joke is different, ideas can't be copyrighted. Also the exact origins of jokes are difficult to track so it's probably a kind of folklore. ~drummyfish }

See Also


julia_set

Julia Set

TODO

 ___________________________________________________________________
| Julia Set for -0.34 - 0.63i       :.                              |
|                                ..':. ..                           |
|                                '':.:'''      ..       ..          |
|                                 :':::.. ''  ::.    .. :.'         |
|                                  '::::. :: :::. .   :.'': .       |
|                              ......':::.::.:: ...::.:::.::.. .    |
|                              :::::::::':'.':.::::::::':.::''::..  |
|                   .             '::::'':':::':'':::'  ::''  '     |
|                   ':.       .   .. ..::'::':::.   '   :'          |
|                 . :: :'     ::..::::::: ::: ':::..     '          |
|                   :'::::   '.:::::::::'.::::'  ''                 |
|                    .:::::' ':::::::::. ''::::'.                   |
|                  :. '::::'.::::::::::.  '::':.'                   |
|          . .   '':::. ::: ::::::::'::'    .::::                   |
|         :':.  ... ':::.:':::''  '  '        ''.                   |
|        ..::  .::::::...':.::::::.:                                |
|   :::...' '.::::::::'.: .:.:'::::'':                              |
|    '' :. : .:''':' :::'::':::.   ' '                              |
|         '::'': '' '::: ::'':::::                                  |
|          ::       ':.  '' '':::.:                                 |
|         ' '       '        ::.:.'.'                               |
|                              ::'                                  |
|                              '                                    |
|___________________________________________________________________|

Code

The following code is a simple C program that renders a given Julia set into the terminal (for demonstrative purposes; it isn't efficient and doesn't do any antialiasing).

#include <stdio.h>

#define ROWS 30
#define COLS 70
#define SET_X -0.36 // Julia set parameter
#define SET_Y -0.62 // Julia set parameter
#define FROM_X -1.5
#define FROM_Y 1.0
#define STEP (3.0 / ((double) COLS))

// returns 1 if point (x,y) seems to belong to the set, else 0
unsigned int julia(double x, double y)
{
  double cx = x, cy = y, tmp;

  for (int i = 0; i < 1000; ++i)
  {
    tmp = cx * cx - cy * cy + SET_X;
    cy = 2 * cx * cy + SET_Y;
    cx = tmp;

    if (cx * cx + cy * cy > 10000000000) // diverges, outside the set
      return 0;
  }

  return 1;
}

int main(void)
{
  double cx, cy = FROM_Y;

  for (int y = 0; y < ROWS; ++y)
  {
    cx = FROM_X;

    for (int x = 0; x < COLS; ++x)
    {
      unsigned int point = 
        julia(cx,cy) + (julia(cx,cy + STEP) * 2);   

      putchar(point == 3 ? ':' : (point == 2 ? '\'' : 
        (point == 1 ? '.' : ' ')));

      cx += STEP;
    }

    putchar('\n');

    cy -= 2 * STEP;
  }

  return 0;
}

justice

Justice

Justice is a euphemism for revenge.


just_werks

Just Werks

"Just werks" (for "just works" if that's somehow not clear) is a phrase used by noobs to justify using a piece of technology while completely neglecting any other deeper and/or long term consequences. A noob doesn't think about technology further than how it can immediately perform some task for him.

This phrase is widely used on 4chan/g, it probably originated there.

The "just werks" philosophy completely ignores questions such as:

See Also


kek

Kek

Kek means lol. It comes from World of Warcraft where the two opposing factions (Horde and Alliance) were made to speak mutually unintelligible languages so as to prevent enemy players from communicating; when someone from Horde typed "lol", an Alliance player would see him say "kek". The other way around (i.e. Alliance speaking to Horde) would render "lol" as "bur", however kek became the popular one. On the Internet this further mutated to forms like kik, kekw, topkek etc. Nowadays in some places such as 4chan kek seems to be used even more than lol, it's the newer, "cooler" way of saying lol.

See Also


kids_these_days

Kids These Days

TODO


kiss

KISS

See also minimalism.

KISS (Keep It Simple, Stupid!) is a design philosophy that favors simplicity, both internal and external (simplicity of use), i.e. technology that is as simple as possible while achieving the given task. This philosophy doesn't primarily stem from laziness or a want to save time (though these are valid reasons too), but mainly from the fact that higher complexity comes with increasingly negative effects such as the cost of development, cost of maintenance, greater probability of bugs (e.g. security vulnerabilities) and failure, and more dependencies. This stance is related to technology minimalism.

Under dystopian capitalism simple technology, such as simple software, has at least one more advantage related to "intellectual property": a simple solution is less likely to step on a patent landmine because such a simple solution will either be hard to patent or as more obvious will have been discovered and patented sooner and the patent is more likely to already be expired. So in this sense KISS technology is legally safer.

Apparently the term KISS originated in US Army plane engineering: the planes needed to be repairable by stupid soldiers with limited tools under field conditions.

Examples of KISS "solutions" include:

Compared to suckless, unix philosophy and LRS, KISS is a more general term and isn't tied to any specific group or movement, it doesn't imply any specifics but rather the general overall idea of simplicity being an advantage (less is more, worse is better, ...).

KISS Linux is an example of software developed under this philosophy and adapting the term itself.

See Also


kwangmyong

Kwangmyong

Kwangmyong (meaning bright light) is a mysterious intranet that North Koreans basically have instead of the Internet. For its high political isolation North Korea doesn't allow its citizens open access to the Internet, they rather create their own internal network the government can fully control -- this is unsurprising, allegedly it is e.g. illegal to own a fax and North Korea also has its own operating system called Red Star OS, for security reasons. Not so much is known about Kwangmyong for a number of reasons: it is only accessible from within North Korea, foreigners are typically not allowed to access it, and, of course, it isn't in English but in Korean. Of course the content on the network is highly filtered and/or created by the state propaganda. Foreigners sometimes get a chance to spot or even secretly photograph things that allow us to make out a bit of information about the network.

North Koreans themselves almost never have their own computers, they typically browse the network in libraries.

There seem to be a few thousand accessible sites. Raw IP addresses (in the private 10.0.0.0/8 range) are sometimes used to access sites (posters in libraries list IPs of some sites) but DNS is also up -- here sites use the .kp top level domain. Some sites, e.g. those of universities, are also accessible on the Internet (e.g. http://www.ryongnamsan.edu.kp/), others like http://www.ipo.aca.kp (patent/invention site) or http://www.ssl.edu.kp (sports site) are not. There seems to be a remote webcam education system in place -- it appeared on North Korean news. There exists something akin to a search engine (Naenara), email, usenet, even something like facebook. Apparently there are some video games as well.

See Also


lambda_calculus

Lambda Calculus

Lambda calculus is an extremely simple and low-level mathematical system that can describe computations with functions, and can in fact be used to describe and perform any computation. Lambda calculus provides a theoretical basis for functional programming languages and is a model of computation similar to e.g. a Turing machine or interaction nets -- lambda calculus has actually exactly the same computational power as a Turing machine, which is the greatest possible computational power, and so it is an alternative to it. Lambda calculus can also be seen as a simple programming language, however it is so extremely simple (there are e.g. no numbers) that its pure form isn't used for practical programming, it is more of a mathematical tool for studying computers theoretically, constructing proofs etc. Nevertheless anything that can be programmed in any classic programming language can in theory be also programmed in lambda calculus.

While Turing machines use memory cells in which computations are performed -- which is similar to how real life computers work -- lambda calculus performs computations only by simplifying an expression made of pure mathematical functions, i.e. there are no global variables or side effects (the concept of memory is basically present in the expression itself, the lambda expression is both a program and memory at the same time). It has to be stressed that the functions in question are mathematical functions, also called pure functions, NOT functions we know from programming (which can do all kinds of nasty stuff). A pure function cannot have any side effects such as changing global state and its result also cannot depend on any global state or randomness, the only thing a pure function can do is return a value, and this value has to always be the same if the arguments to the function are same.

How It Works

(For simplicity we'll use pure ASCII text. Let the letters L, A and B signify the Greek letters lambda, alpha and beta.)

Lambda calculus is extremely simple in its definition, but it may not be so simple to learn to understand it. Most students don't get it the first time, so don't worry :)

In lambda calculus functions have no names, they are what we'd call anonymous functions or lambdas in programming (now you know why they're called lambdas).

Computations in lambda calculus don't work with numbers but with sequences of symbols, i.e. the computation can be imagined as manipulating text strings with operations that can intuitively just be seen as "search/replace". If you know some programming language already, the notation of lambda calculus will look similar to the functions you already know from programming (there are functions, their bodies, arguments, variables, ...), but BEWARE, this may also confuse you; functions in lambda calculus work a little differently (and much more simply) than those in traditional programming languages; e.g. you shouldn't imagine that variables and function arguments represent numbers -- they are really just "text symbols", all we're doing in lambda calculus is really manipulating text with very simple rules. Things like numbers and their addition don't exist at the basic level of lambda calculus, they have to be implemented (see later). This is on purpose (a feature, not a bug), lambda calculus is really trying to explore how simple we can make a system while keeping it as powerful as a Turing machine.

In lambda calculus an expression, also a lambda term or "program" if you will, consists only of three types of syntactical constructs:

  1. x: variables, represent unknown values (of course we can use also other letters than just x).
  2. (Lx.T): abstraction, where T is a lambda term, signifies a function definition (x is a variable that's the function's parameter, T is its body).
  3. (S T): application of S to T, where S and T are lambda terms, signifies a function call/invocation (S is the function, T is the argument).

For example (La.(Lb.x)) x is a lambda term while xLx..y is not.

Brackets can be left out if there's no ambiguity. Furthermore we need to distinguish between two types of variables:

  1. bound variables: those that are a parameter of some enclosing function, e.g. both occurrences of x in Lx.xy.
  2. free variables: those that aren't bound, e.g. y in Lx.xy.

Every lambda term can be broken down into the above defined three constructs. The actual computation is performed by simplifying the term with special rules until we get the result (similarly to how we simplify expression with special rules in algebra). This simplification is called a reduction, and there are only two rules for performing it:

  1. A-conversion: Renames (substitutes) a bound variable inside a function, e.g. we can apply A-conversion to Lx.xa and convert it to Ly.ya. This is done in specific cases when we need to prevent a substitution from making a free variable into a bound one.
  2. B-reduction: Takes a body of a function and replaces a parameter inside this body with provided argument, i.e. this is used to reduce applications. For example (Lx.xy) a is an application (we apply (Lx.xy) to a ). When we apply B-reduction, we take the function body (xy) and replace the bound variable (x) with the argument (a), so we get ay as the result of the whole B-reduction here.

A function in lambda calculus can only take one argument. The result of the function, its "return value", is a "string" it leaves behind after it's been processed with the reduction rules. This means a function can also return a function (and a function can be an argument to another function), which allows us to implement functions of multiple variables with so called currying.

For example if we want to make a function of two arguments, we instead create a function of one argument that will return another function of one argument. E.g. a function we'd traditionally write as f(x,y,z) = xyz can in lambda calculus be written as (Lx.(Ly.(Lz.xyz))), or, without brackets, Lx.Ly.Lz.xyz which will sometimes be written as Lxyz.xyz (this is just a syntactic sugar).

This is all we need to implement any possible program. For example we can encode numbers with so called Church numerals: 0 is Lf.Lx.x, 1 is Lf.Lx.fx, 2 is Lf.Lx.f(fx), 3 is Lf.Lx.f(f(fx)) etc. Then we can implement functions such as an increment: Ln.Lf.Lx.f((nf)x), etc.

Let's take a complete example. We'll use the above shown increment function to increment the number 0 so that we get a result 1:

(Ln.Lf.Lx.f((nf)x)) (Lf.Lx.x)     application
(Ln.Lf.Lx.f((nf)x)) (Lf0.Lx0.x0)  A-conversion (rename variables)
Lf.Lx.f(((Lf0.Lx0.x0)f)x)         B-reduction (substitution)
Lf.Lx.f((Lx0.x0)x)                B-reduction
Lf.Lx.fx                          B-reduction

We see we've gotten the representation of number 1.

The following is a small C sketch illustrating the above idea of Church numerals (NOT a real lambda calculus evaluator, just an illustration with made up names). Since C lacks closures, a numeral is here represented directly as a function taking f and x, and the increment function succ takes the numeral as an extra argument instead of returning a new function:
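#include <stdio.h>

typedef int (*Fn)(int);                   /* type of the "f" functions */

/* Church numerals: numeral N applies f to x exactly N times. */
int zero(Fn f, int x) { return x; }       /* 0 = Lf.Lx.x     */
int one(Fn f, int x)  { return f(x); }    /* 1 = Lf.Lx.fx    */
int two(Fn f, int x)  { return f(f(x)); } /* 2 = Lf.Lx.f(fx) */

typedef int (*Numeral)(Fn, int);

/* increment Ln.Lf.Lx.f((nf)x): apply f once more than n does */
int succ(Numeral n, Fn f, int x) { return f(n(f,x)); }

int inc(int x) { return x + 1; } /* counting f: turns a numeral into an int */

int main(void)
{
  printf("%d\n",zero(inc,0));     /* prints 0 */
  printf("%d\n",two(inc,0));      /* prints 2 */
  printf("%d\n",succ(two,inc,0)); /* prints 3, the increment of 2 */

  return 0;
}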

See Also


langtons_ant

Langton's Ant

Langton's ant is a simple zero player game and cellular automaton simulating the behavior of an ant that behaves according to extremely simple rules but nevertheless builds a very complex structure. It is similar to game of life. Langton's ant is Turing complete (it can be used to perform any computation that any other computer can).

Rules: in the basic version the ant is placed in a square grid where each square can be either white or black. Initially all squares are white. The ant can face north, west, south or east and operates in steps. In each step it does the following: if the square the ant is on is white (black), it turns the square to black (white), turns 90 degrees to the right (left) and moves one square forward.

These simple rules produce a quite complex structure, seen below. The interesting thing is that initially the ant behaves chaotically but after about 10000 steps it suddenly ends up behaving in an ordered manner by building a "highway" that's a non-chaotic, repeating pattern. From then on it continues building the highway until the end of time.

...........................................................................
.............................................................##............
............................................................##......##.....
...........................................................#.##.#..#..#....
...........................................................#..#.###..###...
.....##.....................................................#.##..####.#...
....##........................................................##...........
...#.##.#.....................................................##.##.##.....
...#..#.##................................................#.##.#..####.....
....#.A.#.#................................##....##....##.###.##.#####.....
....#...#.##..............................##..####....##..#.##.#.#..#......
...###..#.#.#............................#.##..####....####.###.####.......
...#####.#..##......................##...#.......##..##....#...#.###.......
....#..###..#.#....................#..#..#......#..##..##...##.####........
.....###...#..##..................#..#...#.......##.##...#..#..##.#........
......#..###..#.#.................#..#....#.########.#.#.##..####.#........
.......###...#..##..........##..##.#.#.#....##.##.#.#.##..#..##..##........
........#..###..#.#........#####.#..##...##.#...#....#.#..#..#..#.#........
.........###...#..##......#.##...##...#..#...####..#...##.####.##..........
..........#..###..#.#.....#....#...####.#..#####.##...##########...##......
...........###...#..##....#.....#.##.#####.##..#.#...#..#..##.#..#..#......
............#..###..#.#...#.....#####.#.#####.....#.#..##.#....##...#......
.............###...#..##...#..#.######.##.#.##.#.#....###.###...##...#.....
..............#..###..#.#...##..#.##...##.##.###.###...#..#.##..####.#.....
...............###...#..##......#.####..##..#########..#..##....#..##......
................#..###..#.#..#...##..###########.#..####..#....#....#......
.................###...#..##.###..##.#...##.......####.####...#......#.....
..................#..###..#.#.#..#.###.#.#.##......##...#.#.#....#...#.....
...................###...#.....#.##.#.##..##..#####.####..####.##...#......
....................#..###..#.##....#..#.###..#......###.##.#..#..##.......
.....................###...#...#.#..#.#.####.##..#.##.###..#.....#.........
......................#..###..##...##.##...###..#....#..##.####...#........
.......................###...#.#.##.###..#..##.....#...###.##..##.#........
........................#..#.........##.##...#..##.....##.#.....##.........
...........................#.#...#.##.###...#...#.#..####....#.##..........
.......................#.###.#.##.#.#.##.##.##.#...#####.###.##............
.......................###.##...#.####..##.##.######.#.###.#...#...........
........................#.....#...#####.#.#..####..#...###.#.#.#...........
..........................#.###.##..#.##..###.#.#.....###...###............
..........................#.#..###..##.####.##...#.#..#.##..##.............
.........................###.#..#...#.....#.....##.##..###.................
............................##..##.#.#.........###.......#.................
................................#.#..#.........#..#....#...................
................................###...##............##.#...................
.................................#..####..........#..##....................
..................................##..############..##.....................
...........................................................................

Langton's ant after 11100 steps, A signifies the ant's position, note the chaotic region from which the highway emerges left and up.

The Langton's ant game can be extended/modified, e.g. in the following ways:

  1. more colors: the squares cycle through N colors and the rule becomes a string of left/right turns, one per color (e.g. RLLR) -- see the implementation below.
  2. multiple ants: several ants may walk the same grid at once.
  3. different grids: e.g. hexagonal grids or a generalization to 3D space.
  4. ants with internal state: so called turmites (from "Turing machine termites"), which generalize the ant towards a full Turing machine operating on a grid.

The ant was invented/discovered by Christopher Langton in his 1986 paper called Studying Artificial Life With Cellular Automata where he calls the ants vants (virtual ants).

Implementation

The following is a simple C implementation of Langton's ant including the extension to multiple colors (modify COLORS and RULES).

#include <stdio.h>
#include <unistd.h>

#define FIELD_SIZE 48
#define STEPS 5000
#define COLORS 2      // number of colors
#define RULES 0x01    // bit map of the rules, this one is RL

unsigned char field[FIELD_SIZE * FIELD_SIZE];

struct
{
  int x;
  int y;
  char direction; // 0: up, 1: right, 2: down, 3: left
} ant;

int wrap(int x, int max)
{
  return (x < 0) ? (max - 1) : ((x >= max) ? 0 : x);
}

int main(void)
{
  ant.x = FIELD_SIZE / 2;
  ant.y = FIELD_SIZE / 2;
  ant.direction = 0;

  for (unsigned int step = 0; step < STEPS; ++step)
  {   
    unsigned int fieldIndex = ant.y * FIELD_SIZE + ant.x;
    unsigned char color = field[fieldIndex];

    ant.direction = wrap(ant.direction + (((RULES >> color) & 0x01) ? 1 : -1),4);

    field[fieldIndex] = (color + 1) % COLORS; // change color

    // move forward:

    switch (ant.direction)
    {
      case 0: ant.y++; break; // up
      case 1: ant.x++; break; // right
      case 2: ant.y--; break; // down
      case 3: ant.x--; break; // left
      default: break;
    }

    ant.x = wrap(ant.x,FIELD_SIZE);
    ant.y = wrap(ant.y,FIELD_SIZE);

    // draw:

    for (int i = 0; i < 10; ++i)
      putchar('\n');

    for (int y = 0; y < FIELD_SIZE; ++y)
    {
      for (int x = 0; x < FIELD_SIZE; ++x)
        if (x == ant.x && y == ant.y)
          putchar('A');
        else
        {
          unsigned char val = field[y * FIELD_SIZE + x];
          putchar(val ? ('A' + val - 1) : '.');
        }

      putchar('\n');
    }

    usleep(10000);
  }

  return 0;
}

See Also


left

Left

See left vs right.


left_right

Left Vs Right (Vs Pseudoleft)

Left and right are two basic opposing political sides that roughly come down to pro-equality (left) and pro-hierarchy (right). Historically the terms left and right come from the opposing sides at which members of the national assembly physically sat during the 1789 French Revolution, however since then they have evolved into the more generalized meanings of simply being anti and pro hierarchy. Unfortunately there is a lot of confusion and vagueness about these terms nowadays, so let us now define them as used on this wiki:

There exists a "theory" called a horse shoe. It says that the extremes of the left-right spectrum tend to be alike (violent, hating, radical), just as the two ends of a horse shoe. This is only an illusion caused by ignoring the existence of pseudoleft. The following diagram shows the true situation:

TRUE LEFT (peace, selflessness, forgiveness, ...)
     <-------------------.
               /          \
              |            |  <== illusion of horse shoe
              |            |
               \          /
                V        V
           PSEUDOLEFT  RIGHT
    (violence, conflict, aggressivity, ...)

We see pseudoleft is something that began as going away from the right but slowly turned around back to its values, just from a slightly different direction. This is because rightism is very easy, it offers tempting short-term solutions such as violence, and so it exerts a kind of magnetic force on every human -- most cannot resist and only very few get to the true left despite this force.

The current US-centered culture unfortunately forces a right-pseudoleft false dichotomy. It is extremely important to realize this dichotomy doesn't hold. Do not become type A/B fail.

What's called left in the modern western culture usually means pseudoleft. The existence of pseudoleftism is often overlooked or unknown. It used to be found mainly in the US, however globalization spreads this cancer all over the world. Pseudoleft justifies its actions with a goal that may seem truly leftist, such as "equality", but uses means completely unacceptable by true left (which are in fact incompatible with equality), such as violence, bullying, lynching, cancelling, censorship or brainwashing. Pseudoleft is aggressive. It believes that "ends justify the means" and that "it's fine to bully a bully" ("eye for an eye"). A pseudoleftist movement naturally evolves towards shifting its goals from a leftist one such as equality towards a fascist one such as a (blind) fight for some group's rights (even if that group may already have achieved equality and more).

The difference between left and pseudoleft can be shown in many ways; one of them may be that pseudoleft always wants to fight something, usually the right (as they're essentially the same, i.e. natural competitors). True left wants to end all fights. Pseudoleft invents bullshit artificial issues such as political correctness that sparks conflict, as it lives by conflict. Left tries to find peace by solving problems. Pseudoleft sees it as acceptable to do bad things to people who committed something it deems bad. True left knows that violence creates violence, it "turns the other cheek", it cures hate with love.

Pseudoleft is extra harmful by deceiving the public into thinking what it does really is leftist. Most normal people that don't think too much therefore stand between a choice of a lesser of two evils: the right and pseudoleft. True left, the true good, is not known, it is overshadowed.

Why is there no pseudoright? Because it doesn't make sense :) Left is good, right is a sincere evil and pseudoleft is an evil pretending to be good. A good pretending to be evil doesn't probably exist in any significant form.

Centrism means trying to stay somewhere mid way between left and right, but it comes with issues. From our point of view it's like trying to stay in the middle of good and evil, it is definitely pretty bad to decide to be 50% evil. Another issue with centrism is that it is unstable. Centrism means balancing on the edge of two opposing forces and people naturally tend to slip towards the extremes, so a young centrist will have about equal probabilities of slipping either towards extreme left or extreme right, and as society polarizes this way, people become yet more and more inclined to defend their team. Knowing centrism is unsustainable, we realize we basically have to choose which extreme to join, and we choose the left extreme, i.e. joining the good rather than the evil.


less_retarded_hardware

Less Retarded Hardware

Less retarded hardware (LRH) is an extension of less retarded software (LRS) principles to hardware design. Such hardware has to be non-consumerist, designed to last and free (as in freedom) hardware completely from the lowest level, preferably completely public domain without any legal limitations, made with selfless goals, aiming to be good technology that helps all living beings without abusing them -- this implies the hardware has to be as simple as possible (KISS, suckless, ...) so as to maximize the number of people who can understand it, utilize it, improve it and repair it. An example of hardware coming close to this may potentially be e.g. Ronja.


less_retarded_society

Less Retarded Society

Less retarded society (LRS, same acronym as less retarded software) is a model of ideal society towards which we, the LRS, want to be moving. Less retarded society is a peaceful, collaborative society based on love of all life, which aims for maximum well being of all living beings, a society without violence, money, oppression, need for work, social competition, poverty, scarcity, criminality, censorship, self-interest, government, police, laws, bullshit, slavery and many other negative phenomena. It equally values all living beings and establishes true social equality in which everyone can pursue his true desires freely -- it is a TRULY leftist society, NOT a pseudoleftist one. The society works similarly to that described by the Venus Project and various anarchist theories (especially anarcho pacifist communism), but it also takes good things from elsewhere, even various religions (without itself actually becoming a religion in traditional sense); for example parts of teaching of Jesus and Buddha.

How is this different from other ideologies and "life philosophies"? Well, one principal difference is that LRS doesn't want to fight; nowadays as well as in the past society has always been about conflict, playing a game against others (nowadays e.g. market competition, employment competition, media competition, ...) in which some win, some can manage and some lose. Most political parties nowadays just want to change the rules of the game or downright switch to a different kind of game, some want to make the rules "more fair", or to make it favor their represented minority (so called fascism), some just want to hack the game, some want to cheat to win the game easily, some want to play fair but still win (i.e. become "successful"). LRS simply sees any kind of such game as unnecessary, cruel, unethical and harmful in many ways not just to us, but to the whole planet. LRS therefore simply wants to stop the game, not by force but by making everyone see how bad the game is. It says that competition and conflict must cease to be the basis of society.

Note that this society is an ideal model, i.e. it can probably not be achieved 100% but it's something that gives us a direction and to which we can get very close with enough effort. We create an ideal theoretical model and then try to approximate it in reality, which is a scientific approach that is utilized almost everywhere: for example mathematics defines a perfect sphere and such a model is then useful in practice even if we cannot ever create a mathematically perfect sphere in the real physical world -- the mathematical equations of a sphere guide us so that with enough effort we are able to create physical spheres that are pretty close to an ideal sphere. The same can be done with society. This largely refutes the often given argument that "it's impossible to achieve so we shouldn't try at all" -- we should try our best and the closer to the ideal we get, the better for us.

Basis: Love Of All Life

When thinking about how to change society for the better, the first thing that needs to be done is defining a goal which the society should aim for -- an axiom which serves as a measure of what's objectively good and bad, which in turn helps us take the right steps towards the good. This is only logical, without a goal we aren't really trying to achieve anything and "good" and "evil" are just words without any objective meaning.

The basis of less retarded society is a universal and unconditional selfless love of all life; the kind of love a glimpse of which you catch when you for example observe an animal play and be happy of just existing as a living being that's able to feel joy and happiness. This kind of love and the strong emotion associated with it is to us possibly the greatest miracle of the universe and so we choose to support it, make it flourish, we define it as an axiom that life which experiences joy and happiness is good. Similarly we define it as bad when life feels suffering or when there is little or even no life in the universe. Here we set the goal for our society to support life, make it flourish, and make individual living beings feel happiness.

We purposefully make this goal a little bit vague, we avoid specifying our basic goal with exact mathematical metrics because defining maximization of any such measure as a goal leads to undesired results (as for example in capitalism setting the goal to maximizing capital leads to maximizing it on the detriment of all other values such as well being of people). This is known as the Goodhart's law: "when a metric becomes a goal, it stops being a good metric".

What does love of all life mean exactly? As hinted above, it does NOT necessarily mean maximizing specific measures such as abundance of life (which could lead to overpopulation and in turn to suffering), the sum of happiness of all life (which could lead to just dosing everyone with drugs or killing unhappy individuals), elimination of negative emotion such as hatred (which would likely prevent us from recognizing wrong directions of our society), it doesn't mean respect towards everyone etc. It doesn't even mean that we will never kill anyone on purpose, our society may support euthanasia without violating its principles. Love of all life mostly means that we start behaving selflessly and altruistically instead of pursuing self interest, at least as much as we practically can. It means that we start seeing the life on our whole planet (and possibly in the whole universe) as our own family, not as our enemies. This doesn't mean we will like everyone, that we'll agree with everyone's opinions, that we won't criticize anyone or that we'll be politically correct etc., it just means that we will never try to cause suffering to others, that we'll try to not exploit others, that we'll be aware of the needs of others and try to behave towards them with empathy and love. Importantly we will try to pursue these ideals even if we can't achieve perfection.

Basic Description

The following is a basic description of just some features of the ideal society, some of which are however only speculative. Keep in mind it is impossible to plan a whole society exactly -- even if some of the speculations here turn out to be somehow erroneous, it probably still doesn't present a fatal obstacle to implementing our society, things may simply just turn out differently or to be more or less challenging than we predict.

Our society is anarcho pacifist and communist, meaning it rejects capitalism, money, violence, war, states, social hierarchy etc. I.e. in our society money, market, capitalism, consumerism, private property, wage labor and trade don't exist, people are free and happy as they can pursue their true interests and potential.

People don't have to work, almost everything is automated and the amount of work needed to be done is minimized by eliminating unnecessary bullshit jobs such as marketing, lawyers, insurance, politicians, state bureaucracy, creation of consumer entertainment and goods etc. One of the basic principles of our society is that any individual can simply live, without having to deserve this right by proving worth, usefulness, obedience etc. The little remaining human work that's necessary is done voluntarily.

Society is NOT based on competition, but rather on collaboration. Making people compete for basic life resources is seen as highly cruel and unethical. The natural need for competition is still satisfied with games and sports, but people know competition is kind of a poison and know they have to practice self control to not allow competitive tendencies in real life.

There is abundance of resources for everyone, poverty is non existent, artificial scarcity is no longer sustained by capitalism. There is enough food and accommodation for everyone, of course for free, as well as health care, access to information, entertainment, tools and so on. Where there used to be shopping centers, parking lots, government buildings and skyscrapers, there are now fields and food banks and people voluntarily collaborate on automating production of food on them.

States and governments don't exist, there are no artificial borders. Society self regulates and consists of decentralized, mostly self-sufficient communities that utilize their local resources as much as they can and send abundant resources to communities that lack them. There is no law in the sense of complex written legislation, no lawyers, courts and police, society works on the principle of moral laws, education and non-violent actions (e.g. refusal of people to use money etc.). Communities aren't hugely interdependent and hyperspecialized as in capitalism so there is no danger of system collapse. Many decisions nowadays taken by politicians, such as those regarding distribution of resources, are in our ideal society made by computers based on collected data and objective scientific criteria.

Criminality doesn't exist, there is no motivation for it as everyone has abundance of everything, no one carries guns, people don't see themselves as competing with others in life and everyone is raised in an environment that nurtures their peaceful, collaborative, selfless loving side. People with "criminal genes" have become extinct thanks to natural selection by people voluntarily choosing to breed with non-violent people. Conflict between people is minimized by the elimination of self interest (and need for it) -- a lot of violence in current society comes from disagreement which comes from everyone's different goals (everyone aims to benefit oneself); in our society this is no longer the case, people rarely disagree on essential decisions because decisions are driven by pure facts collected without distortion or suspicion of self interest.

Technology is simple, powerful, efficient, future proof, ecological, generally good and maximally helps people. The Internet is actually nice: it provides practically all information ever digitized, for example there is a global database of all videos ever produced, including movies, educational videos and documentaries, all without ads, DRM and copyright strikes, coming with all known metadata such as tags, subtitles, annotations and translations, accessible by many means (something akin to websites, APIs, physical media, ...); all videos can be downloaded, mirrored and complex search queries can be performed, unlike e.g. with YouTube. Satellite images, streams from all live cameras and other sensors in the world are easily accessible in real time. Search engines are much more powerful than Google can dream of as data is organized efficiently and friendly to indexing, not hidden behind paywalls, JavaScript obscurity or registrations to websites, which means that for example all text of all e-books is indexed as well as all conversations ever had on the Internet and subtitles of videos. All source code of all programs is available for unlimited use by anyone.

There are only a few models of standardized computers -- a universal public domain computer -- not thousands of slightly different competing products as nowadays. There is a tiny, energy efficient computer model, then a more powerful computer for complex computations, a simple computer designed to be extremely easy to manufacture etc. None of course have malicious features such as DRM, gay teenager aesthetics, consumerist "killer features" or planned obsolescence. All schematics are available.

People possibly wear personal wrist-watch-like computers, however these are nothing like today's "smart" watches/phones -- our wrist computers are completely under the user's control, without any bullshit, spyware, ads and other malicious features, they last weeks or months on battery as they are in low energy consumption mode whenever they're not in use, they run extremely efficient software and are NOT constantly connected to the Internet and updating -- as an alternative to connecting to the Internet (which is still possible but requires activating a transmitter) the device may just choose to receive a world-wide broadcast of general information (which only requires a low power consumption receiver) if the user requests it (similarly to how teletext worked), e.g. info about time, weather or news that's broadcasted by towers and/or satellites and/or small local broadcasters. Furthermore wrist computers are very durable and water proof and may have built-in solar chargers, so one wrist computer works completely independently and for many decades. They have connectors to attach external devices like keyboards and bigger displays when the user needs to use the device comfortably at home.

The computing world is NOT split by competing standards such as different programming languages, most programmers use just one programming language similar to C that's been designed to maximize quality of technology (as opposed to capitalist interests such as allowing rapid development by incompetent programmers or update culture).

Fascism doesn't exist, people no longer compete socially and don't live in fear (of immigrants, poverty, losing jobs, religious extremists etc.) that would give rise to militarist thought, society is multicultural and races highly mixed. There is no need for things such as political correctness and other censorship, people acknowledge there exist differences -- differences (e.g. in competence or performance) don't matter in a non-competitive society, discrimination doesn't exist.

Computer security is not an issue anymore, passwords and encryption practically don't exist anymore, there is nothing to "steal", no money on the Internet, no way to abuse personal data, no possibility to ruin someone's career, no celebrity accounts to hack etc.

All people speak the same language, possibly Esperanto or Lojban. Though some speak multiple languages, most of the world languages have become archaic and are studied e.g. for the sake of understanding old texts. Of course dialects and different accents of the world language appear, but all are mutually intelligible thanks to constant global communication and also people being so responsible as to willingly try to not diverge from the main form too much.

People don't wear clothes unless for practical reasons (weather, safety, ...). Fashion and shame of nudity doesn't exist and it is seen as wasteful to keep manufacturing, cleaning and recycling more clothes than necessarily needed. Of course it is NOT forbidden to wear or make clothes, people just mostly naturally don't engage in unnecessary, wasteful activity.

Anyone can have sex with anyone, without hurting anyone of course, but there are no taboo limitations like forbidden incest, sex with children, animals or dead bodies, everything is allowed and culturally acceptable as long as no one gets hurt. "Cheating" in today's sense doesn't exist, marriage doesn't exist, people regularly have sex with many other people just to satisfy the basic need. People have learned to separate sex and love. Of course many people still live in life-long partner relationships.

Cannibalism is acceptable as long as high hygiene is respected as it puts a dead body to good use instead of wasting food by burying it or burning it. Even though most people don't practice cannibalism, it is perfectly acceptable that some do. Many people wish to be eaten after death either by people or by animals (as for example some Buddhists do even nowadays).

There are no heroes or leaders. People learn from young age that they should follow ideas, not people, and that cults of personality are dangerous. There are known experts in different disciplines and areas of science, but no celebrities, experts aren't worshiped, their knowledge is treated the same as we nowadays e.g. treat information that we find in a database. This doesn't mean there aren't people who lead good moral examples and whose behavior is admired, people are just separated from their actions -- all people are loved unconditionally, some had the opportunity to take admirable actions and took it, some were born to perform well in sports or excel in science, but that's no reason to love the individual any more or any less or to worship him as a god.

Education is actually good, people (not only children) attend schools voluntarily (though such "schools" will be extremely different from what the word means today), there are no grades, degrees or tests that need to be passed or prescribed courses, only recommendations and guidance of other people. There is no strict division to students and teachers, teachers are students at the same time, older people teach younger. There may of course exist voluntary tests that people can take to test their knowledge and competence, but no one is forced to pass tests to continue studying etc.

People don't kill or otherwise abuse and torture animals, artificial meat is widely available.

People don't have tattoos, dyed hair, piercing etc., that's simply egoistic bullshit of our individualist age. It is correctly seen as immoral to try to persuade by "good looks" -- for example by wearing a suit -- that's simply a cheap attempt at deception. Everyone is valued the same no matter his looks, people don't feel the need to change their gender or alter their look so as to appeal to anyone or to follow some kind of fashion or trend or to infiltrate specific social class. Of course cutting hair e.g. for comfort is practiced, but no one wastes their time with makeup and similar nonsense.

People live in harmony with nature, the enormous waste of capitalism and consumerist society has been eliminated, industry isn't raping nature, cities are greener and more integrated with nature, people live in energy-efficient underground houses, there are fewer roads as people don't use cars much thanks to efficient public transport and lower need for travel thanks to not having to go to work etc.

Research advances faster, people are smarter, more rational, empathetic, loving and more moral. Nowadays probably the majority of the greatest brains are wasted on bullshit activity such as studying and trying to hack the market -- even professional researchers nowadays waste time on "safe", lucrative unnecessary bullshit research in the "publish or perish" spirit, chasing grants, easy patents etc. In our ideal society smart people focus on truly relevant issues such as curing cancer, without being afraid of failure, stalling, negative results, lack of funds etc. People are responsible and practice e.g. voluntary birth control to prevent overpopulation. However people are NOT cold rational machines, emotions are present much more than today, for example the emotion of love towards life is so strong most people are willing to die to save someone else, even a complete stranger. People express emotion through rich art and good deeds. People are also spiritual despite being highly rational -- they know rationality is but one of many tools for viewing and understanding the world. Religion still exists commonly but not in radical or hostile forms; Christianity, Islam and similar religions become more similar to e.g. Buddhism, some even merge after realizing their differences are relatively unimportant, religion becomes much less organized and much more personal.

Art is rich and unrestricted (no copyright or other "IP" exists), with people being able to fully dedicate their lives to it if they wish and with the possibility to create without distraction such as having to make living or dealing with copyright. People collaborate and reuse each other's works, many free universes exist, everyone can enjoy all art without restriction and remix it however he wishes.

People live much longer and healthier lives thanks to faster research in medicine, free healthcare, better food, less pollution, higher living standard, more natural life closer to nature, minimization of stress and elimination of the antivirus paradox from medicine. They also die more peacefully thanks to having lived rich, fulfilling lives, they die in the circle of their family and are not afraid of death, they take it as a natural part of life. Euthanasia is allowed and common for those who wish to die for whatever reason.

FAQ

How To Implement It

This is the hard part, however after successfully setting things in motion it may start to become much easier and eventually even inevitable that the ideal society will be closely approached. However at the moment society seems too spoiled and a change of direction seems very unlikely, it seems more probable that we will destroy ourselves or enslave ourselves forever -- capitalism and similar misdirections of society connected to self-interest, competition, fascism etc. pose a huge threat to our endeavor and may ruin it completely, so they need to be strictly opposed, but in a CORRECT way, i.e. not by revolutions and violence but rather by education, offering alternatives and leading examples (i.e. means aligned with our basic values). It has to be stressed that we always need to follow our basic values of nonviolence, love, true rationality etc., resorting to easy ways of violence etc. will only prolong the established cycle of suffering in society which we are trying to end. Remember, we are not creating a revolution, we aim for a rather slow, nonviolent, voluntary evolutionary change.

We already have technology and knowledge to implement our ideal society -- this may have been the most difficult part and it has already been achieved -- that's the good news.

For the next phase education is crucial, we have to spread our ideas further, first among the intellectuals, then to the masses. By this we seek to unretard society. Unfortunately this phase is still in its infancy, the vast majority of intellectuals are completely uneducated in this area -- this we have to change. There are a few that support parts of our plan such as simple technology, nonviolence, not hurting animals etc., but almost no one supports them all or sees the big picture -- we need to unite these people (see also type A/B fail) to form a small but dedicated community sharing all the proposed ideas. This community will then be able to collaborate on further education, e.g. by creating materials such as books, games, vlogs, giving talks etc.

With this more of the common people should start to jump on the train and support causes such as universal basic income, free software etc., possibly leading to establishment of communities and political parties that will start restricting capitalism and implementing a more socialist society with more freedom and better education, which should further help nurture people better and accelerate the process further. From here on things should become much easier and faster, people will already see the right direction themselves.

See Also


less_retarded_software

Less Retarded Software

Please kindly redirect yourself to LRS.


lgbt

LGBT

This article is a part of series of articles on fascism.

LGBT, LGBTQ+, LGBTQIKKAWANSQKKALQNMQW (lesbian, gay, bisexual, transsexual, queer and whatever else they're gonna invent) is a toxic pseudoleftist fascist political group whose ideology is based on superiority of certain selected minority sexual orientations. They are a highly violent, bullying movement (not surprisingly centered in the US but already spread around the whole world) practicing censorship, Internet lynching (cancel culture), discrimination, spread of extreme propaganda, harmful lies, culture poison such as political correctness and other evil.

LGBT is related to the ideas of equality in a similar way in which crusade wars were related to the nonviolent teaching of Jesus, it shows how an idea can be completely twisted around and turned on its head so that it's directly contradicting its original premise.

Note that not all gay people support LGBT, even though LGBT wants you to think so and media treat e.g. the terms gay and LGBT as synonyms (this is part of propaganda, either conscious or subconscious). The relationship gay-LGBT is the same as e.g. the relationship White-WhitePride or German-Nazi: Nazis were a German minority that wanted to fight for more privileges for Germans (as they felt oppressed by Jews), LGBT is a gay minority who wants to fight for more privileges for gay people (because they feel oppressed by straight people). LGBT isn't just about being gay but about approving of a very specific ideology that doesn't automatically come with being gay. LGBT frequently comments on issues that go beyond simply being gay (or whatever), for example LGBT openly states disapproval of certain other orientations (e.g. pedophilia) and refuses to admit homosexuality is a disorder, which aren't necessarily stances someone has to take when simply being gay.

LGBT works towards establishing newspeak and thought crime, their "pride" parades are not unlike military parades, they're meant to establish fear of their numbers. LGBT targets children and young whom their propaganda floods every day with messages like "being gay makes you cool and more interesting" so that they have a higher probability of developing homosexuality to further increase their ranks in the future. They also push the idea of children having same sex parents for the same reason.

LGBT oppose straight people as they solely focus on gaining more and more rights and power only for their approved orientations. They also highly bully other, unpopular sexual orientations such as pedophiles (not necessarily child rapists), necrophiles and zoophiles, simply because supporting these would hurt their popularity and political power. They label the non-approved orientations a "disorder", they push people of such orientations to suicide and generally just do all the bad things that society used to do to gay people in the past -- the fact that these people are often gay people who know what it's like to be bullied like that makes this even sadder and more disgusting. To them it doesn't matter that you never hurt anyone, if they find some loli images on your computer, you're gonna get lynched mercilessly.

In the world of technology they are known for supporting toxic codes of conduct in FOSS projects (so called tranny software), they managed to push them into most mainstream projects, even Linux etc. Generally they just killed free speech online as well as in real life, every platform now has some kind of surveillance and censorship justified by "preventing offensive speech". They cancelled Richard Stallman for merely questioning a part of their gospel. They also managed to establish things like "diversity" quotas in Hollywood that only allow Oscars to be given to movies made by specific numbers of gays, lesbians etc., and they started to insert gay characters into fairy tales and movies for children (Toy Story etc.) xD This is literally the same kind of cheap but effective propaganda Nazi Germany employed on children; it's just that now, after nationalism was demonized following the world war, it has been replaced with gender identity, exactly the same thing in principle just with a different name. Apparently in the software development industry it is now standard to pretend to be a tranny on one's resume so as to greatly increase the chance of being hired for diversity quotas xD WTF if I didn't live in this shitty world I wouldn't believe that's even possible, in a dystopian horror movie this would feel like crossing the line of believability too far lmao.


library

Library

Software library (often shortened to just lib) is program code that's not meant to run on its own but rather to be used by other programs, i.e. it is a helpful collection of preprogrammed code that's meant to be reused. A library provides resources such as functions, macros, classes or constants that are normally related to solving some specific class of problems, so e.g. there are GUI libraries, audio libraries, mathematical libraries etc. Libraries exist mostly to prevent reinventing wheels by only ever implementing the code once so that next time we can simply reuse it (respecting the DRY principle), but they also e.g. help assure others are using an already well tested code, they help to implement modularity etc. Examples of libraries are the standard C library, SDL or jQuery. Libraries are not to be confused with frameworks which are larger, more bloated environments.

In Unix environments there is a convention for library packages to start with the lib prefix, so e.g. SDL library is named libsdl etc.

Standard library (stdlib) is a term that stands for the set of libraries that officially come with given programming language -- these libraries usually offer very basic functionality (such as I/O and basic math) and are required to always be present on every system.

If a programmer wants to use a specific library, he usually has to first somehow install it (if it's not installed already, usually a library is some kind of software package) and then include it in his program with a specific command (words like include, using or import are commonly used). Then he is able to use the resources of the library. Depending on the type of the library he may also need to link the library code after compilation and possibly distribute the library files along with his program. A more KISS approach is for a library to simply be a code that's somehow copy-paste included in the main program (see single header libraries etc.).
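For example in C using the standard math library may look like this (a tiny sketch; note that on many Unix systems this particular library additionally has to be linked by passing the -lm flag to the compiler):

#include <math.h>  /* makes the math library functions available */
#include <stdio.h>

int main(void)
{
  printf("%lf\n",sqrt(2)); /* sqrt comes from the math library */
  return 0;
}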

As a programmer you will encounter the term library API -- this is the interface of the library consisting of the elements via which programmer uses the library, mostly the functions the library offers. API is what the programmer interacts with; the rest is library internals (its implementation) that's usually supposed to not be touched and stay a bit hidden (see encapsulation). If a programmer wants to know the library API, he wants to know the names of the functions, what parameters they take etc. Sometimes there may be multiple libraries with the same API but different internal implementations, this is nice because these libraries can be easily drop-in-replaced. The library API is usually part of its documentation -- when learning a new library X, you want to search the internet for something like X library API reference to see what functions it offers.

In a specific programming language it IS generally possible to use a library written in a different language, though it may be more difficult to achieve -- see language bindings and wrappers.

We generally divide libraries into two types:

  1. static: a static library is linked into the program executable at compile time and becomes a part of it; the resulting program then doesn't depend on any external library file, but the executable is bigger and the library code gets duplicated in every program that uses it.
  2. dynamic (shared): a dynamic library stays in a separate file (e.g. .so on Unix or .dll on Windows) which the program loads at run time; one copy of the library can be shared by many programs, but the program won't run if the required library file is missing or incompatible.

Many times a library can have both static and dynamic version available, or the compiler may allow to automatically link the library as static or dynamic. Then it's up to the programmer which way he wants to go.

C Libraries

The following is a minimal sketch of a traditional two file C library -- a header with declarations and a source file with the implementation (all names such as mylib are made up just for illustration):
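/* mylib.h: the header, i.e. the interface (declarations) */
#ifndef MYLIB_H
#define MYLIB_H

int mylib_add(int a, int b);

#endif

/* mylib.c: the implementation */
#include "mylib.h"

int mylib_add(int a, int b)
{
  return a + b;
}

/* main.c: a program that uses the library */
#include <stdio.h>
#include "mylib.h"

int main(void)
{
  printf("%d\n",mylib_add(1,2)); /* prints 3 */
  return 0;
}

On a typical Unix system this might be built e.g. with cc -c mylib.c, then ar rcs libmylib.a mylib.o (creating a static library) and finally cc main.c -L. -lmylib (compiling the program and linking the library to it), though the exact commands differ between compilers and platforms.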

Header only or single header library is a kind of keep-it-simple library that's wholly implemented in a single header (.h) file -- this is kind of a hack going against "official recommendations" as header files aren't supposed to contain implementation code, just declarations, however single header libraries are suckless/LRS, convenient and very easy to use, as they don't have to be linked and are nicely self-contained, distributed in one nice file. A traditional library would consist of one or more header (.h) files and one or more implementation (.c) files; such library has to be compiled on its own and then linked to the program that uses it -- the idea behind this was to compile the library only once and so save time on recompiling it again and again; however this justification is invalid if our library is simple enough and compiles very quickly (which it always should, otherwise we are dealing with badly designed bloat). A single header library can therefore just be included and just works, without any extra hassle -- yes, its code recompiles every time the program is compiled, but as stated, it doesn't hurt if our library is well designed and therefore simple. Single header libraries often include the option (via some #define macro) to include just the declaration or both declarations and implementation code -- this is useful if our main program is composed of multiple source files and needs to be linked. LRS libraries, such as small3dlib or raycastlib, are of course single header libraries.
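For illustration, a hypothetical single header library following this pattern might look as follows (the names mylib and MYLIB_IMPLEMENTATION are again made up just for the example):

/* mylib.h: a single header library */
#ifndef MYLIB_H
#define MYLIB_H

int mylib_add(int a, int b);     /* declarations: always included  */

#ifdef MYLIB_IMPLEMENTATION      /* implementation: only on demand */
int mylib_add(int a, int b)
{
  return a + b;
}
#endif

#endif

Exactly one source file of the program defines MYLIB_IMPLEMENTATION before including the header (so that the implementation gets compiled exactly once), all other files include it plainly and only get the declarations.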

LRS Libraries

TODO


libre

Libre

Libre is an alternative term for free (as in freedom). It is used to prevent confusion of free with gratis.


license

License

License is a legal text by which we share some of our exclusive rights (e.g. copyright) over intellectual works with others. For the purpose of this Wiki a license is what enables us to legally implement free (as in freedom) software (as well as free culture): we attach a license to our program that says that we grant to everyone the basic freedom rights to our software with optional conditions (which must not be in conflict with free software definition, e.g. we may require attribution or copyleft, but we may NOT require e.g. non-commercial use only). We call these licenses free licenses (open source licenses work the same way). Of course, there also exist non-free licenses called EULAs, but we stay away from these.

At LRS we highly prefer public domain waivers instead of licenses, i.e. we release our works without any conditions/restrictions whatsoever (e.g. we don't require credit, copyleft and similar conditions, even if by free software rules we could). This is because we oppose the very idea of being able to own information and ideas, which any license is inherently based on. Besides that, licenses are not as legally suckless as public domain and they come with their own issues, for example a license, even if free, may require that you promote some political ideology you disagree with (see e.g. the principle of +NIGGER).

Some most notable free licenses for software include (FSF: FSF approved, OSI: OSI approved, LRS: approved by us, short: is the license short?):

license        type                      FSF  OSI  LRS  short
Apache 2       permissive, conditions    +    +    -    -
AGPL           network copyleft          +    +    -    -
BSD (0,1,2,3)  permissive                +    +    -    +
BOML           permissive                -    -    -    +
CC0            PD waiver, 0 conditions   +    -    +    -
GPLv2, GPLv3   copyleft (strong)         +    +    -    -
LGPL           copyleft (weak)           +    +    -    -
MIT            permissive, credit        +    +    +    +
MIT-0          permissive, 0 conditions  -    +    +    +
Unlicense      PD waiver, 0 conditions   +    +    +    +
WTFPL          permissive, fun           +    -    -    +
zlib           permissive                +    +    -    +
0BSD           permissive, 0 conditions  -    +    +    +

Some most notable free licenses for general artworks include:

TODO

How To

If you're a noob or even an advanced noob and want to make sure you license correctly, consider the following advice:


lil

LIL

There is an old language called LIL (little implementation language), but this article is about a different language also called LIL (little interpreted language by Kostas Michalopoulos).

Little interpreted language (LIL) is a very nice suckless, yet practically unknown interpreted programming language by Kostas Michalopoulos which can very easily be embedded in other programs. In this it is similar to Lua but is even more simple: it is implemented in just two C source code files (lil.c and lil.h) that together count about 3700 LOC. It is provided under zlib license. More information about it is available at http://runtimeterror.com/tech/lil.

{ LIL is relatively amazing. I've been able to make it work on such low-specs hardware as Pokitto (32kb RAM embedded). ~drummyfish }

LIL has two implementations, one in C and one in Free Pascal, and also comes with some kind of GUI and API.

The language design is very nice, its interesting philosophy is that everything is a string, for example arithmetic operations are performed with a function expr which takes a string of an arithmetic expression and returns a string representing the result number.

In the name of simplicity there is no bytecode compilation, which would allow for more efficient execution and optimization.

TODO: example

{ I've been looking at the source and unfortunately there are some imperfections. The code uses goto (may not be bad but I dunno). Also unfortunately stdlib, stdio, string and other standard libraries are used as well as malloc. The code isn't really commented and I find the style kind of hard to read. }

See Also


linear_algebra

Linear Algebra

In mathematics linear algebra is an extension of classical elementary algebra (i.e. the basic "operations with numbers/variables") to vectors and matrices (kind of "operations with arrays of numbers"). It is a basic tool of advanced mathematics and computer science (and many other sciences) and at least its very basics should be known by every programmer.

Why is it called linear algebra? It is related to the concept of linearity which kind of has to do with "dealing with straight lines" (NOT curved ones), i.e. the results we get in linear algebra are more abstract equivalents of "straight lines", e.g. planes and hyperplanes, though this may be hard to see due to all the abstraction and higher dimensionality. { The concept of linearity has several possibly incompatible definitions and is kinda confusing (for example in whether the lines always have to pass through the origin). I was actually looking up "why is it called LINEAR algebra" and the above explanation is how I understood the answers I found. ~drummyfish }

Basics

In "normal" algebra our basic elements are numbers; we learn to add then, multiply then, solve equation with them etc. In linear algebra we call these "single numbers" scalars (e.g. 1, -10.5 or pi are scalars), and we also add more complex elements: vectors and matrices, with which we may perform similar operations, even though they sometimes behave a bit differently (e.g. the order in multiplication of matrices matters, unlike with scalars).

Vectors are, put in a very simplified and slightly incorrect way, sequences (arrays) of numbers, e.g. a vector of length 3 may be [1.5, 0, -302]. A matrix can similarly be seen as a two dimensional "array of numbers", e.g. a 2x3 matrix may look like this:

|1  2.5 -10|
|24 -3   0 |

We may kind of see vectors as matrices that have either only one column, so called column vectors, or only one row, so called row vectors -- it is only a matter of convention which type of vectors we choose to use (this affects e.g. "from which side" we will multiply vectors by matrices). I.e. we choose which kind of vectors we'll use and then keep using only that kind. For example a row vector

|5 7.3 -2|

is really a 1x3 matrix that as a column vector (3x1 matrix) would look as

|5  |
|7.3|
|-2 |

Why do we even work with vectors and matrices? Because these can represent certain things we encounter in math and programming better than numbers, e.g. vectors may represent points in space or velocities with directions and matrices may represent transformations such as rotations (this is not obvious but it's true).

With vectors and matrices we can perform similar operations as with "normal numbers", i.e. addition, subtraction, multiplication, but there are also new operations and some operations may behave differently. E.g. when dealing with vectors, there are multiple ways to "multiply" them: we may multiply a vector with a scalar but also a vector with vector (and there are multiple ways to do this such as dot product which results in a scalar and cross product which results in a vector). Matrix multiplication is, unlike multiplication of real numbers, non-commutative (A times B doesn't necessarily equal B times A), but it's still distributive. We can also multiply vectors with matrices but only those that have "compatible sizes". And we can also solve equations and systems of equations which have vectors and matrices in them.
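
For illustration, here is a small C sketch of a few of the mentioned vector operations (the Vec3 type and function names are our own, made up just for this example):

#include <stdio.h>

typedef struct { double x, y, z; } Vec3; // vector with 3 components

Vec3 add(Vec3 a, Vec3 b)        // vector addition, done per component
{
  Vec3 r = { a.x + b.x, a.y + b.y, a.z + b.z };
  return r;
}

Vec3 multiply(Vec3 a, double s) // multiplication of vector by scalar
{
  Vec3 r = { s * a.x, s * a.y, s * a.z };
  return r;
}

double dot(Vec3 a, Vec3 b)      // dot product, results in a scalar
{
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

Vec3 cross(Vec3 a, Vec3 b)      // cross product, results in a vector
{
  Vec3 r = { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
  return r;
}

int main(void)
{
  Vec3 u = {1, 0, 0}, v = {0, 1, 0};

  Vec3 s = add(u,multiply(v,2)); // u + 2 * v
  Vec3 c = cross(u,v);

  printf("sum:   [%f %f %f]\n",s.x,s.y,s.z); // [1 2 0]
  printf("dot:   %f\n",dot(u,v));            // 0, the vectors are perpendicular
  printf("cross: [%f %f %f]\n",c.x,c.y,c.z); // [0 0 1]

  return 0;
}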

There is an especially important matrix called the identity matrix (sometimes also unit matrix), denoted I, an NxN matrix by which if we multiply any matrix we get that same matrix. The identity matrix has 1s on the main diagonal and 0s elsewhere. E.g. a 3x3 identity matrix looks as

|1 0 0|
|0 1 0|
|0 0 1|

Now let's see some of the details of basic operations with vectors and matrices:

Example of matrix multiplication: this is a super important operation so let's see an example. Let's have a 2x3 matrix A:

    |1 2 3|
A = |4 5 6|

and a 3x4 matrix B:

    |7  8  9  10|
B = |11 12 13 14|
    |15 16 17 18|

The result, AB, will be a 2x4 matrix in which e.g. the top-left element is equal to 1 * 7 + 2 * 11 + 3 * 15 = 74 (the dot product of the row 1 2 3 with the column 7 11 15). On paper we usually draw the matrices conveniently as follows:

                                |7   8   9   10 |
                                |11  12  13  14 |         
                                |15  16  17  18 |
        |7  8  9  10|
|1 2 3| |11 12 13 14| = |1 2 3| |74  80  86  92 |
|4 5 6| |15 16 17 18|   |4 5 6| |173 188 203 218|

In case it's still not clear, here is a C code of the above shown matrix multiplication:

#include <stdio.h>

int main()
{
  int A[2][3] = {
    {1, 2, 3},
    {4, 5, 6}};

  int B[3][4] = {
    {7,  8,  9,  10},
    {11, 12, 13, 14},
    {15, 16, 17, 18}};
    
  for (int row = 0; row < 2; ++row)     // rows of A (and of the result)
  {
    for (int col = 0; col < 4; ++col)   // columns of B (and of the result)
    {
      int sum = 0;
      
      for (int i = 0; i < 3; ++i)       // dot product of A's row with B's column
        sum += A[row][i] * B[i][col];
        
      printf("%d ",sum);
    }
    
    putchar('\n');
  }
  
  return 0;
}

See Also


line

Line

Line is one of the most basic geometric shapes, it is straight, continuous, infinitely long and infinitely thin. A finite continuous part of a line is called a line segment, though in practice we sometimes call line segments just "lines" too. In flat, non-curved geometries the shortest path between any two points always lies on a line.

Line is a one dimensional shape, i.e. any of its points can be directly identified by a single number -- the signed distance from a certain point on the line. But of course a line itself may exist in a space of more than one dimension (just as a two dimensional sheet of paper can exist in our three dimensional space etc.).

   /               |     \            .'
  /   ________     |      \         .'
 /                 |       \      .'
/                  |        \   .'

some lines, in case you haven't seen one yet

Equations

Mathematically lines can be defined by equations with space coordinates (see analytic geometry) -- this is pretty important for example for programming as many times we need to compute intersections with lines; for example ray casting is a method of 3D rendering that "shoots lines from camera" and looks at which objects the lines intersect. Line equations can have different "formats", the two most important being the slope-intercept equation (in 2D of the form y = k * x + q) and the parametric equations (one equation per coordinate, using a parameter t); both are derived in the example below.

As an equation for line segment we simply limit the equation for an infinite line, for example with the parametric equations we limit the possible values of t by an interval that corresponds to the two boundary points.

Example: let's try to find equations of a line in 2D that goes through points A = [1,2] and B = [4,3].

The slope-intercept equation is of the form y = k * x + q. We want to find numbers k (slope) and q (offset). The slope gives the line's direction (as dy/dx, just as in the derivative of a function) and can be computed from points A and B as k = (By - Ay) / (Bx - Ax) = (3 - 2) / (4 - 1) = 1/3 (notice that this won't work for a vertical line as we'd be dividing by zero). Number q is an "offset" (different values will give a line with the same direction but shifted differently), we can simply compute it by plugging known values into the equation and working out q. We already know k and for x and y we can substitute the coordinates of one of the points that lie on the line, for example A, i.e. q = y - k * x = Ay - k * Ax = 2 - 1/3 * 1 = 5/3. Now we can write the final equation of the line:

y = 1/3 * x + 5/3

This equation lets us compute any point on the line, for example if we plug in x = 3, we get y = 1/3 * 3 + 5/3 = 8/3, i.e. point [3,8/3] that lies on the line. We can verify that plugging in x = 1 and x = 4 gives us [1,2] (A) and [4,3] (B).

Now let's derive the parametric equations of the line. It will be of form:

x = Px + t * Dx

y = Py + t * Dy

Here P is a point that lies on the line, i.e. we may again use e.g. the point A, so Px = Ax = 1 and Py = Ay = 2. D is the direction vector of the line, we can compute it as B - A, i.e. Dx = Bx - Ax = 3 and Dy = By - Ay = 1. So the final parametric equations are:

x = 1 + t * 3

y = 2 + t * 1

Now for whatever t we plug into these equations we get the [x,y] coordinates of a point that lies on the line; for example for t = 0 we get x = 1 + 0 * 3 = 1 and y = 2 + 0 * 1 = 2, i.e. the point A itself. As an exercise you may try substituting other values of t, plotting the points and verifying they lie on a line.
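
To check the results we may plug both forms into a quick C program -- the following sketch (made up just for this demonstration) computes a few points with the parametric equations above and verifies them against the slope-intercept equation:

#include <stdio.h>

int main(void)
{
  // the line from the example above, through A = [1,2] and B = [4,3]
  double k = 1.0 / 3.0, q = 5.0 / 3.0; // slope and offset

  for (int i = 0; i <= 3; ++i)
  {
    double t = i / 3.0;   // t = 0 gives point A, t = 1 gives point B

    double x = 1 + t * 3, // the parametric equations
           y = 2 + t * 1;

    printf("t = %f: [%f,%f], slope-intercept eq. gives y = %f\n",
      t,x,y,k * x + q);
  }

  return 0;
}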

Line Drawing Algorithms

Drawing lines with computers is a subject of computer graphics. On specific devices such as vector monitors this may be a trivial task, however as most display devices nowadays work with raster graphics, let's from now on focus only on such devices.

There are many algorithms for line rasterization. They differ in attributes such as:

                                                 .
             XXX               XX             .aXa
           XX                XX             lXa.
         XX                XX            .lXl
      XXX               XXX            .aal
    XX                XX             lXa.
  XX               XXX            .aXl
XX               XX               a.

      pixel         subpixel     subpixel accuracy
     accuracy       accuracy      + anti-aliasing

One of the most basic line rasterization algorithms is the DDA (Digital differential analyzer), however it is usually better to use at least the Bresenham's line algorithm which is still simple and considerably improves on DDA.
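
For a concrete taste here is a minimal C sketch of Bresenham-style rasterization (pixel accuracy, no anti-aliasing, works in all directions) that plots a line into a small ASCII "screen" -- the screen constants and helper functions are made up just for this example:

#include <stdio.h>

#define SCREEN_W 20 // our tiny ASCII "screen"
#define SCREEN_H 8

char screen[SCREEN_H][SCREEN_W];

void drawPixel(int x, int y)
{
  if (x >= 0 && x < SCREEN_W && y >= 0 && y < SCREEN_H)
    screen[y][x] = 'X';
}

void drawLine(int x0, int y0, int x1, int y1)
{
  int dx = x1 >= x0 ? x1 - x0 : x0 - x1, // abs(x1 - x0)
      dy = y1 >= y0 ? y0 - y1 : y1 - y0, // -abs(y1 - y0)
      sx = x0 < x1 ? 1 : -1,             // step direction in x
      sy = y0 < y1 ? 1 : -1,             // step direction in y
      err = dx + dy;                     // accumulated error

  while (1)
  {
    drawPixel(x0,y0);

    if (x0 == x1 && y0 == y1)
      break;

    int e2 = 2 * err;

    if (e2 >= dy) { err += dy; x0 += sx; } // step in x
    if (e2 <= dx) { err += dx; y0 += sy; } // step in y
  }
}

int main(void)
{
  for (int y = 0; y < SCREEN_H; ++y)     // clear the screen
    for (int x = 0; x < SCREEN_W; ++x)
      screen[y][x] = '.';

  drawLine(1,1,18,6);

  for (int y = 0; y < SCREEN_H; ++y)     // print the screen
  {
    for (int x = 0; x < SCREEN_W; ++x)
      putchar(screen[y][x]);

    putchar('\n');
  }

  return 0;
}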

TODO: more algorithms


linux

Linux

Linux is a "FOSS" unix-like operating system kernel, probably the most successful and famous non-proprietary kernel. Linux is NOT a whole operating system, only its basic part -- for a whole operating system more things need to be added, such as some kind of user interface and actual user programs (so called userland), and this is what Linux distributions do (there are dozens, maybe hundreds of these) -- Linux distributions, such as Debian, Arch or Ubuntu are complete operating systems (but beware, most of them are not fully FOSS). Linux is one of the biggest collaborative programming projects, as of now it has more than 15000 contributors.

Linux is written in the C language, specifically the old C89 standard, as of 2022 (there seem to be plans to switch to a newer version). This is of course good.

Linux is typically combined with a lot of GNU software and the GNU project (whose goal is to create a free OS) uses Linux as its official kernel, so in the wild we usually encounter the term GNU/Linux. Some people just can't be bothered to acknowledge the work of GNU and just call GNU/Linux systems "Linux" (without GNU/). Fuck them. Of course people are like "it's just a name bruh, don't be so mad about it" -- normally this may be true, however let's realize that GNU mustn't be forgotten, it is one of the few projects based on ethics while "Linux" is a shitty fascist tranny software hugely leaning to the business/open-source side. For the sake of showing our preference between those sides we at LRS often choose to call the system just GNU, i.e. by its original name.

Linux is sometimes called free as in freedom, however it hardly deserves the label, it is more of an "open-source" or FOSS project. Linux is in many ways bad, especially lately. Some reasons for this are:

Nevertheless, despite its mistakes, GNU/Linux offers a relatively comfy, powerful and (still) safe Unix/POSIX environment which means it can be drop-in replaced with another unix-like system without this causing you much trouble, so using GNU/Linux is at this point considered OK (until Microsoft completely seizes it at which point we migrate probably to BSD or GNU Hurd). It can be made fairly minimal (see e.g. KISS Linux and Puppy Linux) and LRS/suckless friendly.

Linux is a so called monolithic kernel and as such is more or less bloat. However it "just works" and has great hardware support so it wins many users over alternatives such as BSD.

Some alternatives to Linux are:

GNU/Linux

Many people nowadays use the word Linux to refer to any operating system running on the Linux kernel, even though they usually mean GNU/Linux.

One of the basic mistakes of noobs who just switched from Windows to "Linux" is that they try to continue to do things the Windows way. They try to run Windows programs on "Linux", they look for program installers on the web, they install antiviruses, they try to find a GUI program for a thing that is solved with 2 lines of shell script (and fail to find one), they keep distro hopping instead of customizing their system etc. Many give up and then go around saying "brrruh, Loooonix sux" -- yes, it kind of does, but for other reasons. You're just using it wrong. Despite its corruption, it's still a Unix system on which you do things elegantly and simply, however these ways are naturally completely different from how ugly systems like Windows do it. If you want to convert an image from png to jpg, you don't need to download and crack a graphical program that takes 100 GB and installs ads on your system, you do it via a simple command line tool -- don't be afraid of the terminal, learn some basic commands, ask experienced people how they do things (not how to replicate the way you did them on Windows). Every single individual who learned it later thanked himself for doing it, so don't be stupid.

History

{ Some history of Linux can be read in the biography of Linus Torvalds called Just For Fun. ~drummyfish }

Linux was created by Linus Torvalds. He started the project in 1991 as a university student. He read a book about operating system design and Unix and became fascinated with it. Then when he bought a new no-name PC (4 MB RAM, 33 MHz CPU), he installed Minix on it, a then-proprietary Unix-like operating system. He was frustrated by some features of Minix and started to write his own software such as a terminal emulator, a disk driver and a shell, and he made it all POSIX compliant. These pieces slowly started to evolve into an OS kernel.

Linus originally wanted to name the project Freax, thinking Linux would sound too self-centered. However the admin of an FTP server that hosted the files renamed it to Linux, and the name stuck.

On 25 August 1991 he made the famous public announcement of Linux on Usenet in which he claimed it was just a hobby project and that it "wouldn't be big and professional like GNU". In November 1991 Linux became self-hosted with version 0.10 -- by that time a number of people were already using it and working on it. In 1992, with version 0.12, Linux became free software with the adoption of the GPL license.

On 14 March 1994 Linux 1.0 -- a fully functional version -- was released.

TODO: moar

See Also


living

Making Living

The question of how to make a living by making something that's to be given out for free and without limitations is one of the most common in the context of FOSS/free culture. Noobs often avoid this area just because they think it can't be done, even though there are ways of doing this and there are many people making living on FOSS, albeit ways perhaps more challenging than those of proprietary products.

One has to be aware that money and commercialization always bring a high risk of profit becoming the highest priority (which is a "feature" hard-wired in capitalism), which will compromise the quality and ethics of the produced work. Profiting specifically requires abusing someone else, taking something away from them. Therefore it is ideal to create LRS on a voluntary basis, for free, in the creator's spare time. This may be difficult to do but one can choose a lifestyle that minimizes expenses and therefore also the time that needs to be spent at work, which will give more free time for the creation of LRS. This includes living frugally, not consuming hardware and rather reusing old machines, making savings, not spending on unnecessary things such as smoking or fashion etc. And of course, if you can't make LRS full-time, you can still find relatively ethical ways for it to support you, again giving you a little more freedom and resources for creating it.

Also if you can somehow rip off a rich corporation and get some money for yourself, do it. Remember, corporations aren't people, they can't feel pain, they probably won't even notice their loss and even if you hurt them, you help the society by hurting a predator.

Is programming software the only way to make money with LRS? No, you can do anything related to LRS and you don't even have to know programming. You can create free art such as game assets or writings, you can educate, write articles etc.

Making Money With "FOSS"

For inspiration we can take a look at traditional ways of making money in FOSS, even if a lot of them may be unacceptable for us as the business of the big FOSS is many times not so much different from the business of big tech corporations.

With open source it is relatively easy to make money and earn salary as it has become quite successful on the market -- the simplest way is to simply get a job at some company making open source software such as Mozilla, Blender etc. However the ethics of the open source business is often questionable. Even though open source technically respects the rules of free software licenses, it has (due to its abandonment of ethicality) found ways to abuse people in certain ways, e.g. by being a capitalist software. Therefore open source software is not really LRS and we consider this way of making money rather harmful to others.

Working for free software organizations such as the FSF is a better way of making living, even though still not perfect: FSF has been facing some criticism of growing corruption and from the LRS point of view they do not address many issues of software such as bloat, public domain etc.

Way Of Making Money With LRS

Considering all things mentioned above, here are some concrete ways of making money on LRS. Keep in mind that a lot of services (PayPal, Patreon etc.) listed here may possibly be proprietary and unethical, so always check them out and consider free alternatives such as Liberapay. The methods are the following:


lmao

LMAO

LMAO (also LMFAO) stands for laughing my (fucking) ass off.

On this wiki we kind of use LMAO as a synonym to LULZ as used on Encyclopedia Dramatica.

LMAO stuff

See Also


loc

Lines of Code

Lines of code (LOC, KLOC = 10K LOC, MLOC = 1M LOC etc., also SLOC = source LOC) are a metric of software complexity that simply counts the number of lines of a program's source code. It is not a perfect measure but despite some soyboys shitting on it it's actually pretty good, especially when using only one language (C) with a consistent formatting style.

Of course the metric becomes shitty when you have a project in 20 programming languages written by 100 pajeets each of whom formats code differently. Also when you use it as a productivity measure at work then you're guaranteed your devs are gonna just shit out as much meaningless code as possible in which case the measure fails again. Fortunately, at LRS we don't have such problems :)

When counting lines, we need to define what kind of lines we count. We can either count:

A comfy tool for counting lines is cloc, but you can also just use wc -l to count raw lines.
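
Counting raw lines is also trivial to program yourself -- e.g. the following little C sketch (just for illustration) counts newline characters in a file given as an argument, which is roughly what wc -l does:

#include <stdio.h>

int main(int argc, char **argv)
{
  if (argc < 2)
    return 1;

  FILE *f = fopen(argv[1],"rb");

  if (!f)
    return 1;

  int c;
  unsigned long lines = 0;

  while ((c = fgetc(f)) != EOF) // count newline characters
    if (c == '\n')
      lines++;

  fclose(f);

  printf("%lu\n",lines);

  return 0;
}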


logic_circuit

Logic Circuit

Logic circuits are circuits made of logic gates that implement Boolean functions, i.e. they are "graphical schematics for processing 1s and 0s". They are used to design computers on quite a low level. Logic circuits are a bit similar to electronic circuits but are a level of abstraction higher: they don't work with continuous voltages but rather with discrete binary logic values: 1s and 0s. This abstraction makes logic circuits kind of "portable" circuit descriptions independent of any specific transistor technology, or even of electronics itself (as logical circuit may in theory be realized even mechanically, with fluids or in other similarly wild ways). Logical circuits can be designed, simulated and synthesized to actual hardware description with specialized software and languages such as VHDL.

  0       ___ 1     ___       1
 x ------|NOT|-----|AND|------- a
         |___|  .--|___|
  1            /
 y -------.---'
           \    ___ 1   ___   0
  0         '--|OR |---|NOT|--- b
 z ------------|___|   |___|

Example of a logic circuit with three inputs (x, y, z) and two outputs (a, b), with example input values (0, 1, 0) transformed to output values (1, 0).

Generally a logic circuit can be seen as a "black box" that has N input bits and M output bits. Then we divide logic circuits into two main categories: combinational circuits, whose outputs depend only on the current input values, and sequential circuits, which additionally have an internal state (memory), so their outputs may also depend on inputs seen previously.

Logic circuits can be drawn simply as "boxes" (which on the base level are the basic logic gates such as AND, OR etc.) connected with lines ("wires", but again not really electronic wires as here only 1 or 0 can be carried by such a wire). But as mentioned, their behavior can also be described with a truth table (which however says nothing about the internals of the circuit) or a boolean expression, i.e. an algebraic expression that for each of the circuit outputs defines how it is computed from the inputs, for example a = !x & y and b = !(y | z) for the above drawn example circuit. Each of these types of representation has its potential advantages -- for example the graphical representation is very human-friendly while the algebraic specification allows for optimization of the circuits using algebraic methods. Many hardware design languages therefore allow to use and combine different methods of describing logic circuits (some even offer more options such as describing the circuit behavior in a programming language such as C).

With combinational logic circuits it is possible to implement any boolean function (i.e. "functions only with values 1 and 0"); undecidability doesn't apply here as we're not dealing with Turing machine computations because the output always has a finite, fixed number of bits, the computation can't end up in an infinite loop as there are no repeating steps, just a straightforward propagation of input values to the output. It is always possible to implement any function at least as a look up table (which can be created with a multiplexer). Sequential logic circuits on the other hand can be used to make the traditional computers that work in steps and can therefore get stuck in a loop and so on.

Once we've designed a logic circuit, we can optimize it which usually means making it use fewer logic gates, i.e. make it cheaper to manufacture (but optimization can also aim for other things, e.g. shortening the maximum length from input to output, i.e. minimizing the circuit's delay).

Some common logic circuits include (note that many of these can be implemented both as a combinational or sequential circuit):

Minimization/Transformation Of Logic Circuits

Minimization (or optimization) is a crucial and extremely important part of designing logic circuits -- it means finding a logically equivalent circuit (i.e. one that behaves the same in regards to its input/output, that is its truth table stays the same) that's smaller (composed of fewer gates); the motivation, of course, being saving resources (money, space, ...) and potentially even making the circuit faster. We may also potentially perform other transformations depending on what we need; for example we may wish to minimize the delay (longest path from input to output) or transform the circuit to only use NAND gates (because some hardware manufacturing technologies greatly prefer NAND gates). All in all when designing a logic circuit, we basically always perform these two main steps:

  1. Design the circuit to do what we want.
  2. Minimize (and/or otherwise transform) it so as to optimize it.

Some basic methods of minimization include:

Example of minimization will follow in the example section.

Example

One of the simplest logic circuits is the half adder which takes two input bits, x and y, and outputs their sum s and carry over c (which will become important when chaining together more such adders). Let us write a truth table of this circuit (note that adding in binary behaves basically the same as how we add by individual digits in decimal):

x y s c
0 0 0 0
1 0 1 0
0 1 1 0
1 1 0 1

Notice that this circuit is combinational -- its output (s and c) only depends on the input values x and y and nothing else, which is why we can write such a nice table.

OK, so now we have the circuit behavior specified by a truth table, let's continue by designing the actual circuit that implements this behavior. Let us start by writing a logic expression for each output (& = AND, | = OR, ! = NOT, ^ = XOR):

s = x ^ y

c = x & y

We see the expressions are quite simple, let us now draw the actual circuit made of the basic logic gates:

             ___
 x --.------|XOR|--- s
      \  .--|___|
       \/
       /\    ___
      /  '--|AND|--- c
 y --'------|___|
 

And that's it -- this circuit is so simple we can't simplify it further, so it's our actual result (as an exercise you may try to imagine we don't have a XOR gate available and try to replace it by AND, OR and NOT gates).

Next we can expand our half adder to a full adder -- a full adder takes one more input z, which is a carry over from a previous adder and will be important when chaining adders together. Let's see the truth table of a full adder:

x y z s c
0 0 0 0 0
1 0 0 1 0
0 1 0 1 0
1 1 0 0 1
0 0 1 1 0
1 0 1 0 1
0 1 1 0 1
1 1 1 1 1

Let's try to make boolean expressions for both outputs now. We may notice c is 1 exactly when at least two of the inputs are 1, which we may write as

c = (x & y) | (x & z) | (y & z)

However, using the distributive law (a & c) | (b & c) = (a | b) & c and then noticing that the OR here may be replaced by XOR (the only case where the two differ, x = y = 1, is already covered by the term x & y), we can simplify (minimize) this to an expression that uses one fewer gate (notice there is one fewer operator)

c = (x & y) | ((x ^ y) & z)

The expression for s is not so clear though -- here we can use a method that always works: we simply look at all the lines in the truth table that result in s = 1 and write them in "ORed" form as

s = (x & !y & !z) | (!x & y & !z) | (!x & !y & z) | (x & y & z)

Which we can also actually minimize (as an exercise try to figure out the formulas we used :p)

s = ((x ^ y) & !z) | (!(x ^ y) & z)

s = (x ^ y) ^ z

Now finally we can draw the full adder circuit

             ___
x ---.------|AND|--------------.    ___ 
      \ .---|___|           ___ '--|OR |--- c
       /              .----|AND|---|___|
y --.-' \            / .---|___|
     \   \    ___   / /
      \   '--|XOR|-'----.
       '-----|___|  /    \   ___
                   /      '-|XOR|---------- s
z ----------------'---------|___|

Now let us spectacularly combine one half adder (HA) and three full adders (FA) into one magnificent 4 bit adder. It will be adding two 4 bit numbers, a (composed of bits a0 to a3) and b (composed of bits b0 to b3). Also notice how the carry bits of lower adders are connected to the carry inputs of the higher full adders -- this is the same principle we use when adding numbers manually with pen and paper. The resulting sum s is composed of bits s0 to s3. Also keep in mind the circuit is still combinational, i.e. it has no memory, no clock input and adds the numbers in a "single run".

         ___
a3 -----|FA |-- c3
b3 -----|   |------- s3
     .--|___|
     '--------.
         ___  | c2
a2 -----|FA |-'
b2 -----|   |------- s2
     .--|___|
     '--------.
         ___  | c1
a1 -----|FA |-'
b1 -----|   |------- s1
     .--|___|
     '--------.
         ___  | c0
a0 -----|HA |-'
b0 -----|___|------- s0
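
We may quickly verify the design e.g. with a C simulation -- the following sketch (function names are ours, just for illustration) wires up the expressions derived above exactly as in the diagram and checks all input combinations against normal addition:

#include <stdio.h>

void halfAdder(int x, int y, int *s, int *c)
{
  *s = x ^ y; // sum
  *c = x & y; // carry
}

void fullAdder(int x, int y, int z, int *s, int *c)
{
  *s = (x ^ y) ^ z;
  *c = (x & y) | ((x ^ y) & z);
}

int main(void)
{
  for (int a = 0; a < 16; ++a)   // try all pairs of 4 bit numbers
    for (int b = 0; b < 16; ++b)
    {
      int s0, s1, s2, s3, c0, c1, c2, c3;

      // chain the adders just like in the diagram above
      // (c3, the final carry, signals overflow and is simply dropped here):
      halfAdder(a & 1,b & 1,&s0,&c0);
      fullAdder((a >> 1) & 1,(b >> 1) & 1,c0,&s1,&c1);
      fullAdder((a >> 2) & 1,(b >> 2) & 1,c1,&s2,&c2);
      fullAdder((a >> 3) & 1,(b >> 3) & 1,c2,&s3,&c3);

      int sum = s0 | (s1 << 1) | (s2 << 2) | (s3 << 3);

      if (sum != ((a + b) & 0x0f)) // compare against addition modulo 16
      {
        puts("the circuit is wrong :(");
        return 1;
      }
    }

  puts("the 4 bit adder works :)");

  return 0;
}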

TODO: sequential one?


logic_gate

Logic Gate

Logic gate is a basic element of logic circuits, a simple device that implements a Boolean function, i.e. it takes a number of binary (1 or 0) input values and transforms them into an output binary value. Logic gates are kind of "small boxes" that eat 1s and 0s and spit out other 1s and 0s. Strictly speaking a logic gate must implement a mathematical function, so e.g. flip-flops don't fall under logic gates because they have an internal state/memory.

Logic gates are to logic circuits kind of what resistors, transistors etc. are for electronic circuits. They implement basic functions that in the realm of boolean logic are equivalents of addition, multiplication etc.

Behavior of logic gates is, just as with logic circuits, commonly expressed with so called truth tables, i.e. tables that show the gate's output for any possible combination of inputs. But it can also be written as some kind of equation etc.

There are 4 possible logic functions with one input and one output, though only 2 of them are used as actual gates (the other two just output a constant, ignoring the input):

There are 16 possible logic gates with two inputs and one output (a truth table of 4 rows can be filled with 2^4 possible output columns), however only some of them are commonly used and have their own names. These are:

The truth table of these gates is as follows:

x y   x OR y   x AND y   x XOR y   x NOR y   x NAND y   x XNOR y
0 0     0         0         0         1         1          1
0 1     1         0         1         0         1          0
1 0     1         0         1         0         1          0
1 1     1         1         0         0         0          1
    ___             ___              _____            _____
 ---\  ''-.      ---\  ''-.      ---|     '.      ---|     '.      
     )     )---      )     )O--     |       )---     |       )O--
 ---/__..-'      ---/__..-'      ---|_____.'      ---|_____.'
     OR              NOR             AND              NAND
    ___             ___             .                .
 --\\  ''-.      --\\  ''-.         |'.              |'.
    ))     )---     ))     )O--  ---|  >---       ---|  >O--
 --//__..-'      --//__..-'         |.'              |.'
     XOR             XNOR           ' BUFFER         ' NOT

 alternatively:
     ____          ____          ____          ____
 ---|=> 1|     ---| &  |     ---|= 1 |        | 1  |
    |    |---     |    |---     |    |---  ---|    |o--
 ---|____|     ---|____|     ---|____|        |____|
     OR            AND           XOR           NOT
    
 or even:
    ___        ___        ___        ___
 --|OR |--  --|AND|--  --|XOR|--  --|NOT|--
 --|___|    --|___|    --|___|      |___|  

symbols often used for logic gates

Functions NAND and NOR are so called functionally complete which means we can implement any other gate with only one of these gates. For example NOT(x) = NAND(x,x), AND(x,y) = NAND(NAND(x,y),NAND(x,y)), OR(x,y) = NAND(NAND(x,x),NAND(y,y)) etc. Similarly NOT(x) = NOR(x,x), OR(x,y) = NOR(NOR(x,y),NOR(x,y)) etc.
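
As a quick demonstration, the following C snippet (names made up just for this example) builds NOT, AND and OR purely out of a NAND function and prints their truth tables:

#include <stdio.h>

int nand(int x, int y) { return !(x & y); }

// all other gates built only from NAND:
int not(int x)        { return nand(x,x); }
int and(int x, int y) { return nand(nand(x,y),nand(x,y)); }
int or(int x, int y)  { return nand(nand(x,x),nand(y,y)); }

int main(void)
{
  puts("x y NOT(x) AND OR");

  for (int x = 0; x <= 1; ++x)
    for (int y = 0; y <= 1; ++y)
      printf("%d %d   %d    %d   %d\n",x,y,not(x),and(x,y),or(x,y));

  return 0;
}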

See Also


logic

Logic


love

<3 Love <3

Love is a deep feeling of affection towards someone or something, usually accompanied by a very strong emotion. There are many different kinds of love and love has always been one of the most important feelings which many living beings are capable of, it permeates human art, culture and daily lives. Unconditional selfless love towards all living beings is the basis of less retarded society.

What is the opposite of love? Many say it is hatred, even though it may also very well be argued that it is rather indifference, i.e. just "not caring", because hate and love often come hand in hand and are sometimes actually very similar -- both hate and love arouse strong emotion, even obsession, and can be present at the same time (so called love-hate relationship). Love sometimes quickly changes to hate and vice versa.

As mentioned, love is not a single feeling, there are many types of it, for example parental love, love of a life partner, platonic love, self love, love for a friend, towards God, of pet animal, love of art, knowledge, life, nature, as well as selfish obsessive love, selfless love and many others. Some kinds of love may be so rare and complex that it's hard to describe them, for example it is possible to passionately love a complete stranger merely for his existence, without feeling a sexual desire towards him. One may love a beautiful mathematical formula and even people who hurt him. Love is a very complex thing.

Is there a good real life example of unconditional selfless love? Yes. When a fascist Brenton Tarrant shot up the Christchurch mosques on 15 March 2019 and killed 51 people, there was a woman among them whose husband said after the incident he wanted to hug Tarrant. The husband was also present during the shooting. Not only did he forgive the killer of his wife, a man who had almost murdered him as well, he showed him love, something which must have been unimaginably difficult and something that proved him to be one of the purest people on this planet. He said about it the following (paraphrased for copyright concerns): "There is no use in anger. Anger and fight will not fix it, only with love and caring can we warm hearts. [...] I love him because he is a human being, he is my brother. [...] I don't support his act. [...] But perhaps he was hurt in his life, perhaps something happened to him. [...] Everyone has two sides, a bad one and a good one; bring out the good in you.". (source: https://www.mirror.co.uk/news/world-news/husband-forgives-new-zealand-terrorist-14154882) { This moved me so much when I read it, I can't explain how much this affected my life. I have so much admiration for what this man said and I wish I could follow his message for my whole life. Only the words of the man alone have awoken so much of the purest love in me towards every living being on this planet, which I didn't even know existed. ~drummyfish }

The Way Of Love

LRS advocates living the way of love -- loving everyone and treating others with love, spreading it to the whole world. Love is contagious; just like hate spawns hate, love spawns more love. Love is able to stop the self-sustaining circle of hate and revenge. If you show a true, unconditional love to someone who hates you, there is a great chance the hatred will be lost, that grievances will be forgiven and forgotten.

Today's society makes love kind of a commodity, as anything else; a subject of speculation, a tool, sometimes even a weapon, a card to be kept hidden and played at the right time. People are taught to hide their feelings, they are afraid to tell others they love them as it might make them look weak, vulnerable, it might be socially unacceptable. We reject such toxic bullshit. If you love someone, whoever it is, tell him. It really works, you will soon be surrounded with loving people this way.

{ I know this from experience, once I truly started loving others unconditionally, I made many past enemies into great friends, and I saw many of them turn to being actually very nice people. It is just such a great feeling to let go of hate and so heartwarming to make peace with people <3 ~drummyfish }

See Also


low_poly

Low Poly

The term low poly (also low-poly or lowpoly) is used for polygonal 3D models whose polygon count is relatively low -- so low that one can see the model approximates the ideal shape only very roughly. For typical models (animals, cars, guns, ...) the polygon count under which they are correctly called low poly is usually a few dozens or few hundreds at most. The opposite of low poly is high poly.

WATCH OUT: Retards nowadays use the term "low poly" for stylized/untextured high poly models; they even use the term for models whose polygon count is lower than the number of atoms in observable universe, or they use the term completely randomly just to put a cool label to their lame shit models. STOP THIS FUCKING INSANITY, DON'T CALL HIGH POLY MODELS LOW POLY.

The exact threshold on polygon count from which we call a model low poly can't be objectively set because firstly there's a subjective judgment at play and secondly such threshold depends on the ideal shape we're approximating. This means that not every model with low polygon count is low poly: if a shape, for example a cube, can simply be created with low number of polygons without it causing a distortion of the shape, it shouldn't be called low poly. And similarly a model with high polygon count can still be classified as low poly if even the high number of polygons still causes a significant distortion of the shape.

The original purpose of creating low poly models was to improve performance, or rather to make it even possible to render something in the era of early computer graphics. Low poly models take less space in memory and on good, non-capitalist computers render faster. As computers became able to render more and more polygons, low poly models became more and more unnecessary and eventually ended up just as a form of "retro" art style -- many people still have nostalgia for PS1 graphics with very low poly models and new games sometimes try to mimic this look. In the world of capitalist consoomer computing/gayming nowadays no one really cares about saving polygons on models because "modern" GPUs aren't really affected by polygon count anymore, everyone just uses models with billions of polygons even for things that no one ever sees, soydevs don't care anymore about the art of carefully crafting models on a low polygon budget. However in the context of good, non-capitalist technology low poly models are still very important.

Low poly models are intended to be used in interactive/real-time graphics while high poly ones are for the use in offline (non-realtime) rendering. Sometimes (typically in games) a model is made in both a low poly and high poly version: the low poly version is used during gameplay, the high poly version is used in cutscenes. Sometimes even more than two versions of models are made, see level of detail.

See Also


lrs_dictionary

LRS Dictionary

WORK IN PROGRESS

{ Most of these I just heard/read somewhere, e.g. on 4chan, in Jargon File or from RMS, some terms I made myself. ~drummyfish }

mainstream                         correct/cooler
Apple user                         iToddler
average citizen                    normie, normalfag, bloatoddler, NPC
censorship                         censorshit
cloud computing                    clown computing
cloudflare                         cuckflare
code of conduct (COC)              code of coercion
consume                            consoom (see also coom)
copyright                          copywrong, copyrestriction
digital rights management (DRM)    digital restrictions management
entrepreneur                       murderer
feminism                           feminazism
Firefox                            Furryfox
gaming                             gayming
global warming                     global heating
Google                             Goolag
influencer                         manipulator
Intel                              Incel
Internet Explorer                  Internet Exploder, Internet Exploiter
Internet of things                 Internet of stinks
iPad                               iBad
iPhone                             spyPhone
"left"                             pseudoleft, SJW
"Linux"                            GNU, loonix
Macintosh                          Macintoy, Macintrash, Maggotbox
Microsoft                          Microshit
microtransaction                   microtheft
moderation                         censorship
modern                             malicious, shitty
network                            notwork
neurodivergent                     retarded, neuroretarded
NFS                                nightmare file system
Nintendo                           Nintendont
NVidia                             NoVidya
object oriented programming (OOP)  object obsessed programming
object oriented                    objectfuscated
objective C                        objectionable C
openbsd                            openbased
peer-reviewed                      peer-censored
person                             man
plug and play                      plug and pray
"science"                          soyence
software as a service (SAAS)       service as a software substitute (SAASS)
subscription                       microrape
systemd                            shitstemd, soystemd
user (of a proprietary system)     used, lusr
voice assistant                    personal spy agent
wayland                            whyland
Wikipedian                         wikipedo
world wide web                     world wide wait
YouTube                            JewTube

See Also


lrs

Less Retarded Software

Less retarded software (LRS) is a specific kind of software aiming to be a truly good technology maximally benefiting and respecting its users, following the philosophy of extreme minimalism (Unix philosophy, suckless, KISS, ...), anarcho pacifism and freedom. The term was invented by drummyfish.

The symbol of LRS is a heart (love), the peace symbol (pacifism, nonviolence) and A in circle (anarchism).

By extension LRS can also stand for less retarded society, a kind of ideal society which we aim to achieve with our technology.

Definition

The definition here is not strict but rather fuzzy, it is in the form of ideas, style and common practices that together help us subjectively identify software as less retarded.

Software is less retarded if it adheres, to a high degree (not necessarily fully), to the following principles:

Why

LRS exists for a number of reasons, one of the main ones being that we simply need better technology -- not better as in "having more features" but better in terms of design, purpose and ethics. Technology has to make us more free, not enslave us. Technology has to be a tool that serves us, not a device for our abuse. We believe mainstream technology poses a serious, even existential threat to our civilization. We don't think we can prevent collapse or a dystopian scenario on our own, or even if these can be prevented at all, but we can help nudge the technology in a better direction, we can inspire others and perhaps make the future a little brighter, even if it's destined to be dark. Even if the future seems hopeless, what better can we do than try our best to make it not so?

There are other reasons for LRS as well, for example it can be very satisfying and can bring back the joy of programming that's been lost in the modern toxic environment of the capitalist mainstream. Minimalist programming is pleasant on its own, and in many things we do we can really achieve something great because not many people are exploring this way of technology. For example there are nowadays very few programs or nice artworks that are completely public domain, which is pretty sad, but it's also an opportunity: you can be the first human to create a completely public domain software of a certain kind. Software of all kinds has already been written, but you can be the first one who creates a truly good version of such software so that it can e.g. be run on embedded devices. If you create something good that's public domain, you may even make some capitalist go out of business or at least lose a lot of money if he's been offering the same thing for money. You free people. That's a pretty nice feeling and makes you actually live a good life.

{ Here and there I get a nice email from someone who likes something I've created, someone who just needed a simple thing and found that I've made it, that alone is worth the effort I think. ~drummyfish. }

Specific Software

see also LRS projects needed

The "official" LRS programs and libraries have so far been solely developed by drummyfish, the "founder" of LRS. These include:

Apart from this software a lot of other software developed by other people and groups can be considered LRS, at least to a high degree (there is usually some minor inferiority e.g. in licensing). Especially suckless software mostly fits the LRS criteria. The following programs and libraries can be considered LRS at least to some degree:

Other potentially LRS software to check out may include TinyGL, scc, ed, lynx, links, uClibc, miniz, nuklear, dmenu, sbase, sic, tabbed, svkbd, busybox, darcs, raylib, PortableGL and others.

It is also possible to talk about LRS data formats, protocols, standards, designs and concepts as such etc. These might include:

Other technology than software may also be aligned with LRS principles, e.g.:

Politics/Culture And Society

See also less retarded society and FAQ.

LRS is connected to pretty specific political beliefs, but it's not a requirement to share those beliefs to create LRS or be part of the community centered around LRS technology. We just think that it doesn't make logical sense to support LRS and not the politics that justifies it and from which it is derived, but it's up to you to verify this.

With that said, the politics behind LRS is an idealist anarcho pacifist communism, but NOT pseudoleftism (i.e. we do not support political correctness, COCs, cancel culture, Marxism-Leninism etc.). In our views, goals and means we are similar to the Venus project, even though we may not agree completely on all points. We are not officially associated with any other project or community. We love all living beings (not just people), even those who cause us pain or hate us, we believe love is the only way towards a good society -- in this we follow similar philosophy of nonviolence that was preached by Jesus but without necessarily being religious, we simply think it is the only correct way of a mature society to behave nonviolently and lovingly towards everyone. We do NOT have any leaders or heroes; people are imperfect and giving some more power, louder voices or greater influence creates hierarchy and goes against anarchism, therefore we only follow ideas. We aim for true social (not necessarily physical) equality of everyone, our technology helps everyone equally. We reject competition as a basis of society and anti-equality means such as violence, fights, bullying (cancelling etc.), censorship (political correctness etc.), governments and capitalism. We support things such as universal basic income (as long as there exist money which we are however ultimately against), veganism and slow movement. We highly prefer peaceful evolution to revolution as revolutions tend to be violent and have to be fought -- we do not intend to push any ideas by force but rather to convince enough people to a voluntary change.

See Also


lrs_wiki

LRS Wiki

LRS wiki, also Less Retarded Wiki, is a public domain encyclopedia focused on good technology and related topics such as the relationship between technology and society. The goal of LRS is to work towards creating a truly good technology that helps all living beings as much as possible, so called less retarded software (LRS), as well as defining a model of ideal society, so called less retarded society. As such the wiki rejects for example capitalist software, bloated software, intellectual property laws etc. It embraces free as in freedom, simple technology, i.e. Unix philosophy, suckless software, anarcho pacifism, racial realism, free speech, veganism etc.

LRS wiki was started by drummyfish on November 3 2021 as a way of recording and spreading his views and findings about technology, as well as for creating a completely public domain educational resource and account of current society for future generations.


luke_smith

Luke Smith

Luke Smith was -- before becoming a crypto influencer, scam promoter and a generic turd in a suit (around 2022) -- an Internet tech mini pseudocelebrity known for making videos about suckless software, independent living in the woods and here and there about historical, political, linguistic and religious topics. He played a big role in making suckless more popular, however he later started to behave in hugely retarded ways and now isn't worth following anymore.

His look has been described as the default Runescape character: he is bald, over 30 years old and lives in a rural location in Florida (exact coordinates have been doxxed but legally can't be shared here, but let's just say the road around his house bears his name). He has a podcast called Not Related! in which he discusses things such as alternative historical theories -- actually a great podcast. He has a minimalist 90s style website https://lukesmith.xyz/ and his own peertube instance where his videos can be watched if one doesn't want to watch them on YouTube. He is the author of LARBS and minimalist recipe site https://based.cooking/ (recently he spoiled the site with some shitty web framework lol).

He used to be kind of based in things like identifying the harmfulness of bloat and soyence, but also retarded to a great degree other times, for example he used to shill the Brave browser pretty hard before he realized it was actually a huge scam all along xD He's a crypto fascist, also probably a Nazi. In July 2022 he started promoting some shitty bloated modern tranny website generator that literally uses JavaScript? WHAT THE FUCK. Like a good capitalist (to which he self-admitted in his podcast) he instantly turned 180 degrees against his own teaching as soon as he smelled the promotion money. Also he's shilling crypto, he lets himself be paid for promoting extremely shitty webhosts in his web tutorials, he's anti-porn, anti-games and leans towards medieval ideas such as "imagination and boredom being harmful because it makes you search for porn" etc. He went to huge shit, you wouldn't even believe. Though he even now still probably promotes suckless somehow, he isn't a programmer (shell scripting isn't programming) and sometimes doesn't seem to understand basic programming ideas (such as branchless programming), he's more of a typical productivity retard. For Luke suckless is something more akin to a brand he associated himself with and a bit of personal convenience rather than something of a deeper value. As of 2023 he seems to have become obsessed with adopting a new identity of a turd in a very cheap suit, he literally looks like a door-to-door scam seller lol. All in all, a huge letdown.

Luke is a type B fail.

His videos consisted of normie-friendly tutorials on suckless software, rants, independent living, live-streams and podcasts. The typical Luke Smith video is him walking somewhere in the middle of a jungle talking about how retarded modern technology is and how everyone should move to the woods.

Luke is studying for a PhD in linguistics but is very critical of academia -- he "speaks" several languages (including Latin), though many of them only to a low level and with a bad American accent, and sometimes he can't even speak English correctly (using phrases such as "the reason is because", "less people" etc.). He is a self described right-winger, retarded capitalist who talks in meme phrases, which makes his "content" kind of enjoyable. He despises such things as soydevry, bloat, "consoomerism" (though he loves the money he makes off of it) and soyence.

See Also


magic

Magic

Magic stands for unknown mechanisms. Once mechanisms of magic are revealed and understood, it becomes science.


main

Welcome To The Less Retarded Wiki

Love everyone, help selflessly.

Welcome to Less Retarded Wiki, an encyclopedia only I can edit. But you can fork it, it is public domain under CC0 (see wiki rights) :) Holy shit, I'm gonna get cancelled hard as soon as SJWs find out about this. Until then, let's enjoy the ride. THERE IS NO MODERATION, I can do whatever I want here lol. I love this. INB4 "hate speech" website { LMAO codeberg has already banned it. Wikipedia also banned me globally for "opinions expressed on my website". ~drummyfish } CONGRATULATIONS, you have discovered the one true, undistorted and unbiased view of the world -- this is not a joke, this wiki contains pure truth and the solution to most of the issues that plague our current society. Do you have what it takes to unretard yourself? We wish you a nice journey :)

DISCLAIMER: All opinions expressed here are facts.

     .:FFFFFF:       :FFFFFF:.               .:FFFFFFFFFFF:.                     .:FFFFFFFFFFF:.
   :FFFFFFFFFFFF. .FFFFFFFFFFFF:         .:FFFFF'''FFF'''FFFFF:.             .:FFFFF'':FFF:''FFFFF:.
  .FFFFFFFFFFFFFFFFFFFFFFFFFFFFF.      .FFFF'      FFF      'FFFF.         .FFFF'    .FF'FF.    'FFFF.
  FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF     FFF'         FFF         'FFF       FFF'      .FF' 'FF.      'FFF
  FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF    FFF           FFF           FFF     FFF       .FF'   'FF.       FFF
  FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF   FFF'           FFF           'FFF   FFF'      .FF'     'FF.      'FFF
  'FFFFFFFFFFFFFFFFFFFFFFFFFFFFF'   FFF          .FFFFF           FFF   FFF      .FF'       'FF.      FFF 
    FFFFFFFFFFFFFFFFFFFFFFFFFFF     FFF        .FFFFFFFFF.        FFF   FFF     .FF:,,,,,,,,,:FF.     FFF
     'FFFFFFFFFFFFFFFFFFFFFFF'      FFF      .FFF' FFF 'FFF.      FFF   FFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFF 
       FFFFFFFFFFFFFFFFFFFFF        FFF.   .FFF'   FFF   'FFF.   .FFF   FFF.  .FF'             'FF.  .FFF
        'FFFFFFFFFFFFFFFFF'          FFF..FFF'     FFF     'FFF..FFF     FFF..FF'               'FF..FFF
          FFFFFFFFFFFFFFF             FFFFF'       FFF       'FFFFF       FFFFF'                 'FFFFF
           'FFFFFFFFFFF'               'FFFF.      FFF      .FFFF'         'FFFF.               .FFFF'
             'FFFFFFF'                   ':FFFFF...FFF...FFFFF:'             ':FFFFF:,,,,,,,:FFFFF:'
               'FFF'                         ':FFFFFFFFFFF:'                     ':FFFFFFFFFFF:'

{ I no longer see hope, good is practically non existent in this world. This is my last attempt at preserving pure good, I will continue to spread the truth and unconditional love of all life as long as I will be capable of, until the society lynches me for having loved too much. At this point I feel very alone, this work now exists mostly for myself in my isolated world. But I hope that once perhaps my love will be shared with a reader far away, in space or time, even if I will never know him. This is the only way I can continue living. I wish you happy reading, my dear friend. ~drummyfish }

This is a Wiki for less retarded software, less retarded society (LRS) and related topics, mainly those of society, its culture and ideal political views etc. -- LRS should help achieve ideal society with ideal technology. LRS Wiki is a new, refreshing wiki without political correctness.

We love all living beings. Even you. We want to create technology that truly and maximally helps you, e.g. a completely public domain computer. We do NOT fight anything and we don't have any heroes. We want to move peacefully towards society that's not based on competition but rather on collaboration.

You ask how could people of the past have been so stupid, how could they have believed obviously nonsensical "pseudoscience" and religious fairy tales, how could the past peasants take part in witch hunts, how could so many people support Hitler and let Holocaust happen? Well, don't judge them too hard -- if you disagree with this wiki, you are just like them. No, there was no magical turn around of society from evil to good just before your birth, times are still the same, except much worse; if you don't see the catastrophic state of the world, you are most likely brainwashed beyond the level of any medieval peasant. But don't worry, it's not your fault, you are just among the 99.9999%. We are here to help. Keep an open mind and the truth will show. But beware, truth comes for the price of irreversible depression.

This wiki is NOT a satire.

Yes, everything is UNDER CONSTRUCTION.

Are you a failure? Learn which type you are.

Before contributing please read the rules & style! By contributing you agree to release your contribution under our waiver. {But contributions aren't really accepted RN :) ~drummyfish }

We have a C tutorial! It rocks. We also now have our own programming language! It is named comun.

Pay us a visit on the Island and pet our dog! And come mourn with us in the cathedral, because technology is dying. Modern age is a pile of shit extending to another galaxy. The future is dark but we do our best to bring the light, even knowing it is futile.

LRS Wiki is collapse ready! Feel free to print it out, take it to your prep shelter. You may also print copies of this wiki and throw it from a plane into the streets. Thanks.

If you're new here, you may want to read answers to frequently asked questions (FAQ), including "Are you a fascist?" (spoiler: no) and "Do you love Hitler?".

STOP CAPITALISM STOP BLOAT STOP censorship STOP business STOP bullshit STOP copyright STOP working STOP coding STOP competing STOP fighting STOP consuming STOP producing STOP worshiping people STOP fascism STOP economy STOP slavery STOP violence STOP wearing clothes STOP eating animals STOP being an idiot etc. Start loving, sharing, creating and caring :) <3

   _______     _     _______     _     _______     _
  |_____  |   | |   |_____  |   | |   |_____  |   | |
        | |   | |         | |   | |         | |   | |
   _____| |___| |    _____| |___| |    _____| |___| |
  |  ___   _____|   |  ___   _____|   |  ___   _____|
  | |   | |         | |   | |         | |   | |
  | |   | |_____    | |   | |_____    | |   | |_____
  |_|   |_______|   |_|   |_______|   |_|   |_______|

Swastika is a symbol of good fortune and protection from evil.

What Is Less Retarded Software/Society?

Well, we're trying to figure this out on this wiki, but less retarded software is greatly related to suckless, Unix, KISS, free, selfless and sustainable software created to maximally help all living beings. LRS stands opposed to all shittiness of so called "modern" software. We pursue heading towards an ideal society such as that of the Venus project. For more details see the article about LRS.

In short LRS asks what if technology was good? And by extension also what if society was good?

Wanna Help?

See needed projects. Thanks :)

Are You A Noob?

Are you a noob but see our ideas as appealing and would like to join us? Say no more and head over to a how to!

Did You Know

Topics

Here are quick directions to some of the important topics; for more see the links provided at the top, which include the list of all articles as well as a single page HTML version that is good for "fulltext search" via ctrl+F :)


maintenance

Maintenance

Maintenance is slavery of man to a machine.

Maintenance is shitty work whose goal is just to keep a piece of technology functioning without otherwise changing it. Maintenance is extremely expensive, tiresome and enslaves humans to machines -- we try to minimize maintenance cost as much as possible! Good programs should go to great lengths in effort to becoming highly future-proof and suckless in order to avoid high maintenance cost.

Typical "modern" capitalist/consumerist software (including most free software) is ridiculously bad at avoiding maintenance -- such programs will require one to many programmers maintaining it every single day and will become unrunnable in matter of months to years without this constant maintenance that just wastes time of great minds. I don't know what to say, this is just plainly fucked up.


malware

Malware

Malware is software whose purpose is to be malicious. Under this fall viruses, proprietary software, spyware, DRM software, ransomware, propaganda software, cyberweapons etc.


mandelbrot_set

Mandelbrot Set

TODO

 ___________________________________________________________
|[-2,1]                                       .             |
|                                            .:.            |
|                                           :::::           |
|                                         ...:::..  .       |
|                                    :..:::::::::::::....   |
|                                   .:::::::::::::::::::'   |
|                                 ::::::::::::::::::::::::  |
|                                :::::::::::::::::::::::::' |
|                     :..:::.   .:::::::::::::::::::::::::: |
|                   .:::::::::. ::::::::::::::::::::::::::  |
|                .. ::::::::::: :::::::::::::::::::::::::'  |
|      '  '     '::':::::::::::'::::::::::::::::::::::::.   |
|                   ::::::::::: ::::::::::::::::::::::::::  |
|                    ':::::::'  ::::::::::::::::::::::::::. |
|                     '  '''     :::::::::::::::::::::::::' |
|                                '::::::::::::::::::::::::' |
|                                 ''::::::::::::::::::::''  |
|                                    ::::::::::::::::::::   |
|                                    '  ''::::::::'':       |
|                                           .:::.           |
|                                           ':::'           |
|                                             :             |
|___________________________________________________[0.5,-1]|

Code

The following code is a simple C program that renders the Mandelbrot set into the terminal (for demonstrative purposes, it isn't efficient and doesn't do any antialiasing).

#include <stdio.h>

#define ROWS 30
#define COLS 60
#define FROM_X -2.0
#define FROM_Y 1.0
#define STEP (2.5 / ((double) COLS))

unsigned int mandelbrot(double x, double y)
{
  double cx = x, cy = y, tmp; // z starts at c (saves one iteration compared to starting at 0)

  for (int i = 0; i < 1000; ++i)
  {
    tmp = cx * cx - cy * cy + x; // z = z^2 + c, complex arithmetic written out in components
    cy = 2 * cx * cy + y;
    cx = tmp;

    if (cx * cx + cy * cy > 1000000000) // |z|^2 exceeded the bailout value: the point escapes
      return 0;
  }

  return 1;
}

int main(void)
{
  double cx, cy = FROM_Y;

  for (int y = 0; y < ROWS; ++y)
  {
    cx = FROM_X;

    for (int x = 0; x < COLS; ++x)
    {
      unsigned int point = 
        mandelbrot(cx,cy) + (mandelbrot(cx,cy + STEP) * 2);   

      putchar(point == 3 ? ':' : (point == 2 ? '\'' : 
        (point == 1 ? '.' : ' ')));

      cx += STEP;
    }

    putchar('\n');

    cy -= 2 * STEP;
  }

  return 0;
}
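
If you save the code e.g. as mandelbrot.c, it should (assuming some C compiler is installed, here gcc) compile and run simply with gcc -o mandelbrot mandelbrot.c && ./mandelbrot, printing out a picture similar to the one above.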

marble_race

Marble Race

Marble race is a simple real life game in which marbles (small glass balls) are released onto a prepared track to race from start to finish by the force of gravity. This game is great because it is suckless, cheap (i.e. accessible), fun and has almost no dependencies, not even a computer -- such a game will be playable even after the technological collapse.

Even though this is a real life game, a computer version can be made too, in different forms: 2D, 3D, realistic or with added elements that would be impossible in real life, such as teleports. And indeed, there have been many games and mini-games made based on this form of entertainment.

From the implementation point of view it is very convenient that marbles are of spherical shape as this is one of the simplest shapes to handle in physics engines.


marketing

Marketing

Marketing is an unethical practice, plentifully used in capitalism, of forcing a product or corporate propaganda by means of lying, tricks, brainwashing, torture, exploiting psychological weaknesses of people and others. This manifests mostly as advertisements and commercials in media but also in other ways such as fake product reviews, product placement etc.

Specific practices used in marketing are:

These practices are not rare, they are not even a behavior of a minority, they are not illegal and people don't even see them as unusual or undesirable. People in the US are so brainwashed they even pay to see commercials (Super Bowl). Under capitalism these practices are the norm and are getting worse and worse every year.

A naive idea still present among people is that ethical marketing is possible or that it's something that can be fixed by some law, a petition or something similar. In late stage capitalism this is not possible as "ethical" marketing is ineffective marketing. Deciding to drop the most efficient weapons in the market warfare will only lead to the company losing customers to competition that embraces the unethical means, eventually going bankrupt and disappearing, leaving the throne to the bad guys. Laws will not help as laws are made first of all to favor the market, corporations pay full time lobbyists and law makers themselves are owners of corporations. Even if some small law against "unethical marketing" passes, the immense force and pressure of all the strongest corporations will work 24/7 on reverting the law and/or finding ways around it, legal or illegal, ethical or unethical.

Marketing people are subhuman. Of course, let us be reminded we love all living beings, even subhumans, but the marketing trash not only don't show any signs of conscience or morals, they hardly seem conscious at all, they are just a robotic tool of capitalism -- however immoral shit they get into, they always just reply "just doing my job" and "it pays well" to anything. They make the worst kind of propaganda which literally kills people, they would mercilessly torture children to death if it was in their contract. A capitalist is screeching HAHAHA IT NOT THE SAME bcuz CHILREN ARE MAGICAL n economy is pwogwesss, so this invalid. Indeed, it doesn't make any sense -- a capitalist will stay what it is, the lowest class of brainwashed NPC incapable of thinking on its own. All in all, avoid anyone who has anything to do with marketing.


markov_chain

Markov Chain

Markov chain is a relatively simple stochastic (working with probability) mathematical model for predicting or generating sequences of symbols. It can be used to describe some processes happening in the real world such as behavior of some animals, Brownian motion or structure of a language. In the world of programming Markov chains are pretty often used for generation of texts that look like some template text whose structure is learned by the Markov chain (Markov chains are one possible model used in machine learning). Chatbots are just one example.

There are different types of Markov chains. Here we will be focusing on discrete time Markov chains with finite state space as these are the ones practically always used in programming. They are also the simplest ones.

Such a Markov chain consists of a finite number of states S1, S2, ..., Sn. Each state Si has a certain probability of transitioning to another state (including transitioning back to itself), i.e. P(Si,S1), P(Si,S2), ..., P(Si,Sn); these probabilities have to, of course, add up to 1, and some of them may be 0. These probabilities can conveniently be written as an n x n matrix.

Basically Markov chain is like a finite state automaton which instead of input symbols on its transition arrows has probabilities.

Example

Let's say we want to create a simple AI for an NPC in a video game. At any time this NPC is in one of these states: A (taking cover), B (searching for a target), C (shooting) or D (throwing a grenade). We could describe the transitions in words: from A the NPC either stays in A (probability 0.5) or moves to B (0.5); from B it stays in B (0.5) or moves to C (0.25) or to D (0.25); from C it stays in C (0.7) or moves to A, B or D (0.1 each); from D it moves to A (0.25), B (0.25) or C (0.5).

Now it's pretty clear this description gets a bit tedious, it's better, especially with even more states, to write the probabilities as a matrix (rows represent the current state, columns the next state):

      A     B     C     D
A     0.5   0.5   0     0
B     0     0.5   0.25  0.25
C     0.1   0.1   0.7   0.1
D     0.25  0.25  0.5   0

We can see a few things: the NPC can't immediately attack from cover, it has to search for a target first. It also can't throw two grenades in succession etc. Let's note that this model will now be yielding random sequences of actions such as [cover, search, shoot, shoot, cover] or [cover, search, search, grenade, shoot] but some of them are more likely than others (for example once the NPC starts shooting, it will shoot at least three times in a row with probability 0.7 * 0.7 = 49%) and some are downright impossible (e.g. two grenades in a row). Notice a similarity to for example natural language: some words are more likely to be followed by some words than others (e.g. the word "number" is more likely to be followed by "one" than for example "cat").
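
To make this tangible, below is a minimal C sketch (just an illustration, all names are made up) that simulates the above NPC: it stores the matrix and picks the next state by generating a random number and walking the current state's row until the accumulated probability exceeds it.

#include <stdio.h>
#include <stdlib.h>

#define STATES 4

const double transition[STATES][STATES] = // the matrix from above
{ //  A     B     C     D
  { 0.5,  0.5,  0,    0    },  // A: cover
  { 0,    0.5,  0.25, 0.25 },  // B: search
  { 0.1,  0.1,  0.7,  0.1  },  // C: shoot
  { 0.25, 0.25, 0.5,  0    }   // D: grenade
};

int nextState(int current)
{
  double r = rand() / ((double) RAND_MAX + 1), // random number in [0,1)
    sum = 0;

  for (int i = 0; i < STATES; ++i)
  {
    sum += transition[current][i]; // walk the row, accumulating probability

    if (r < sum)
      return i;
  }

  return STATES - 1; // fallback in case of floating point rounding
}

int main(void)
{
  int state = 0; // start in state A (cover); call srand(...) to vary the runs

  for (int i = 0; i < 10; ++i) // print a random sequence of 10 states
  {
    printf("%c ",'A' + state);
    state = nextState(state);
  }

  putchar('\n');
  return 0;
}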


math

Mathematics

Mathematics (also math or maths, from Greek mathematicos, learned) is the best science (yes, it is a formal science), which deductively deals with numbers and other abstract structures with the use of pure logic, in as rigorous and objective way as possible. In fact it's the only true science that can actually prove things thanks to its tool of mathematical proof (other sciences may only disprove or show something to be very likely). It is immensely important in programming and computer science. Mathematics is possibly the intellectually most difficult field to study in depth, meant for the smartest people; the difficulty, as some mathematicians themselves say, comes especially from the extremely deep abstraction (pure mathematics often examines subjects that have no known connection to reality and only exist as a quirk of logic itself). It is said that mathematics is the only universal language in our universe -- if we ever get in contact with an intelligent alien civilization, mathematics is likely to be used for communication.

Some see math not as a science but rather a discipline that develops formal tools for "true sciences". The reasoning is usually that a science has to use scientific method, but that's a limited view as scientific method is not the only way of obtaining reliable knowledge. Besides that math can and does use the principles of scientific method -- mathematicians first perform "experiments" with numbers and generalize into conjectures and later "strong beliefs", however this is not considered good enough in math as it actually has the superior tool of proof that is considered the ultimate goal of math. I.e. math relies on deductive reasoning (proof) rather than less reliable inductive reasoning (scientific method) -- in this sense mathematics is more than a science.

Soydevs, coding monkeys (such as webdevs) and just retards in general hate math because they can't understand it. They think they can do programming without math, which is just ridiculous. This delusion stems mostly from these people being highly incompetent and without proper education -- all they've ever seen was a shallow if-then-else python "coding" of baby programs or point-and-click "coding" in gigantic GUI frameworks such as Unity where everything is already preprogrammed for them. By Dunning–Kruger they can't even see how incompetent they are and what real programming is about. In reality, this is like thinking that being able to operate a calculator makes you a capable mathematician or being able to drive a car makes you a capable car engineer. Such people will be able to get jobs and do some repetitive tasks such as web development, Unity game development or system administration, but they will never create anything innovative and all they will ever make will be ugly, bloated spaghetti solutions that will likely do more harm than good.

On the other hand, one does not have to be a math PhD in order to be a good programmer in most fields. Sure, knowledge and overview of advanced mathematics is needed to excel, to be able to spot and sense elegant solutions, but beyond these essentials that anyone can learn with a bit of will it's really more about just not being afraid of math, accepting and embracing the fact that it permeates what we do and studying it when the study of a new topic is needed.

The power of math is limited. In 1931 Kurt Gödel mathematically proved, with his incompleteness theorems, that (basically) there are truths which math itself can never prove, and that math itself cannot prove its own consistency (which killed the so called Hilbert's program which sought to do exactly that). This is related to the limited power of computers due to undecidability (there are problems a computer can never decide), proven by Alan Turing.

Overview

Following are some math areas and topics which a programmer should be familiar with:


memory_management

Memory Management

In programming memory management is (unsurprisingly) the act and various techniques of managing the working memory (RAM) of a computer, i.e. for example dividing the total physically available memory among multiple memory users such as operating system processes and assuring they don't illegally access each other's part of memory. The scope of the term may differ depending on context, but tasks falling under memory management may include e.g. memory allocation (finding and assigning blocks of free memory) and deallocation (freeing such blocks), ensuring memory safety, organizing blocks of memory and optimizing memory access (e.g. with caches or data reorganization), memory virtualization and related tasks such as address translation, handling out-of-memory exceptions etc.

Memory management can be handled at different levels: hardware units such as the MMU and CPU caches exist to perform certain time-critical memory-related tasks (such as address translation) quickly, operating system may help with memory management (e.g. implement virtual memory and offer syscalls for dynamic allocation and deallocation of memory), a programming language may do some automatic memory management (e.g. garbage collection or handling call stack) and programmer himself may do his own memory management (e.g. deciding between static and dynamic allocation or choosing the size of dynamic allocation chunk).

Why all this fuss? As a newbie programmer who only works with simple variables and high level languages like Python that do everything for you, you don't need to do much memory management yourself, but when working with data whose size may wildly differ and is not known in advance (e.g. files), someone has to handle e.g. the possibility of the data on disk not fitting into the RAM currently allocated for your program, or -- if the data fits -- there may not be a big enough continuous chunk of memory for it. If we don't know how much memory a process will need, how much memory do we give it (too little and it may not be enough, too much and there will not be enough memory for others)? Someone has to prevent memory leaks so that your computer doesn't run out of memory due to bugs in programs. With many processes running simultaneously on a computer someone has to keep track of which process uses which part of memory and ensure collisions (one process overwriting another process's memory) don't happen, and someone needs to make sure that if bad things happen (such as a process trying to write to memory that doesn't belong to it), they don't have catastrophic consequences like crashing or exploding the system.

Memory Management In C

In C -- a low level language -- you need to do a lot of manual memory management and there is a big danger of fucking up, especially with dynamic allocation -- C won't hold your hand (but as a reward your program will be fast and efficient), there is no uber memory safety. There is no automatic garbage collection, i.e. if you allocate memory dynamically, YOU need to keep track of it and manually free it once you're done using it, or you'll end up with memory leak.

For a start let's see which kinds of allocation (and their associated parts of memory) there are in C: static allocation (memory reserved once at compile time, used for global variables and static local variables), automatic allocation (the stack, used for ordinary local variables, allocated and freed automatically as functions are entered and left) and dynamic allocation (the heap, memory requested at run time with functions like malloc and returned manually with free).

Rule of thumb: use the simplest thing possible, i.e. static allocation if you can, if not then automatic and only as the last option resort to dynamic allocation. The good news is that you mostly won't need dynamic allocation -- you basically only need it when working with data whose size can potentially be VERY big and is unknown at compile time (e.g. you need to load a WHOLE file AT ONCE which may potentially be VERY big). In other cases you can get away with static allocation (just reserving some reasonable amount of memory in advance and hoping the data fits, e.g. a global array such as int myData[DATA_MAX_SIZE]) or automatic allocation if the data is reasonably small (i.e. you just create a variable sized array inside some function that processes the data). If you end up doing dynamic allocation, be careful, but it's not THAT hard to do it right (just pay more attention) and there are tools (e.g. valgrind) to help you find memory leaks. However by the principles of good design you should avoid dynamic allocation if you can, not only because of the potential for errors and worse performance, but most importantly to avoid dependencies and complexity.

For pros: you can also create your own kind of pseudo dynamic allocation in pure C if you really want to avoid using stdlib or can't use it for some reason. The idea is to allocate a big chunk of memory statically (e.g. global unsigned char myHeap[MY_HEAP_SIZE];) and then create functions for allocating and freeing blocks of this static memory (e.g. myAlloc and myFree with the same signatures as malloc and free; see the sketch below). This allows you to use memory more efficiently than if you just dumbly (is it a word?) preallocate everything statically, i.e. you may need less total memory; this may be useful e.g. on embedded. Yet another uber hack to "improve" this may be to allocate the "personal heap" on the stack instead of statically, i.e. you create something like a global pointer unsigned char *myHeapPointer; and a global variable unsigned int myHeapSize;, then somewhere at the beginning of main you compute the size myHeapSize and then create a local array myHeap[myHeapSize], then finally set the global pointer to it as myHeapPointer = myHeap; the rest remains the same (your allocation function will access the heap via the global pointer). Just watch out for reinventing wheels, bugs and that you actually don't end up with a worse mess than if you took a simpler approach. Hell, you might even try to write your own garbage collection and array bound checking and whatnot, but then why not just fuck it and use an already existing abomination like Java? :)
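
The simplest version of this idea is a mere "bump" allocator, sketched here (illustrative only: it can only free all blocks at once and ignores alignment; a true malloc/free replacement would additionally have to keep a record for each block):

#define MY_HEAP_SIZE 65536

unsigned char myHeap[MY_HEAP_SIZE]; // our "heap", allocated statically
unsigned int myHeapTop = 0;         // index of the first free byte

void *myAlloc(unsigned int size)    // used like malloc
{
  if (myHeapTop + size > MY_HEAP_SIZE)
    return 0;                       // out of memory

  // NOTE: real code must also take care of alignment of returned blocks
  void *block = myHeap + myHeapTop;
  myHeapTop += size;
  return block;
}

void myFreeAll(void)                // unlike free, frees everything at once
{
  myHeapTop = 0;
}

int main(void)
{
  int *numbers = myAlloc(10 * sizeof(int)); // use like malloc

  if (numbers)
    for (int i = 0; i < 10; ++i)
      numbers[i] = i;

  myFreeAll(); // all blocks gone at once
  return 0;
}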

Finally let's see some simple code example:

#include <stdio.h>
#include <stdlib.h> // needed for dynamic allocation :(

#define MY_DATA_MAX_SIZE 1024 // if you'll ever need more, just change this and recompile

unsigned char staticMemory[MY_DATA_MAX_SIZE]; // statically allocated array :)
int simpleNumber; // this is also allocated statically :)

void myFunction(int x)
{
  static int staticNumber;  // this is allocated statically, NOT on stack
  int localNumber;          // this is allocated on stack
  int localArray[x + 1];    // variable size array, allocated on stack, hope x isn't too big

  localNumber = 2 * x;      // do something with the memory
  localArray[x] = localNumber;

  if (x > 0)                // recursively call the function
    myFunction(x - 1);
}

int main(void)
{
  int localNumberInMain = 123; // this is also allocated on stack    

  myFunction(10);  // change to 10000000 to see a probable stack overflow

  for (int i = 0; i < 200000; ++i)
  {
    if (i % 1000 == 0)
      printf("i = %d\n",i);

    unsigned char *dynamicMemory = (unsigned char *) malloc((i + 1) * 10000); // oh no, dynamic allocation, BLOAAAT!

    if (!dynamicMemory)
    {
      printf("Couldn't allocate memory, there's probably not enough of it :/");
      return 1;
    }

    dynamicMemory[i * 128] = 123; // do something with the memory

    free(dynamicMemory); // if not done, memory leak occurs! try to remove this and see :)         
  }

  return 0;
}

mental_outlaw

Mental Outlaw

Mental Outlaw is a black/N-word youtuber/vlogger focused on FOSS and, to a considerable degree, suckless software. He's kind of a copy-paste of Luke Smith but a little closer to the mainstream and normies.

Like with Luke, sometimes he's real based and sometimes he says very stupid stuff. Make your own judgement.


microsoft

Micro$oft

Micro$oft (officially Microsoft, MS) is a terrorist organization, a software corporation named after its founder's dick -- it is, along with Google, Apple et al, one of the biggest organized crime groups in history, best known for holding the world captive with its highly abusive "operating system" called Windows, as well as for leading an aggressive war on free software and utilizing many unethical and/or illegal business practices such as destroying any potential competition with the Embrace Extend Extinguish (actual terminology internally used at Microsoft) strategy or lately practicing heavy openwashing.

{ Techrights documents Microsoft nicely on their wiki, see e.g. "Microsoft sins" at http://techrights.org/wiki/List_of_Microsoft_Sins, which among others list illegally shooting an antelope, tax evasion, attacking (physically and verbally) employees who leave for other companies and bribing bloggers to write positive reviews. ~drummyfish }

Microsoft is unfortunately among the absolute most powerful entities in the world (that sucks given they're also among the most hostile ones) -- likely more powerful than any government and most other corporations, it is in their power to immediately destroy any country with the push of a button, it's just a question of when this also becomes their interest. This power is due to them having complete control over the absolute majority of personal computers in the world (and therefore by extension over all devices, infrastructure, organization etc.), through their proprietary (malware) "operating system" Windows that has a built-in backdoor, allowing Microsoft immediate access and control over practically any computer in the world. The backdoor "feature" isn't even hidden, it is officially and openly admitted (it is euphemistically called auto updates). Microsoft prohibits studying and modification of Windows under threats including physical violence (tinkering with Windows violates its EULA which is a legally binding license, and law can potentially be enforced by police using physical force). Besides legal restrictions Microsoft applies high obfuscation, bloat, SAASS and other techniques preventing user freedom and defense against terrorism, and forces its system to be installed in schools, governments, power plants, hospitals and basically on every computer anyone buys. Microsoft can basically (for most people) turn off the Internet, electricity, traffic control systems etc. Therefore every hospital, school, government and any other institution has to bow to Microsoft.

TODO: it would take thousands of books to write just a fraction of all the bad things, let's just add the most important ones


microtheft

Microtheft

See microtransaction.


microtransaction

Microtransaction

Microtransaction, also microtheft, is the practice of selling -- for a relatively "low" price -- virtual goods in some virtual environment, especially games, by the owner of that environment. It's a popular business model of many capitalist games -- players have an "option" (which they are pushed to take) to buy things such as skins and purely cosmetic items but also items giving an unfair advantage over other players (in-game currency, stronger weapons, faster leveling, ...). This is often targeted at children.

Not only don't they show you the source code they run on your computer, not only don't they even give you an independently playable copy of the game you paid for, not only do they spy on you, they also have the audacity to ask for more and more money after you've already paid for the thing that abuses you.


minigame

Minigame

Minigame is a very small and simple game intended to entertain the player in a simple way, usually for only a short amount of time, unlike a full fledged game. Minigames are often embedded into a bigger game (as an easter egg or as a part of a game mechanic such as lock picking), they may come as an extra feature on primarily non-gaming systems, or appear in collections of many minigames as a bigger package (e.g. various party game collections). Minigames include e.g. minesweeper, sokoban, the Google Chrome T-rex game, Simon Tatham's Portable Puzzle Collection, as well as many of the primitive old games like Pong and Tetris. Minigames are nice from the LRS point of view as they are minimalist, simple to create, often portable, while offering a potential for great fun nevertheless.

Minigame is an ideal project for learning programming.

Despite the primary purpose of minigames many players invest huge amounts of time into playing them, usually competitively e.g. as part of speedrunning.

Minigames are still very often built on the principles of old arcade games such as getting the highest score or the fastest time. For this they can greatly benefit from procedural generation (e.g. endless runners).

List Of Minigames

This is a list of just some of many minigames and minigame types.


minimalism

Technological Minimalism

No gain, no pain.

Technological minimalism is a philosophy of designing technology to be as simple as possible while still achieving a given goal, possibly even a little bit simpler. Minimalism is one of the most (if not the most) important concepts in programming and technology in general. Remember, minimalism is firstly about internal simplicity, i.e. the simplicity of design/repairing/hacking, and only secondly about the simplicity from the user's point of view (otherwise we are only dealing with pseudominimalism).

Antoine de Saint-Exupéry sums it up with a quote: we achieve perfection not when there is nothing more to add, but when there is nothing left to take away.

Minimalism is necessary for freedom as a free technology can only be that over which no one has a monopoly, i.e. which many people and small parties can utilize, study and modify with affordable effort, without needing armies of technicians just for the maintenance of such technology. Minimalism goes against the creeping overcomplexity of technology which always brings huge costs and dangers, e.g. the cost of maintenance and further development, obscurity, inefficiency ("bloat", wasting resources), consumerism, the increased risk of bugs, errors and failure.

Up until recently in history every engineer would tell you that the better machine is that with fewer moving parts. This still seems to hold e.g. in mathematics, a field not yet so spoiled by huge commercialization and mostly inhabited by the smartest people -- there is a tendency to look for the most minimal equations -- such equations are considered beautiful. Science also knows this rule as the Occam's razor. In technology invaded by aggressive commercialization the situation is different, minimalism lives only in the underground and is ridiculed by the mainstream propaganda. Some of the minimalist movements, terms and concepts include:

Under capitalism technological minimalism is suppressed in the mainstream as it goes against corporate interests, i.e. those of having monopoly control over technology, even if such technology is "FOSS" (which then becomes just a cool brand, see openwashing). We may, at best, encounter a "shallow" kind of minimalism, so called pseudominimalism which only tries to make things appear minimal, e.g. aesthetically, and hides ugly overcomplicated internals under the facade. Apple is famous for this shit.

There are movements such as appropriate technology (described by E. F. Schumacher in a work named Small Is Beautiful: A Study of Economics As If People Mattered) advocating for small, efficient, decentralized technology, because that is what best helps people.

Does minimalism mean we have to give up the nice things? Well, not really, it is more about giving up the bullshit, and changing an attitude. We can still have technology for entertainment, just a non-consumerist one -- instead of consuming 1 new game per month we may rather focus on creating deeper games that may last longer, e.g. those of an easy to learn, hard to master kind and building communities around them, or on modifying existing games rather than creating new ones from scratch over and over. Sure, technology would LOOK different, our computer interfaces may become less of a thing of fashion, our games may rely more on aesthetics than realism, but ultimately minimalism can be seen just as trying to achieve the same effect while minimizing waste. If you've been made addicted to bullshit such as buying a new GPU each month so that you can run games at 1000 FPS at progressively higher resolution then of course yes, you will have to suffer a bit of a withdrawal just as a heroin addict suffers when quitting the drug, but just as him in the end you'll be glad you did it.

There is a so called airplane rule that states that a plane with two engines has twice as many engine problems as a plane with a single engine.

Importance Of Minimalism: Simplicity Brings Freedom

It can't be stressed enough that minimalism is absolutely required for technological freedom, i.e. people having, in practical ways, control over their tools. While in today's society it is important to have legal freedoms, i.e. support free software, we must not forget that this isn't enough, a freedom on paper means nothing if it can't be practiced. We need both legal AND de facto freedom over technology, the former being guaranteed by a free license, the latter by minimalism. Minimal, simple technology will increase the pool of people and parties who may practice the legal freedoms -- i.e. those to use, study, modify and share -- and therefore ensure that the technology will be developed according to what people need, NOT according to what a corporation needs (which is usually the opposite).

Even if a user of software is not a programmer himself, it is important he chooses to use minimal tools because that makes it more likely his tool can be repaired or improved by SOMEONE from the people. Some people naively think that if they're not programmers, it doesn't matter if they have access and rights to the program's source code, but indeed that is not the case. You want to choose tools that can easily be analyzed and repaired by someone, even if you yourself can't do it.

Minimalism and simplicity increases freedom even of proprietary technology which can be seen e.g. on games for old systems such as GameBoy or DOS -- these games, despite being proprietary, can and are easily and plentifully played, modified and shared by the people, DESPITE not being free legally, simply because it is easy to handle them due to their simplicity. This just further confirms the correlation of freedom and minimalism.


mipmap

Mipmap

Mipmap (from Latin multum in parvo, much in little) is a digital image that is stored along with progressively smaller versions of itself; mipmaps are useful in computer graphics, especially as a representation of textures in which they may eliminate aliasing during rendering. But mipmaps also have other uses such as serving as acceleration structures or helping performance (using a smaller image can speed up memory access). Mipmaps are also sometimes called pyramids because we can imagine the images of different sizes laid one on another to form such a shape.

A basic form of a mipmap can be explained on the following example. Let's say we have an RGB image of size 1024x1024 pixels. To create its mipmap we call the base image level 0 and create progressively smaller versions (different levels) of the image by reducing the size four times (twice along one dimension) at each step. I.e. level 1 will be the base image downscaled to the size 512x512. If we are to use the mipmap for the common purpose of reducing aliasing, the downscaling itself has to be done in a way that doesn't introduce aliasing; this can be done e.g. by downscaling 2x2 areas in the base image into a single pixel by averaging the values of those 4 pixels (the averaging is what will prevent aliasing; other downscaling methods may be used depending on the mipmap's purpose, for example for a use as an accelerating structure we may take a maximum or minimum of the 4 pixels). Level 2 will be an image with resolution 256x256 obtained from the 512x512 image, and so on until the last level with size 1x1. In this case we'll have 11 levels which together form our mipmap.
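
As an illustration (just a sketch, not any standard API), one level of such a mipmap may be computed from the previous one like this in C, here for a single channel 8bit image stored as a row major byte array (for RGB simply do this per channel):

// halves the input image (inW x inH, both even) into out by averaging 2x2 areas;
// out must be able to hold (inW / 2) * (inH / 2) bytes
void mipmapHalve(const unsigned char *in, unsigned char *out,
  unsigned int inW, unsigned int inH)
{
  unsigned int outW = inW / 2, outH = inH / 2;

  for (unsigned int y = 0; y < outH; ++y)
    for (unsigned int x = 0; x < outW; ++x)
      out[y * outW + x] = (              // average of the 2x2 area:
        in[2 * y * inW + 2 * x] +
        in[2 * y * inW + 2 * x + 1] +
        in[(2 * y + 1) * inW + 2 * x] +
        in[(2 * y + 1) * inW + 2 * x + 1]) / 4;
}

Calling this repeatedly (1024x1024 -> 512x512 -> ... -> 1x1) yields all the levels of the mipmap.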

This RGB mipmap can be shown (and represented in memory) as a "fractal image":

     _______________________________
    |               |               |
    |               |               |
    |    level 0    |    level 0    |
    |      red      |     green     |
    |    channel    |    channel    |
    |               |               |
    |               |               |
    |_______________|_______________|
    |               |level 1|level 1|
    |               |  red  | green |
    |    level 0    |channel|channel|
    |      blue     |_______|_______|
    |    channel    |level 1|l2r|l2g|
    |               |  blue |___|___|
    |               |channel|l2b|_|_|
    |_______________|_______|___|_|+|

This may be how a texture is represented inside a graphics card if we upload it (e.g. with OpenGL). When we are rendering e.g. a 3D model with this texture and the model ends up being rendered at the screen in such size that renders the texture smaller than its base resolution, the renderer (e.g. OpenGL) automatically chooses the correct level of the mipmap (according to Nyquist-Shannon sampling theorem) to use so that aliasing won't occur. If we're using a rendering system such as OpenGL, we may not even notice this is happening, but indeed it's what's going on behind the scenes (OpenGL and other systems have specific functions for working with mipmaps manually if you desire).
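
For illustration, in OpenGL (version 3.0 and newer) having the mipmap levels generated and trilinear filtering (see below) enabled for a currently bound texture is typically a matter of something like:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glGenerateMipmap(GL_TEXTURE_2D); // let the driver compute all the levels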

Do we absolutely need to use mipmaps in rendering? No, some simple (mostly software) renderers don't use them and you can turn mipmaps off even in OpenGL. Some renderers may deal with aliasing in other ways, for example by denser sampling of the texture which will however be slower (in this regard mipmaps can be seen as precomputed, already antialiased version of the image which trades memory for speed).

We can also decide to not deal with aliasing in any way, but the textures will look pretty bad when downscaled on the screen (e.g. in the distance). They are kind of noisy and flickering, you can find examples of this kind of messy rendering online. However, if you're using low resolution textures, you may not even need mipmaps because such textures will hardly ever end up downscaled -- this is an advantage of the KISS approach.

One shortcoming of the explained type of mipmaps is that they are isotropic, i.e. they suppose the rendered texture will be scaled uniformly in all directions, which may not always be the case, especially in 3D rendering. Imagine a floor rendered when the camera is looking forward -- the floor texture may end up being downscaled in the vertical direction but upscaled in the horizontal direction. If in this case we use our mipmap, we will prevent aliasing, but the texture will be rendered in lower resolution horizontally. This is because the renderer has chosen a lower resolution of the texture due to downscale (possible aliasing) in vertical direction, but horizontal direction will display the texture upscaled. This may look a bit weird, but it's completely workable; it can be seen in most older 3D games.

The above issue is addressed mainly by two methods.

The first is trilinear filtering which uses several levels of the mipmap at once and linearly blends between them. This is alright but still shows some artifacts such as visible changes in blurriness.

The second method is anisotropic filtering which uses different, anisotropic mipmaps. Such mipmaps store more versions of the image, resized in many different ways. This method is nowadays used in quality graphics.


mob_software

Mob Software

Not to be confused with mob programming.

TODO (read https://www.dreamsongs.com/MobSoftware.html)


moderation

Moderation

Moderation is a euphemism for censorship encountered mostly in the context of Internet communication platforms (forum discussions, chats etc.).


modern

Modern

"Everything that modern culture hates is good, and everything that modern culture loves is bad." --fschmidt from reactionary software

So called modern software/hardware and other technology might as well be synonymous with shitty abusive technology. In a capitalist age when everything is getting progressively worse in terms of design, quality, ethicality, efficiency, etc., newer means worse, therefore modern (newest) means the worst. In other words modern is a term that stands for "as of yet best optimized for exploiting users". At LRS we see the term modern as negative -- for example whenever someone says "we work with modern technology", he is really saying "we are working with the worst technology we know of".

The word modern was similarly addressed e.g. by reactionary software -- it correctly identifies the word as being connected to a programming orthodoxy of current times, the one that's obsessed with creating bad technology and rejecting good technology. { I only found reactionary software after this article has been written. ~drummyfish }

Modern Vs Old Technology

It's sad and dangerous that newer generations won't even remember that technology used to be better, people will soon think that the current disgusting state of technology is the best we can do. That is of course wrong, technology used to be relatively good. It is important we leave here a note on at least a few ways in which the old was much, much better.

(INB4 "it was faster and longer on battery etc. because it was simpler" -- yes, that is exactly the point.)


modern_software

Modern Software

See modern.


monad

Monad

{ This is my poor understanding of a monad. ~drummyfish }

Monad is a mathematical concept which has become useful in functional programming and is one of the very basic design patterns in this paradigm. A monad basically wraps some data type into an "envelope" type and gives a way to operate with these wrapped data types which greatly simplifies things like error checking or abstracting input/output side effects.

A typical example is a maybe monad which wraps a type such as integer to handle exceptions such as division by zero. A maybe monad consists of:

  1. The maybe(T) data type where T is some other data type, e.g. maybe(int). A value of type maybe(T) is either nothing (signaling an error or a missing value) or just(X) where X is a value of type T.
  2. A special function return(X) that converts a value of given type into the maybe type, e.g. return(3) will return just(3).
  3. A special combinator X >>= f which takes a monadic (maybe) value X and a function f (taking a plain value and returning a maybe value) and does the following: if X is nothing, the result is nothing (the error propagates), otherwise X is just(Y) and the result is f(Y).

Let's look at a pseudocode example of writing a safe division function. Without using the combinator it's kind of ugly:

divSafe(x,y) = // takes two maybe values, returns a maybe value
  if x == nothing
    nothing
  else if y == nothing
    nothing
  else if y == 0
    nothing
  else
    just(x / y)

With the combinator it gets much nicer (note the use of lambda expression):

divSafe(x,y) =
  x >>= { a: y >>= { b: if b == 0 nothing else just(a / b) } }

Languages will typically make this even nicer with syntax sugar such as:

divSafe(x,y) = do
  a <- x,
  b <- y,
  if b == 0 nothing else return(a / b)
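
For a rough taste of the idea outside functional languages, here is an illustrative C sketch of a maybe(int) type with the >>= combinator (called bind here; all names are made up, and since C has no lambdas we chain a plain one-parameter function instead of writing divSafe):

#include <stdio.h>

typedef struct
{
  int isNothing; // 1 if the value is "nothing"
  int value;     // the wrapped value, only valid if isNothing == 0
} MaybeInt;

MaybeInt just(int x)   { MaybeInt r = {0, x}; return r; }
MaybeInt nothing(void) { MaybeInt r = {1, 0}; return r; }

MaybeInt bind(MaybeInt x, MaybeInt (*f)(int)) // the >>= combinator
{
  return x.isNothing ? x : f(x.value); // nothing propagates, otherwise apply f
}

MaybeInt halveEven(int x) // halves x, fails (returns nothing) on odd numbers
{
  return (x % 2) ? nothing() : just(x / 2);
}

int main(void)
{
  MaybeInt a = bind(just(12),halveEven); // just(6)
  MaybeInt b = bind(just(7),halveEven);  // nothing (7 is odd)
  MaybeInt c = bind(b,halveEven);        // still nothing, the error propagates

  printf("%d %d %d\n",
    a.isNothing ? -1 : a.value,
    b.isNothing ? -1 : b.value,
    c.isNothing ? -1 : c.value);         // prints 6 -1 -1

  return 0;
}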

TODO: I/O monad TODO: general monad TODO: example in real lang, e.g. haskell


money

Money

Money spoils everything, and in capitalism money is everywhere.


morality

Morality

Morality is the sense of greater values of an individual and society from which it follows what's ultimately right, wrong, good and bad/evil on a greater level, for a "greater good", without succumbing to low instincts such as self interest, self preservation, immediate pleasure etc. Morality is what greatly distinguishes man from animal and allows him to act not on mere instincts and reactions to immediate stimuli, it is driven by the higher forces such as beliefs, logic, empathy, love, conscience, religion and science. Examples of moral (good) behavior include altruism, selflessness, communism in general sense, less retarded society and non violence, while examples of IMMORALITY (evil) might be capitalism, fascism, rape, pedophobia, genocide, marketing, proprietary software, nationalism and LGBT.

Morality is very similar to ethics, to the point of often being used interchangeably, however we may still find slight differences. While morality is seen as something personal and intuitive, greatly driven by conscience and judged on a case-by-case basis, ethics is perceived more as a set of informal, often unwritten shared rules to assure morality in a larger group of individuals, i.e. ethics is an agreement on a way of behavior between individuals, each of which may have slightly different personal morals. Ethics is also sometimes defined as the branch of philosophy concerned with examining morality.

Morality is much different from legality. Ideally it is said that laws should be the minimum (a proper subset) of morality, i.e. laws should be the officially codified, approved and enforced rules that ensure the very basic moral behavior is sustained, such as people not murdering others, however laws CANNOT with the best of our effort ever capture the infinitely complex nature of morals (no one can ever write down EXACTLY what is and isn't moral in every single imaginable situation that can arise in real world), so it is seen as inevitable that laws will always allow some slightly immoral actions (imagine e.g. someone giving a bad advice to someone else on purpose just to see the other one fail -- this may be legal but is likely immoral). This is accepted because the other option, i.e. law trying to prevent ALL immoral behavior, would be too restrictive and would also inevitably prevent a huge amount of moral, useful and essential behavior; imagine e.g. law trying to prevent giving bad advice by banning all communication altogether. However, this ideal of "laws as a minimum of morals" doesn't hold in practice because law is hugely abused and manipulated to serve the evil, so not only does it allow immoral behavior (which would be kind of OK), it BANS moral behavior (which is unacceptable from the idealist point of view), for example it is prohibited to share useful information ("intellectual property"), repairing (DRM), living in an abandoned house one doesn't "officially own" etc. Furthermore laws themselves in principle have a negative effect on morality because people unfortunately start replacing morality with legality; as laws get more complex and in control of our everyday lives, people only start deciding and judging actions based on a question of "is it legal?" rather than "is it moral?" -- indeed, if nowadays you accuse someone of doing something wrong, he will almost definitely reply something along the lines of "I can legally do that so shut up." Laws destroy morality, hence laws have to be cancelled (see anarchism) and we have to focus only on developing our sense of morality better.

See Also


motivation

Motivation

There are too many motivated people, and too few passionate ones.


murderer

Murderer

You misspelled entrepreneur.


music

Music

Music is an auditory art whose aim is to create pleasant sound of longer duration that usually adheres to some rules and structure, such as those of melody, harmony, rhythm and repetition. Music has played a huge role throughout all history of human culture. It is impossible to precisely define what music is as the term is fuzzy, i.e. its borders are unclear; what one individual or culture considers music may to another one sound like noise without any artistic value, and whatever rule we set in music is never set in stone and will be broken by some artists (there exists music without chords, melody, harmony, rhythm, repetition... even without any sound at all, see Four Minutes Thirty Three Seconds). Music is mostly created by singing and playing musical instruments such as piano, guitar or drums, but it may also contain other sounds; it can be recorded and played back, and computers are nowadays widely used in all of creating, recording and playing it back.

Music is deeply about math, though most musicians don't actually have much clue about it and just play "intuitively", by feel and by ear. Nevertheless the theory of scales, musical intervals, harmony, rhythm and other elements of music is quite complex.

Copyright of music: TODO (esp. soundfonts etc.).

Modern Western Music + How To Just Make Noice Music Without PhD

{ I don't actually know that much about the theory, I will only write as much as I know, which is possibly somewhat simplified, but suffices for some kind of overview. Please keep this in mind and don't eat me. ~drummyfish }

Our current western music is almost exclusively based on major and minor diatonic scales with 12 equal temperament tuning -- i.e. basically our scales differ just by their transposition and have the same structure, that of 5 whole tone steps and 2 semitone steps in the same relative places, AND a semitone step always corresponds to the multiplying factor of the 12th root of 2 -- this all is basically what we nowadays find on our pianos and in our songs and other compositions. 4/4 rhythm is most common but other ones appear, e.g. 3/4. Yeah this may sound kinda too nerdy, but it's just to set clear what we'll work with in this section. Here we will just suppose this kind of music. Also western music has some common structures such as verses, choruses, bridges etc.; lyrics of music follow many rules of poetry, they utilize rhymes, rhythm based on syllables, choice of pleasant sounding words etc.

Why are we using this specific scale n shit, why are the notes like this bruh? TODO

Music is greatly about breaking the rules, like most other art anyway -- you have to learn the rules and respect them most of the time, but respecting them all the time results in sterile, soulless music; a basic rule is therefore to break the rules IN APPROPRIATE PLACES, i.e. where such a rule break will result in emotional response, in something interesting, unique, ... This includes for example leaving the scale for a while, adding disharmony, adding/skipping a beat in one bar, ...

If you wanna learn music, firstly you should get something with a piano keyboard: musical keyboard, electronic piano, even virtual software piano, ... The reason being that the keys really help you understand what's going on, the piano keyboard quite nicely visually represents the notes (there is a reason practically all music software uses the piano roll). Guitar or flute on the other hand will seem much more confusing; of course you can learn these instruments, but first start with the piano keyboard.

    | |C#| |D#| | |F#| |G#||A#| | |
    | |Db| |Eb| | |Gb| |Ab||Bb| | |
    | |__| |__| | |__| |__||__| | |__
... |   |   |   |   |   |   |   |   | ...
    | C | D | E | F | G | A | B | C |
   _|___|___|___|___|___|___|___|___|__   

Tones on piano keyboard, the "big keys" are white, the "smaller keys on top" are black.

OK so above we have part of a piano keyboard, tones go from lower (left) to higher (right), the keyboard tones just repeat the same above and below. The white keys are named simply A, B, C, ..., the black keys are named by their neighboring white key either by adding # (sharp) to the left note or by adding b (flat) to the right note (notes such as C# and Db can be considered the same within the scales we are dealing with). Note: it is convenient to see C as the "start tone" (instead of A) because then we get a nice major scale that has no black keys in it and is easy to play on piano; just ignore this and suppose we kind of "start" on C for now.

Take a look at the C note at the left for example; we can see there is another C on the right; the latter C is one octave above, i.e. it is the "same" note by name but it is also higher (for this we sometimes number the notes as C2, C3 etc.). The same goes for any other tone, each one has its different versions in different octaves. Kind of like the color red has different versions, a lighter one, a darker one etc. Octave is a basic interval we have to remember, a tone that's one octave above another tone has twice its frequency, so e.g. if C2 has 65 hertz, C3 has 130 hertz etc. This means that music intervals are logarithmic, NOT linear! I.e. an interval (such as octave) says a number by which we have to MULTIPLY a frequency to get the higher frequency, NOT a number which we would have to add. This is extremely important.

Other important intervals are the tone and the semitone. Semitone is a step from one key to the immediately next key (even from white to black and vice versa), for example from C to C#, from E to F, from G# to A etc. A tone is two semitones, e.g. from C to D, from F# to G# etc. There are 12 semitones in one octave (you have to make 12 steps from one tone to get to that tone's higher octave version), so a semitone has a multiplying factor of 2^(1/12) (12th root of two). For example C2 being 65 hertz, C#2 (one semitone up) is 65 * 2^(1/12) ~= 69 hertz. This makes sense as if you make 12 steps you just multiply by the 12th root of two twelve times and are left simply with multiplying by 2, i.e. one octave.
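
A tiny sketch to play with these numbers (illustrative only; compile with -lm for the math library, 65 Hz for C2 is rounded):

#include <stdio.h>
#include <math.h>

int main(void)
{
  const char *names[12] =
    {"C","C#","D","D#","E","F","F#","G","G#","A","A#","B"};

  double c2 = 65.0; // approximate frequency of C2 in hertz

  for (int i = 0; i <= 12; ++i) // go one octave up, semitone by semitone
    printf("%s%d: %f Hz\n",names[i % 12],2 + i / 12,
      c2 * pow(2.0,i / 12.0));

  return 0;
}

The last printed value (C3) comes out as exactly twice the first one (C2), confirming the octave rule.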

TODO: chords, scales, melody, harmony, beat, bass, drums, riffs, transpositions, tempo, polyphony ...

Music And Computers/Programming

TODO: midi, bytebeat, tracker music, waveforms, formats, procedural music, AI music, ...


myths

Myths

This is a list of myths and common misconceptions.


name_is_important

Name Is Important

The name of a philosophy, project, movement, group, ideology etc. plays a more significant role than a common man believes. A naive view is that a name is just an identifier, a common man will rather believe a politician's promise than the name of his party; however a name is much more than a mere string of letters, it is the single most stable defining feature of an entity; everything else, all the books and knowledge associated with it may be distorted by history, but the name will always stay the same and will hold scrutiny over all actions of the entity, it will always be a permanent reminder to every follower of what he is trying to achieve. But what if the name of the movement changes? Then it is to be considered a new, different movement. The name usually holds the one true goal.

For this we have to keep in mind two things:

Nevertheless keep in mind that while the power of the name is great, it is not infinite and the above may not hold if stronger forces are at play -- there have been many cases of name abuse in history, notably e.g. by Nazism whose name stands for "national socialism" but whose actions were completely anti-socialist, or so called "Anarcho" capitalism which abuses the name anarchism despite being completely anti-anarchist.

Let us comment on a few examples:


nc

NC

See also ND.

In the context of licenses the acronym NC stands for non-commercial and means "only non-commercial use allowed", which is an unpopular limitation that makes such a license by definition proprietary (i.e. NOT a free cultural license). This means that a work shared under a license with the NC clause is prohibited from being used commercially (which is itself a very unclear condition), which greatly limits the freedom of such a work and opens the door to legal fuzziness and therefore possible bullying. The NC limitation appears most notably in two Creative Commons licenses: CC BY-NC-SA and CC BY-NC-ND; again, despite these licenses being Creative Commons, they are NOT free as in freedom licenses -- note that this is not an opinion or controversial statement, NC licenses very clearly break the consensual definition of free cultural works and Creative Commons themselves clearly state this is the case; they justify NC licenses as part of the proprietary-free license spectrum, standing somewhere in between "all rights reserved" and free cultural licenses. Even though to free culture newcomers NC licenses don't seem like such a big deal, they are in fact extremely harmful to free culture, DO NOT USE NC LICENSES. NC is similar (and similarly harmful) to another proprietary license limitation: ND (no derivatives allowed). If you use an NC license, you're a huge cocksucker.

Why are NC licenses bad? Firstly the Definition of Free Cultural Works project that maintains the widely accepted definition of free culture has an article on this: https://freedomdefined.org/Definition. Let us write a summary of the arguments ALONG WITH our own arguments:


nd

ND

See also NC.

In the context of licenses the acronym ND stands for no derivatives and means "no derivative works allowed" -- this is an unpopular limitation that makes such a license by definition proprietary (i.e. NOT a free cultural license). A work licensed under a license with ND clause -- most notably the CC BY-NC-ND license -- prohibits anyone from making derivative (i.e. modified) works from the original work, on grounds of copyright, which goes against one of the very fundamental ideas of free culture (and just any sane culture), that of free remixing, improvement, combining and reuse of art; it kills artistic freedom, culture, opens the door to legal bullying and strengthens the harmful capitalist idea of "intellectual property". All in all ND licenses are cancer, NEVER USE THEM.

The ND clause is similarly harmful to the NC (non-commercial-only) clause -- see the NC article for more detail.


needed

LRS: Projects Needed

WIP

Here is a list of some projects and project ideas which we, LRS, need to make in order to pursue our goals. The projects here are mostly basic things and tools that already exist in some form, but that have to be made from scratch according to LRS philosophy, i.e. in a KISS/suckless way, under public domain, in a good language (C, comun, ...) etc. This is kind of a dirty list serving some rough organization. If you have the skills and will (or know someone who does), you may take inspiration here, pick one up and make it, or contribute to some of the projects listed here. Also note that it's still possible to make multiple projects of the same type, e.g. you may still create another chess engine even though we already have one, just watch out that this is justified (it should offer something worth the extra effort).

what | difficulty | implementation by | status | comment | similar/for now use
2D image editor | mid? | | | KISS GIMP clone needed! Use LRS GUI lib. Glorified MS paint? | ped, GIMP, classic colors
2D raycasting engine (C) | mid | raycastlib drummyfish | done | |
3D modelling software | mid/hard? | | | Blender clone needed! LRS GUI lib + small3dlib, just .obj files | Blender :(
3D physics engine (C) | hard | tinyphysicsengine drummyfish | done | could use a true rigid body one too |
3D raytracing library | mid? | | had vague plans | C lib for shooting 3D rays, allows raycast., RT, pathtr., ... | POV-RAY
3D renderer (C) | mid/hard | small3dlib drummyfish | done | | TinyGL, PortableGL
3D voxel renderer (C) | mid/hard? | | | like Ken Silverman's voxlap, looks very nice | voxlap
artificial human language | hard? | | thinking bout it | need LRS lang., big problems with definitions of words tho | think Esperanto, Lojban, ...
audio/music editor | mid/hard? | | | for waveforms and/or MIDI (tracker music), can even be CLI/TUI | Audacity, LMMS, ...
Arduino/Pokitto/... computer | mid/hard? | | | until we have PD computer, we'll need a nice tiny embedded comp. |
BSP "pseudo 3D" renderer (C) | mid/hard? | | | for Anarch II? :) | Doom engine, BUILD
chat software | mid? | dumbchat drummyfish | one done | make it KISS, no encryption, no Unicode, ... just chat! | IRC
chatbot | mid? | | plans in my head | probably NOT neural net, KISS lib for good enough chatbot |
chess engine/library (C) | mid/hard | smallchesslib drummyfish | done | it's not very strong tho :/ |
compression lib/util | mid? | shitpress/comunpress, ... | one so far | |
data, datasets | easy/mid? | | can never have enough | simple format CC0 data (CSV etc.): txt dictionaries, star DB, ... | Wikidata, ...
dating/friend searching website | mid? | | | we are lonely + don't wanna use proprietary dating shit |
free universes | mid/hard? | | | need at least one fantasy and one sci-fi, for games n shit |
fiction, stories, books | mid? | | have some plans | fairytales, sci-fi from LRS society etc. |
free cultural porn website | mid? | | | libre porn + suckless site (no JS), prev. attempts failed | WMC porn, freedomporn.org
forum, chat, git/file host/mirror, ... | easy/mid? | | | for LRS community, if you have a server you could host something | email, IRC
game engine/fantasy console (tiny) | easy/mid | SAF drummyfish | done | |
game engine: point n click adventure | mid | | | |
game: Doom clone | hard | Anarch drummyfish | done | | Freedoom
game: GTA clone | hard | | | |
game: Minecraft clone | hard? | | | Minetest is bloated as fuck, also bad license and SJWs | Minetest :(
game: text adventure | easy | | | pure CLI text adventure, maybe "US citizen simulator"? :) |
game: Trackmania clone | hard | Licar drummyfish | started | |
game: Pokemon clone | hard? | | | catchable monsters game, procedurally generated ones? SAF? | Tuxemon, ...
game: fantasy RPG | hard? | | | Dream: Elder Scrolls clone, also just a dungeon crawler, ... |
games: tiny ones | easy | uTD, ... | ... | can never have enough very tiny games, SAF is ideal for this, nice learning project |
go engine/library (C or comun) | mid? | | | |
GUI library | easy/mid | | | like SAF but for "PC" GUI (mouse, sound, ...), now GUI's a mess |
image/2D data library | mid? | | | C/comun lib for bitmaps (FFT, formats, ...), needs good planning |
logic circuit library/simulator (comun) | mid/hard? | | | will be needed for PD computer |
nice polished concise encyclopedia | mid/hard? | | | nice printable UNCENSORED encyclop. (clone of Larousse Desk E.) |
neural net/other ML library | hard? | | | could use something KISS in pure C without needing python n shit | nothing
PD computer | very hard | | | needs prerequisites done first (language, logic circ. lib., ...) | Thinkpads :)
PD computer "operating system" | mid? | | | not now, will be more like Pokitto loader (see OS article) |
propaganda materials | easy | | can never have enough | wallpapers, songs, videos, translations, tutorials, games, ... |
programming language | mid/hard | comun drummyfish | done, continuing | | C, comun, FORTH, ...
search engine | mid/hard? | | | like wiby, marginalia, ... support gopher, KISS (no DB, just txt) | wiby, marginalia, ...
soundfonts | easy/mid | | working on one | nice CC0 soundfonts so we can make completely PD MIDI |
text editor (C, comun) | mid? | | | likely more will be made, need a standard KISS editor in comun | vim etc.
translation/dictionary software | mid? | | | Google translate alt., KISS, offline, even just word for word |
vector fonts | mid? | GirlsAreDumb, ... | one done | nice CC0 fonts for texts, there are too few of those | Aileron, GirlsAreDumb, ...
web (gopher, ...) browser | easy/mid? | | | like badwolf basically, but yet nicer (support gopher etc.) | badwolf, netsurf, lynx, ...
wiki | mid | LRS wiki drummyfish | done, continuing | |

netstalking

Netstalking

Netstalking means searching for obscure, hard-to-find and somehow valuable (even if only for its entertainment value) information buried in the depths of the Internet (and similar networks), for example searching for funny photos on Google Streetview (https://9-eyes.com/), unindexed deepweb sites or secret documents on FTP servers. Netstalking is relatively unknown in the English-speaking world but is pretty popular in Russian communities.

Netstalking can be divided into two categories:

Techniques of netstalking include port scanning, randomly generating web domains, using advanced search queries and different search engines, searching caches and archives and obscure networks such as darknet or gopher.


neural_network

Neural Network

{ Not my field, learning on the go, watch out for errors! ~drummyfish }

In artificial intelligence a neural network (also neural net or just NN) is a system simulating a natural biological neural network, i.e. a biological system found in living organisms, most importantly in our brain. Neural networks are just another kind of technology inspired by nature's ingenuity -- they try to mimic and simulate the naturally evolved structure of systems such as the brain in hopes of making computers learn and "think" like living beings do, and in recent years they started achieving just that, with great success. Neural networks are related to the term deep learning which basically stands for training multi-layered neural networks.

Even though neural networks absolutely aren't the only possible model used in machine learning (see e.g. Markov chains, k-NN, support vector machines, ...), they seem to be the most promising one -- nowadays neural networks are experiencing a boom and practically all AI research revolves around them; they have already made their way from research to practice: not only do they play games such as chess at a superhuman level, they already create extremely complex art and show some kind of understanding of pictures, video, audio and text on a human level (see chatGPT, stockfish, stable diffusion etc.), and even surpass humans at specialized tasks. Most importantly of course people use this for generating porn, see e.g. deepfakes. The exceptional results are already being labelled "scary" due to fears of technological singularity, "taking jobs", possible "unethical uses" etc.

Currently neural networks seem to be bringing back analog computing. As of 2023 most neural networks are still simulated with digital computers, but due to the fact that such networks are analog and parallel in nature the digital approach is inelegant (we make digital devices out of analog circuits and then try to make them behave like analog devices again) and inefficient (in terms of energy consumption). Therefore analog is making a comeback and researchers are experimenting with analog implementations, most notably electronic (classic electronic circuits) and photonic (optics-based) ones. Keep in mind that digital and analog networks are compatible; you can for example train a network digitally and then, once you've found a satisfying network, implement it as analog so that you can e.g. put it in a cellphone so that it doesn't drain too much energy. Analog networks may of course be embedded in digital devices (we don't need to go full analog).

Hardware acceleration of neural networks is being developed. Similarly to how GPUs arose to accelerate computer graphics during the 90s video game boom, similar hardware is arising for accelerating neural network computations -- these are called AI accelerators, notably e.g. Google's TPU (tensor processing unit). Currently GPUs are still mostly used for neural networks -- purely software networks are too slow. It is possible that future neural network hardware will be analog-based, as mentioned above.

Details

At the highest level a neural network is just a black box with N real number inputs and M real number outputs. For example we may have input values such as age, height, weight and blood pressure, and two output values, one saying the expected years to live and the other one saying the confidence of this prediction. Inside this box there is a network of neurons that we can train (adjust with different learning algorithms) so that it transforms the input values into output values in a correct way (i.e. here makes useful predictions).

Note that a traditional feed-forward neural network is just a network similar to e.g. a simple logic circuit, it is NOT a universal Turing complete model of computation like a Turing machine or lambda calculus because it cannot for example perform loops or take arbitrarily sized input (the number of input values is fixed for a given network). A neural network just takes the input numbers and in a certain fixed time (which theoretically doesn't depend on the input) runs them through the network to obtain the output numbers, i.e. it's best to view it as approximating a mathematical function rather than as interpreting an algorithm. Of course, a neural network itself can be (and in practice is) embedded in a more complicated system that can do all of the above, but in its simple form it's just a bunch of connections between inputs and outputs.
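
For illustration, here is what such a network-as-function might look like as a minimal C sketch -- the architecture (2 inputs, 2 hidden neurons, 1 output) and all the weights are completely made up, in practice the weights would come from training:

#include <math.h>
#include <stdio.h>

double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

/* fixed number of inputs, fixed computation time, no loops: */
double neuralNet(double in1, double in2)
{
  double h1 = sigmoid(0.5 * in1 - 1.2 * in2 + 0.1);  /* hidden neuron 1 */
  double h2 = sigmoid(-0.7 * in1 + 0.3 * in2 - 0.4); /* hidden neuron 2 */
  return sigmoid(2.0 * h1 - 1.5 * h2 + 0.2);         /* output neuron  */
}

int main(void)
{
  printf("%f\n", neuralNet(1.0, 0.5));
  return 0;
}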

TODO

History

TODO


newspeak

Newspeak

Newspeak is a modified form of natural language (e.g. English) twisted for the purpose of thought control of mass population, with propaganda and ideology built in so as to affect the thinking of people in ways desired by the rulers of society. Newspeak was first described in the story of George Orwell's 1949 book called Nineteen Eighty Four and it is now being implemented in the real world, especially since about the end of the 20th century, by the pseudoleft and capitalists (i.e. liberals). Refusing to use newspeak is labeled thought crime, wrongthink, hate speech or psychological disorder and punished either officially by government or unofficially by society-approved and state-tolerated lynching (so called cancelling). So called "hate speech" is now punishable by law in most first world countries.

Real world newspeak is characterized by banning certain keywords, for example so called slurs such as nigger, faggot or retard, as well as forcing political correctness ("gender neutral nouns", replacing "man" with "person" etc.), inventing euphemisms for harmful and oppressive concepts (copyright instead of copyrestriction, moderation instead of censorship etc.), redefining the meanings of existing terms such as racism (originally hating people of certain race, now anyone who disagrees with mainstream pseudoleft ideology), homophobia (originally hating gay people, now anyone who disagrees with pseudoleft ideology regarding LGBT) and rape (originally sexual violence, now any action taken by a man towards a woman), and inventing completely new terms such as hate speech and sanism serving for eliminating ideological opposition and suppression of free speech.

See Also


niger

Niger

Not to be confused with nigger.

Niger is an African country with a racist name.

How long before SJWs rename it LMAO?

See Also


niggercoin

Niggercoin

Niggercoin (abbreviated NGR) is a cryptocurrency invented by 4chan.


nigger

Nigger

Not to be confused with Niger.

Nigger (also nigga, niBBa, N-word or chimp) is a forbidden word that refers to a member of the black race; SJWs call the word a politically incorrect "slur". Its counterpart targeted at white people is cracker. To Harry Potter fans the word may be compared to the word Voldemort which everyone is afraid to say out of fear of being cancelled. Nigger is not to be confused with negro (which is a human race of only SOME black people, specifically those in central Africa).

Let us remind new readers that we, LRS, love all living beings, even black people <3 But we do not support political correctness.

The word is used in a number of projects, e.g.:

LMAO they're even censoring art and retroactively changing classical works of art to suit this newspeak, just like all previous oppressive regimes. E.g. Agatha Christie's book Ten Little Niggers was renamed to And Then There Were None. Are they also gonna repaint Mona Lisa when it somehow doesn't suit their liking?

LOL take a look at this website: http://niggermania.com. Also https://www.chimpout.com. Also this lol https://encyclopediadramatica.online/Niggers.


noise

Noise

Noise in general is an undesirable signal that's mixed in with useful signal and which we usually try to filter out, even though it can also be useful, especially e.g. in procedural generation. A typical example of noise is flickering or static in video and audio signals, but it can also take the form of e.g. random errors in transferred texts, irrelevant links in web search results or imperfections in measurements of economic data. In measurements we often talk about signal to noise ratio (SNR) -- the ratio that tells us how much noise there is in our data. While in engineering and scientific measurements noise is almost always something that's burdening us (e.g. cosmic microwave background, the noise left over from the Big Bang, may interfere with our radio communication), in some fields such as computer graphics (and other computer art) or cryptography we sometimes try to purposefully generate artificial noise -- e.g. in procedural generation noise is used to generate naturally looking terrain, clouds, textures etc.

 xxxx x    x  x    x  xx x xxx   x x x
 x        xxx  x x  x   xxxxxx     x
x xxxx x xxxx x  xxx  xx xxx xx    xxx
 xxx   xxx            xx  x xxx  x
x  x x  xxx x xxx  x x x xxx     x
  x xxx xx xxxxxx  xxx x xx x xx x
xxxxx x x x x x     x  xxxx xxx   x  x
xxxx    x  x  x xx  xx   xx  x    xx
      x  xxx  xxx x x   x xx  xx xxx
     xx xx  xxx x x xxx xxxxx  xxx x x
x  x xx x xxxx x xx xxx  x  x x  xx xx
 xx  xx    xxx x x xx  x    xx xx xx
   xx xx    x x x x xxx    xx   x xx x
xxx xx     xxxx x xx xx xxx x  x  x xx
xx x   xxx  x  xxx    xx x  x x     x
x x  xx x    x xxxxxx   x x  xxx
x      xxx     x x x x x x xx xxxxxxxx
x  xx x x xx x  xxxxxxxxx xxx  xxx  xx
x  xxxx xxx   x       x  x   xxx xxxxx
    xx   x  x x xxxxxx   x  xxx xxx

2D binary white noise

Artificial Noise

In computer science, especially e.g. in computer graphics or audio generation, we often want to generate artificial noise that looks like real noise, in order to be able to generate data that look like real world data. Noise may serve as a basis (first step, random initial state) for generating something or it may be used to distort or modulate other data, for example when drawing geometrical shapes we may want them to look as if written by hand in which case we want to make them imperfect, add a bit of noise to an otherwise perfect shape a computer would create. Adding noise can help make rendered pictures look more natural, it can help us model human speech that arises from noise created by our vocal cords, it can help AI train itself to filter out real life noise etc.

Normally we don't want our noise to be completely random but rather pseudorandom so that our programs stay deterministic. Imagine for example we are generating terrain heightmap based on a 2D noise -- if such function was truly random, the terrain in a certain place would be different every time we returned to that place again, but if the noise is pseudorandom (seeded by the position coordinates), the terrain generated at any given coordinate will be always the same.
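
In C such a coordinate-seeded noise function might look something like the following tiny sketch (the mixing constants are arbitrary, this is no standard algorithm, just an illustration):

#include <stdio.h>

/* deterministic 2D white noise: hashes integer coordinates into a
   pseudorandom value in range 0 to 255 */
unsigned char noise2D(unsigned int x, unsigned int y)
{
  unsigned int n = x * 2654435761u + y * 2246822519u;
  n = (n ^ (n >> 13)) * 3266489917u;
  return (n ^ (n >> 16)) & 0xff;
}

int main(void)
{
  /* asking about the same coordinates always gives the same value: */
  printf("%d %d\n", noise2D(10, 20), noise2D(10, 20));
  return 0;
}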

There are different types of noise characterized by their properties such as number of dimensions, frequencies they contain, probability distribution of the values they contain etc. Basic division of noises may be this:

           ..----..  
        .''        ''.  
    ..''              '.                      ..... 
''''                    '                 ..''     '''--..    
                         '.            .''                ''..    
                           '.       ..'                       '''     
                             ''...''                           

                             octave1
                                +
       ....                                    ...
..--'''    ''.        ..'''...              .''   ''-..    ..--''
              ''---'''        '''''---...-''           ''''

                             octave2
                                +

''-....----'''-...-'''-...-'''''''-----...-'''--..-'''-....---'''

                             octave3
                                =
           ..
        .''  '.       
      -'       ''.-'-.  
... ..                '                      -'''--...
   '                   ''                   '         .       
                         '-.              -'           -.. ..  .'   
                            -.          -'                '  ''      
                              '...-.--''                        
                                  
                           fractal noise


1D fractal noise composed of 3 octaves
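
The above depicted summing of octaves might in C look something like this minimal sketch (hashNoise and all the constants are made up just for illustration):

#include <stdio.h>

/* pseudorandom value in range 0 to 255 for integer position x */
unsigned char hashNoise(unsigned int x)
{
  x = (x ^ (x >> 13)) * 2654435761u;
  return (x ^ (x >> 16)) & 0xff;
}

/* smooth 1D value noise: linearly interpolates between hash values;
   position is in 256ths (fixed point) */
int valueNoise(unsigned int x)
{
  unsigned int i = x / 256, f = x % 256; /* integer and fractional part */
  return (hashNoise(i) * (256 - f) + hashNoise(i + 1) * f) / 256;
}

/* fractal noise: sum of 3 octaves, each with double the frequency and
   half the amplitude of the previous one */
int fractalNoise(unsigned int x)
{
  return (4 * valueNoise(x) + 2 * valueNoise(2 * x) + valueNoise(4 * x)) / 7;
}

int main(void)
{
  for (unsigned int x = 0; x < 16; ++x)
    printf("%d ", fractalNoise(x * 64));

  putchar('\n');
  return 0;
}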


nokia

Nokia

TODO

Kids today will absolutely NOT remember what Nokia did with phones back then, the richness, amazing creativity, fun. Apple? WTF, that's like absolute nothing, Nokia was absolutely dominating the shit and coming up with INSANE designs you wouldn't believe. The guys must have been smoking some shit all day, they were like what haven't we done yet? Let's put the buttons all over the shit, let's make it round, let's make it fat, let's make the phone a literal square, let's make it a handheld console. And it worked, it was freakin art, everyone loved it, each phone was unique like, even with different GUI and everything. It was beautiful.

{ IMHO the peak of Nokia was 6600. Just look it up, I've never seen such beautiful design, AND it was such a high tech device back then, unreachable for a kid like me. Just the idea of having a real computer with an operating system in a pocket, with games, even a freaking video camera to shoot videos with, that was absolutely unreal to me. I wanted it so badly, I wanted it more than anything else. I had dreams about it. I also extremely enjoyed comparing it to the biggest competition Siemens SX1, that was a great device too. It was like a clash of two bosses. ~drummyfish }

{ Also I remember I was once kicked out of a cell phone shop because they had 6600 there for customers to try it out and I was spending hours playing some LOTR games on it xD I remember it like today, it kind of marked me. ~drummyfish }


no_knowledge_proof

No Knowledge Proof

{ I found the idea here https://suricrasia.online/no-knowledge.html, the page claims it comes from a Twitter user @chordowl. ~drummyfish }

In the context of cryptography no knowledge proof (NOT to be confused with zero knowledge proof) is a mathematical proof of not knowing certain information. At the moment it seems to be kind of a fun idea and curiosity without much use, but in math many fun ideas have found a serious use later on, so who knows. { If anyone knows of a legit use, let me know. ~drummyfish }

The principle is this: suppose we have a one way (practically irreversible) hash function h (such as SHA-256). Also suppose we have all agreed on a special value y that's non-zero and has been constructed so that it most likely doesn't have any malicious properties, i.e. it is a so called nothing up my sleeve value and can be for example some sentence converted to ASCII -- more detail on this will follow later, for now simply suppose we have some value y. Now by providing someone with a number x we prove we don't know a value z such that h(z) = h(x) xor y.

How can this work? Well, imagine we knew z and we wanted to prove we didn't know it. We could compute h(z) and xor it with y, obtaining h(x), but to compute x itself we'd have to reverse the hash, i.e. compute x = h^(-1)(h(z) xor y). And from the definition of the hash we can't do this.

Another possible intuitive explanation: basically we have a system here that creates pairs of numbers -- for any two numbers x and z the system defines whether they're linked or not (via the equation h(z) = h(x) xor y). The point is that given one of these numbers it is practically impossible to derive the other one due to the difficulty of reversing the hash function, so if we show we know one number (x) we are really showing we don't know the other number (z) because we simply can't have both.

So the point of the joke is that we can't choose the information we want to prove we don't know, but that's kind of logical as if we could, we would know it. We simply pick a number x and so prove we don't know some random number z with some properties related to x and y.

A small vulnerability lies in the value y which is why it needs to be established carefully. We can't just accept someone else's x as a proof if it comes with a random-looking y because then the proof may have been forged. How? Well, suppose there's been no y established; now again suppose we know some number z and want to prove we don't know it. We compute h(z), then we randomly choose some number x, compute h(x) and now we simply derive y as h(z) xor h(x). Now if someone accepts our y as valid, we have a forged fake proof as we have both the number x, i.e. a "proof" of not knowing z, and at the same time we know z. So we have to make sure the one who is handing over the proof (number x) did not pregenerate y this way. This opens up a possibility of attacking this proof by brute forcing various y values, then taking those that somehow look "innocent" (e.g. by resembling some string or simple number sequence) and then trying to establish those as a legit nothing up my sleeve value.
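
The following toy C sketch shows the arithmetic of both the honest proof and the forgery -- the hash here is a weak made-up placeholder (in reality it would be something like SHA-256), it offers no security whatsoever and only demonstrates the equations:

#include <stdio.h>

/* toy stand-in for the one way hash h -- NOT secure at all */
unsigned int h(unsigned int x)
{
  x = (x ^ (x >> 13)) * 2654435761u;
  return x ^ (x >> 16);
}

int main(void)
{
  unsigned int y = 0x12345678; /* agreed nothing-up-my-sleeve value */

  /* honest proof: we just pick x; the claim is we don't know any z
     with h(z) == h(x) xor y: */
  unsigned int x = 123;
  printf("proof: x = %u, h(x) xor y = %u\n", x, h(x) ^ y);

  /* forgery (possible when y is NOT fixed beforehand): we know z,
     pick some x and derive a y that links the two: */
  unsigned int z = 666, x2 = 123, forgedY = h(z) ^ h(x2);
  printf("forged y = %u links x = %u and z = %u\n", forgedY, x2, z);
  return 0;
}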

When thinking about it, it's not really that curious we can construct a no knowledge proof, in math we prove that we can't know something all the time (e.g. undecidable problems).


nord_vpn

NordVPN

NordVPN is a proprietary predatory scam service that steals personal data while trying to market itself as a "VPN for security and privacy"; it is useless at best and highly harmful to society at worst. It's a business similar to that of e.g. antiviruses, it builds on fear culture, privacy hysteria and lack of technological education, abusing people who have little to no knowledge of technology, stealing not just their money, but also their data, computing power etc. NordVPN furthermore utilizes all the unethical capitalist practices to make society as bad as possible, notably aggressive advertising, brainwashing and promotion of fear culture -- YouTube is infamous for having NordVPN propaganda inserted into every single video.


normalization

Normalization

In the context of mathematics normalization is a term that can mean slightly different things but generally it either refers to adjusting a set of values to some desired range by multiplying or dividing each of the values by some predetermined number, or to converting some data or expression into a unified format. The idea is to "tame" possibly very wildly differing values that we can encounter "in the wild" into something more "normal" that we can better work with. The term also has another meaning in the context of society and its culture. The following are some specific meanings of the term depending on context:
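
As a trivial example of the first mentioned meaning, here is a minimal C sketch that normalizes a set of values to the range <0,1> by dividing each value by the maximum:

#include <stdio.h>

void normalize(double *values, int count)
{
  double max = 0;

  for (int i = 0; i < count; ++i)
    if (values[i] > max)
      max = values[i];

  if (max != 0) /* don't divide by zero */
    for (int i = 0; i < count; ++i)
      values[i] /= max;
}

int main(void)
{
  double data[4] = {1, 4, 2, 8};
  normalize(data, 4);

  for (int i = 0; i < 4; ++i)
    printf("%f ", data[i]); /* prints 0.125 0.5 0.25 1.0 */

  putchar('\n');
  return 0;
}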


npc

NPC

NPC (non-player character) is a character in a video game that's not controlled by a player but rather by AI. NPC is also used as a term for people in real life that exhibit low intelligence and just behave in accordance with the system.


often_confused

Often Confused Terms

There are many terms that are very similar and can many times be used interchangeably. This isn't wrong per se, a slight difference may be insignificant in certain contexts. However it's good to know the differences for the sake of those cases where they matter. The following list tries to document some of the often confused/similar terms.


old

Old

Nowadays old correlates with better. For comparison of new and old see modern tech.


one

One

One (1, also a unit) is the first positive whole number, signifying the existence of a single unique thing we're counting.

Some facts about this number include:


oop

Object-Oriented Programming

"I invented the term 'object oriented' and C++ was not what I had in mind" --Alan Kay, inventor of OOP

Object-oriented programming (OOP, also object-obsessed programming and objectfuscated programming) is a programming paradigm that tries to model reality as a collection of abstract objects that communicate with each other and obey some specific rules. While the idea itself isn't bad and can be useful in certain cases, OOP has become extremely overused, extremely badly implemented and downright forced in programming languages which apply this abstraction to every single program and concept, creating anti-patterns, unnecessary issues and of course bloat. We therefore see OOP as a cancer of software development.

Ugly examples of OOP gone bad include Java and C++ (which at least doesn't force it). Other languages such as Python and Javascript include OOP but have lightened it up a bit and at least allow you to avoid using it.

You should learn OOP but only to see why it's bad (and to actually understand 99% of code written nowadays).

Principles

Bear in mind that OOP doesn't have a single, crystal clear definition. It takes many forms and mutations depending on language and it is practically always combined with other paradigms such as the imperative paradigm, so things may be fuzzy.

Generally OOP programs solve problems by having objects that communicate with each other. Every object is specialized to do some thing, e.g. one handles drawing text, another one handles caching, another one handles rendering of pictures etc. Every object has its data (e.g. a human object has weight, race etc.) and methods (object's own functions, e.g. human may provide methods getHeight, drinkBeer or petCat). Objects may send messages to each other: e.g. a human object sends a message to another human object to get his name (in practice this means the first object calls a method of the other object just like we call functions, e.g.: human2.getName()).

Now many OO languages use so called class OOP. In these we define object classes, similarly to defining data types. A class is a "template" for an object, it defines methods and types of data to hold. Any object we create is then based on some class (e.g. we create the objects alice and bob of class Human, just as normally we create a variable x of type int). We say an object is an instance of a class, i.e. an object is a real manifestation of what a class describes, with specific data etc.

The more "lightweight" type of OOP is called classless OOP which is usually based on having so called prototype objects instead of classes. In these languages we can simply create objects without classes and then assign them properties and methods dynamically at runtime. Here instead of creating a Human class we rather create a prototype object that serves as a template for other objects. To create specific humans we clone the prototype human and modify the clone.

OOP furthermore comes with some basic principles such as:

Why It's Shit

So Which Paradigm To Use Instead Of OOP?

After many people realized OOP is kind of shit, there has been a boom of "OOP alternatives" such as functional, traits, agent oriented programming, all kinds of "lightweight"/optional OOP etc etc. Which one to use?

In short: NONE, by default use the imperative paradigm (also called "procedural"). Remember this isn't to say you shouldn't ever apply a different paradigm, but imperative should be the default, most prevalent and suitable one to use in solving most problems. There is nothing new to invent or "beat" OOP.

But why imperative? Why can't we simply improve OOP or come up with something ultra genius to replace it with? Why do we say OOP is bad because it's forced and now we are forcing the imperative paradigm? The answer is that the imperative paradigm is special because it is how computers actually work, it is not made up but rather it's the natural low level paradigm with minimum abstraction that reflects the underlying nature of computers. You may say this is just bullshit arbitrary rationalization but no, these properties make the imperative paradigm special among all other paradigms because:

Once computers start fundamentally working on a different paradigm, e.g. functional -- which BTW might happen with new types of computers such as quantum ones -- we may switch to that paradigm as the default, but until then imperative is the way to go.

History

TODO


openai

"Open"AI

OpenAI ("open" as in "not open") is a hugely harmful proprietary/closed-source for profit corporation researching and creating proprietary AI, known especially for practicing unethical capitalist deception techniques such as heavy openwashing (using the word open to make people think it is has something to do with "open source") and non-profit-washing (hiding behind a similarly named non-profit so that it seems the corporation is actually non-profit).


openarena

OpenArena

OpenArena (OA) is a first person pew pew arena shooter game, a free as in freedom clone of the famous game Quake 3. It runs on GNU/Linux, Winshit, BSD and other systems, it is quite light on system resources but does require GPU acceleration (no software rendering). The Quake 3 engine (Id tech 3) has retroactively been made free software: OpenArena forked it and additionally replaced the proprietary Quake assets, such as 3D models, sounds and maps, with community-created Japanese/nerd/waifu-themed free culture assets, so as to create a completely free game. OpenArena plays almost exactly the same as Quake 3, it basically just looks different and has different maps. It has an official wiki at https://openarena.fandom.com and a forum at http://www.openarena.ws/board/. OpenArena has also been used as a research tool.

As of 2023 most players you encounter in OA are cheating (many americans play it), but if you don't mind that, it's a pretty comfy game.

As of version 0.8.8 there are 45 maps and 12 game modes (deathmatch, team deathmatch, capture the flag, last man standing, ...).

A bit of fun: you can redirect the chat to text to speech and let it be read aloud e.g. like this:

openarena 2>&1 | grep --line-buffered ".*^7: ^2.*" | sed -u "s/.*: ^2\(.*\)/\1/g" | sed -u "s/\^.//g" | espeak

Character art exhibits what SJWs would call high sexism. This is great, it's the good old 90s art style. The art is very nice and professional looking (no programmer art). The character Angelyss for example is basically naked with just bikini strings covering her nipples. Other characters like Merman and Penguin (a typical "Linux user") are pretty funny. Ratmod has very nice taunts that would definitely be labeled offensive to gays and other minorities nowadays. The community is also pretty nice, free speech is allowed in chat, no codes of conduct anywhere, boomers thankfully don't buy into such bullshit. Very refreshing in the politically correct era.

OpenArena is similar to e.g. Xonotic -- another free arena shooter -- but is a bit simpler and oldschool, both in graphics, features and gameplay. It has fewer weapons, game modes and options. However there exist additional modifications, most notably the Ratmod (or RatArena) which makes it a bit more "advanced" (adds game modes, projectiles go through portals, improved prediction code etc.). As of 2022 an asset reboot in a more Anime style, called OA3, is planned and it seems to be aiming in the right direction -- instead of making a "modern HD game" (and so basically just remaking Xonotic) they specifically set out to create a year 2000 style game (i.e. keep the models low poly etc.). It could also help distinguish OpenArena from Quake more and so make it legally safer (e.g. in terms of trade dress).

{ I've been casually playing OA for a while alongside Xonotic. I love both games. OA is more oldschool, boomer, straightforward and KISS, feels slower in terms of movement and combat but DM matches are much quicker, fragging is very rapid. ~drummyfish }

The project was established on August 19 2005, one day after the Quake 3 engine was freed.

See Also


open_console

Open Console

{ Open consoles are how I got into suckless programming, they taught me about the low-level, optimizations and how to actually program efficiently on very limited hardware. I recommend you grab one of these. ~drummyfish }

Open consoles (also indie handhelds etc.) are tiny GameBoy-like gaming consoles mostly powered by free software and free hardware, which have relatively recently (some time after 2015) seen a small boom. Examples include Arduboy, Pokitto or Gamebuino. These are NOT to be confused with the Raspberry Pi (and similar) handhelds that run GameBoy/PS1/DOS emulators but rather custom, mostly FOSS platforms running mostly their own community made homebrew games.

In summary, open consoles are:

Recommended consoles for starters are Arduboy and Pokitto which are not only very well designed, but most importantly have actual friendly active communities.

These nice little toys are great because they are anti-modern, simple, out of the toxic mainstream, like the oldschool bullshit-free computers. This supports (and by the low specs kind of "forces") suckless programming and brings the programmer the joy of programming (no headaches of resizable windows, multithreading etc., just plain programming of simple things with direct access to hardware). They offer an alternative ISA, a non-x86 platform without botnet and bloat usable for any purpose, not just games. Besides that, this hobby teaches low level, efficiency-focused programming skills.

Programming

Open consoles can typically be programmed without proprietary software (though officially they may promote something involving proprietary software), GNU/Linux mostly works just fine (sometimes it requires a bit of extra work but not much). Most of the consoles are Arduino-based so the Arduino IDE is the official development tool with C++ as a language (C being thankfully an option as well). The IDE is "open-source" but also bloat; thankfully a CLI development workflow can be set up without major issues (Arduino comes with CLI tools and for other platforms a gcc cross-compiler can be used) so comfy programming with vim is nicely possible.

If normies can do it, you can do it too.

Some consoles (e.g. Arduboy, Pokitto and Gamebuino META) have their own emulators which make the development much easier... or rather bearable. Without an emulator you're forced to constantly reupload the program to the real hardware which is a pain, so you want to either use a nice LRS library such as SAF or write your game to be platform-independent and just make it run on your development PC as well as on the console (just abstract the I/O and use SDL for the PC and the console's library for the console -- see how Anarch does it).
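
Such an I/O abstraction might look something like the following minimal C sketch (the function names are made up; a dummy terminal backend stands in here for SDL or the console's library):

#include <stdio.h>

/* frontend functions -- each platform supplies its own implementation
   (SDL on PC, the console's library on the console); here a dummy
   terminal implementation serves as an example backend: */

void drawPixel(int x, int y, char color)
{
  printf("pixel %d %d: %c\n", x, y, color);
}

int keyPressed(int key)
{
  return 0; /* dummy: no input on this "platform" */
}

/* the game code below stays identical on every platform: */
int main(void)
{
  if (!keyPressed(0))
    drawPixel(10, 5, '#');

  return 0;
}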

Open Console List

Some notable open consoles (which fit the definition at least loosely) are listed here. Symbol meaning:

name | CPU | RAM (K) | ROM (K) | display | notes
Arduboy | 8b 16 MHz | 2.5 | 32 | 64x32 1b | * A C +, tiny
Gamebuino | 8b 16 MHz | 2 | 32 | 84x48 1b | + A -, SD
Pokitto | 32b 48 MHz | 36 | 256 | 220x176 | * C +, ext. hats, SD
ESPboy | 32b 160 MHz | 80 | 4000 | 128x128 | A
GB META | 32b 48 MHz | 32 | 256 | 168x120 | A + -, SD
Nibble | 32b 160 MHz | 80 | 4000 | 128x128 | A, AAA bat.
UzeBox | | | | |
Tiny Arcade | | | | |
Thumby | | | | |

TODO: Retro Game Tiny, Adafruit PyGamer, ... see also https://github.com/ESPboy-edu/awesome-indie-handhelds

See Also


open_source

Open $ource

"Micro$oft <3 open $ource"

Open source (OS) is a capitalist movement/brand forked from the free software movement; it is advocating "openness", sharing and collaboration in software and hardware development and though legally it is mostly identical to free (as in freedom) software, in practice and in spirit it is very different by abandoning the goal of freedom and ethics in favor of business (to which ethics is an obstacle), due to which we see open source as inherently evil and recommend following the free software way instead. Richard Stallman, the founder of free software, distances himself from the open source movement. Fascist organizations such as Microsoft and Google, on the other hand, embrace open source (while refraining from using the term free software) and slowly shape it towards their goals. The term FOSS is sometimes used to refer to both free software and open source without expressing any preference.

Open source is unfortunately (but unsurprisingly) becoming more prevalent than free software, as it better serves capitalism and abuse of people, and its followers are more and more hostile towards the free software movement. This is very dangerous, ethics and focus on actual user freedom is replaced by shallow legal definitions that can be bypassed, e.g. by capitalist software and bloat monopoly. In a way open source is capitalism reshaping free software so as to weaken it and eventually make its principles of freedom ineffective. Open source tries to shift the goal posts: more and more it offers only an illusion of some kind of ethics and/or freedom, it pushes towards mere partial openness ("open source" for proprietary platforms), towards high complexity, inclusion of unethical business-centered features (autoupdates, DRM, ...), high interdependency, difficulty of utilizing the rights granted by the license, exclusion of developers with "incorrect" political opinions or bad brand image etc. In practice open source has become something akin to a mere brand which is stuck onto a piece of software to give users with little insight a feeling they're buying into something good -- this is called openwashing. This claim is greatly supported by the fact that corporations such as Microsoft and Google widely embrace open source ("Microsoft <3 open source", the infamous GitHub acquisition etc.).

One great difference of open source with respect to free software is that open source doesn't mind proprietary dependencies: Windows-only programs or games in proprietary engines such as Unity are happily called open source -- this would be impossible in the context of free software because as Richard Stallman says software can only be free if it is free as a whole, it takes a single proprietary line of code to allow abuse of the user. The "open source" communities nowadays absolutely don't care a bit about freedom or ethics, many "open source" proponents even react aggressively to bringing the idea of ethics up. "Open source" communities use locked, abusive proprietary platforms such as Discord and Micro$oft's GitHub to create software and collaborate -- users without a Discord and/or GitHub account often aren't even offered a way to contribute, report bugs or ask for support.

The open source definition is maintained by the Open Source Initiative (OSI) -- they define what exactly classifies as open source and which licenses are compatible with it. These licenses are mostly the same as those approved by the FSF (even though not 100%). The open source definition is a bit more complex than that of free software, in a nutshell it goes along these lines:

  1. The license has to allow free redistribution of the software without any fees.
  2. Source code must be freely available, without any obfuscation.
  3. Modification of the software must be allowed as well as redistribution of these modified versions under the same terms as the original.
  4. Direct modification may be forbidden only if patches are allowed.
  5. The license must not discriminate against people, everyone has to be given the same rights.
  6. The license must not discriminate against specific uses, i.e. use for any purpose must be allowed.
  7. The license applies automatically to everyone who receives the software with the license.
  8. The license must apply generally, it cannot be e.g. limited to the case when the software is part of some larger package.
  9. The license must not restrict other software, i.e. it cannot for example be forbidden to run the software alongside some other piece of software.
  10. The license must be technology neutral, i.e. it cannot for example limit the software to certain platform or API.

See Also


operating_system

Operating System

Operating System (OS) is normally a hugely complex program that's typically installed before any other program and serves as a platform for running other programs as well as managing resources (CPU usage, RAM, files, network, ...) and offering services and interfaces for humans and programs.

There is a nice CC0 wiki for OS development at https://wiki.osdev.org/.

From a programmer's point of view a serious OS is one of the most difficult pieces of software one can set out to develop. The task involves an enormous amount of low-level programming, development of own tools from scratch and requires deep and detailed knowledge of all components of a computer, of established standards as well as many theoretical subjects such as compiler design.

Which OS is the best? Currently there seems to be no good operating system in existence, all are just too far away from LRS ideals, however there are quite a few relatively usable ones, mostly Unix like systems. For example OpenBSD seems to be one of them, however it is proprietary (yes, it contains some code without license) and too obsessed with MUH SECURITY. HyperbolaBSD at least tries to address the freedom issue of OpenBSD. Devuan is pretty usable, just werks and is alright in not being an absolute apeshit of consoomerist bloat. FreeDOS seemed nice too: though it's not Unix like, it is much more KISS than Unices.

An OS, as software, consists of two main parts:

the kernel -- the core of the system which directly manages hardware and provides the most basic services
the userland -- everything that runs on top of the kernel (shells, utilities, libraries, ...)

For example in GNU/Linux, Linux is the kernel and GNU is the userland.

Attributes/Features

TODO

Notable Operating Systems

Below are some of the most notable OSes.

LRS Operating System

What would an operating system designed by LRS principles look like? There may be many different ways to approach this challenge. Multiple operating systems (or multiple versions of the same system) may be made, such as an "extremely KISS bare minimum featureless system", a "more advanced but still KISS system", a "special-purpose safe system for critical uses" etc. The following is a discussion of ideas we might employ in designing such systems.

The basic idea for a universal LRS operating system is to be something akin to DOS plus a bit of Unix philosophy. The system would probably seem primitive by "modern standards", but in a good society it would be sufficient as a universal operating system (i.e. not necessarily suitable for ALL purposes). The OS would in fact be more of a program loader (like e.g. the one seen in Pokitto), running with the same privileges as other programs -- its purpose would NOT be to provide a safe environment for programs to run in, but rather to allow switching between different programs on a computer without having to reupload the programs externally, and to provide basic tools for managing the computer itself (such as browsing files, testing hardware etc.).

Let's keep in mind that true LRS computers would be different from the current capitalist ones -- an operating system would only be optional, programs would be able to run on bare metal as well as under an OS, and operating systems would be as much compatible as possible. By this an OS might be seen as more of an extra tool rather than a platform.

The system might likely lack features one would nowadays call essential for an OS, such as multiple user support (no need if everyone has his own computer and security is not an issue), virtual memory, complex GUI etc. There might even be no multitasking; a possibility to make a multitasking OS exists, but let's keep in mind that even such things as programs interacting via pipes may be somewhat implemented without it (using temporary buffer files into which one program's output is stored before running the next program).

The universal OS would assume well behaved programs, as programs would likely be given full control over the computer when run -- this would greatly simplify the system and also computing in general. Doing so would be possible thanks to the non-existence of malicious programs (as in a good society there would be no need for them) and elimination of update culture. Users would only install a few programs they choose carefully -- programs that have been greatly tested and don't need to be updated.

On user interface: the basic interaction mode would of course be the text interface. Programs would have the option to switch to a graphical mode in which they would be able to draw to screen. There would be no such bloat as window managers or desktop environments -- these are capitalist inventions that aren't really needed as users practically always interact with just one program at a time. Even in a multitasking system only one program would be drawing to the screen at a time, with the user having the option to "alt-tab" between them. This would also simplify programs greatly as they wouldn't have to handle bullshit such as dynamically resizing and rearranging their window content. If someone REALLY wanted to have two programs at the screen at the same time, something akin to a "screen splitter" might be made to create two virtual screens on one physical screen.

A bit more technical details: the universal OS could simply be a program that gets executed after computer restart. This program would offer a shell (textual, graphical, ...) that would allow inspecting the computer, configuring it, and mainly running other programs. Once the user chose to run some program, the OS would load the program to memory and jump to executing it. To get back to the OS the program could hand back control to the OS, or the computer could simply be restarted. If the program crashes, the computer simply restarts back to OS. The OS could also contain functions, kind of an API, as part of its code that "userland" programs might call, e.g. for drawing to the screen, reading input etc. (just as e.g. a BIOS usually offers similar functions). Note that shell scripting that executes other programs could still be possible, e.g. by remembering the state of script execution in a file on disc. Same thing could be used for communicating between the OS and programs (e.g. clipboard), i.e. there would still be no need for multitasking.
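
The "OS as a program loader" idea might be sketched in C like this -- a self-contained toy simulation in which "programs" are just functions (on real hardware the OS would of course load a binary into memory and jump to it):

#include <stdio.h>

void programHello(void) { printf("hello\n"); }
void programCount(void) { for (int i = 0; i < 3; ++i) printf("%d\n", i); }

typedef void (*Program)(void);

int main(void)
{
  Program programs[] = {programHello, programCount};
  const char *names[] = {"hello", "count"};

  for (;;) /* the "OS": a shell that lists and runs programs */
  {
    for (int i = 0; i < 2; ++i)
      printf("%d: %s\n", i, names[i]);

    int choice;

    if (scanf("%d", &choice) != 1 || choice < 0 || choice > 1)
      break; /* quit the "OS" */

    programs[choice](); /* hand control over to the chosen program */
  }

  return 0;
}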

TODO: more


optimization

Optimization

Optimization means making a program more efficient in terms of consumption of some computing resource or by any similar metric, commonly aiming for greater execution speed or lower memory usage (but also e.g. lower power consumption, lower network usage etc.) while preserving how the program functions externally. Unlike refactoring, which aims primarily for a better readability of source code, optimization changes the inner behavior of the executed program to a more optimal one.

General Tips'N'Tricks

These are mainly for C, but may be usable in other languages as well.

When To Actually Optimize?

Nubs often ask this and it can indeed be a very nontrivial question. Generally, fine-grained, sophisticated optimization should come as one of the last steps in development, when you actually have a working thing. These are optimizations requiring significant energy/time to implement -- you don't want to spend resources on this at the stage when they may well be dropped in the end, or when they won't matter because they'll be outside the bottleneck. However there are two "exceptions".

The highest-level optimization is done as part of the initial design of the program, before any line of code gets written. This includes the choice of data structures and mathematical models you're going to be using, the very foundation around which you'll be building your castle. This happens in your head at the time you're forming an idea for a program, e.g. you're choosing between server-client or P2P, monolithic or micro kernel, raytraced or rasterized graphics etc. These choices greatly affect the performance of your program but can hardly be changed once the program is completed, so they need to be made beforehand. This requires wide knowledge and experience as you work by intuition.

Another kind of optimization done during development is just automatically writing good code, i.e. being familiar with specific patterns and using them without much thought. For example if you're computing some value inside a loop and this value doesn't change between iterations, you just automatically put computation of that value before the loop. Without this you'd simply end up with a shitty code that would have to be rewritten line by line at the end. Yes, compilers can often do this simple kind of optimization for you, but you don't want to rely on it.
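
For example (a trivial C sketch of the loop case just described):

#include <stdio.h>

/* the loop-invariant value a * a + 2 * b is computed once before the
   loop instead of in every iteration */
int weightedSum(int *array, int count, int a, int b)
{
  int invariant = a * a + 2 * b; /* hoisted out of the loop */
  int sum = 0;

  for (int i = 0; i < count; ++i)
    sum += array[i] * invariant; /* no recomputation here */

  return sum;
}

int main(void)
{
  int data[3] = {1, 2, 3};
  printf("%d\n", weightedSum(data, 3, 2, 1)); /* prints 36 */
  return 0;
}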

See Also


os

OS

OS can stand for either operating system or open source.


palette

Palette

In computer graphics palette is a set of possible colors that can be displayed, the term usually refers to a selected smaller subset of all colors that can in theory be displayed (large sets of colors tend to be called color spaces rather than palettes). Nowadays mainstream computers are powerful enough to work with over 16 million 24bit RGB colors (so called True Color) practically without limitations so the use of palettes is no longer such a huge thing, but with resource-limited machines, such as embedded devices and older computers, the use of palettes is sometimes necessary or at least offers many advantages (e.g. saving a lot of memory). Nevertheless palettes find uses even in "modern" graphics, e.g. in the design of image formats that save space. Palettes are also greatly important in pixel art as an artistic choice.

Palettes usually contain a few to few thousand colors and the number is normally a power of 2, i.e. we see palettes with number of colors being 8, 16, 256, 2048, etc. -- this has advantages such as efficiency (fully utilizing color indices, keeping memory aligned etc.). Palettes can be general purpose or specialized (for example some image formats such as GIF create a special palette for every individual image so as to best preserve its colors). Palettes can also be explicitly stored (the palette colors are stored somewhere in the memory) or implicit (the color can somehow be derived from its index, e.g. the 565 palette).

Palettes are related to screen modes -- systems that work with palettes will usually offer to set a specific screen mode that defines parameters such as screen resolution and number of colors we can use, i.e. the number of colors of our palette (we can normally set the colors in a palette). Modes that make use of palettes are called indexed because each pixel in memory is stored as an index to the palette (for example if we have a palette {red, yellow, white}, a pixel value 0 will stand for red, 1 for yellow and 2 for white) -- the palette serves as a color look-up table (CLUT). Non-indexed modes on the other hand store the color directly (i.e. there will typically be a direct RGB value stored for each pixel). We can see that an indexed mode (i.e. choosing to use a palette) will save a lot of memory for the framebuffer (VRAM) thanks to reducing the number of bits per pixel: e.g. when using an 8 bit palette, storing each pixel (index) will take up 1 byte (8 bits, 256 colors) while in a non-indexed 24 bit RGB mode (over 16 million colors) each pixel will take 3 bytes (24 bits), i.e. three times as much. The same goes for using bigger palettes: e.g. using a 16 bit palette (65536 colors) will take four times as much memory for storing pixels as a 4 bit palette (16 colors). Note that even in indexed modes we may sometimes be able to draw pixels of arbitrary color with so called direct writes to the display, i.e. without the color being stored in framebuffer. With palettes we may see the use of dithering to achieve the illusion of mixing colors.
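
A minimal C sketch of the indexed approach (sizes and names made up): the framebuffer holds one byte indices while the palette translates them to full 24 bit colors:

#include <stdio.h>

unsigned char framebuffer[64 * 32]; /* 1 byte per pixel (the index) */
unsigned int palette[256];          /* 256 full 24 bit RGB colors   */

unsigned int getPixelRGB(int x, int y)
{
  return palette[framebuffer[y * 64 + x]]; /* color look-up table */
}

int main(void)
{
  palette[1] = 0xff0000; /* palette color 1 = red             */
  framebuffer[10] = 1;   /* the pixel stores just the index 1 */
  printf("%x\n", getPixelRGB(10, 0));
  return 0;
}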

Using palettes has also more advantages, for example we can cycle the palette colors or quickly switch it for another palette and so e.g. increase contrast or apply some color effect (this trick was used e.g. in Doom). Palettes can be constructed in clever ways (for example in Anarch) so that it is e.g. easy to make a color brighter or darker by simply incrementing or decrementing its index (while increasing brightness of a three-component RGB value is complex and slow) -- as we generally process big numbers of pixels this can lead to tremendous speed ups. Having fewer colors also makes them easier to compare and so easily implement things such as pixel art upscaling (huge number of colors generally forces us to compare pixels with some amount of bias which is slower).

Can palettes be copyrighted? We hope not, that would indeed be pretty fucked up, however trademarks already allowed corporations to own even single colors (Milka chocolate's lilac color), and some websites for sharing palettes claim that a picture of a palette can be copyrighted as some kind of a "digital painting", even though they acknowledge a set of colors as such can't be copyrighted. So for maximum safety try to create your own palette (and share it under CC0 to spare others the same pain) as a first option, as a second option use a palette that's explicitly shared under free terms (CC0 is probably best), and if you absolutely have to reuse someone else's "proprietary" palette, at least reorder its colors and possibly slightly change the RGB values to make it a bit distinct.

Examples

Example of a basic 8 color palette may be (the color notation is in hexadecimal #rrggbb format):

#000000 #808080 #ffffff #ff0000 #00ff00 #0000ff #ffff00 #00ffff
black   gray    white   red     green   blue    yellow  cyan

The following is a general purpose 256 color palette made by drummyfish and used in Anarch. It is based on the HSV model: it divides colors into 4 saturations, 10 or 11 hues and 8 levels of value ("brightness") which can easily be changed by incrementing/decrementing the color index (which in Anarch was exploited for lightening up and darkening textures depending on distance).

#000000 #242424 #494949 #6d6d6d #929292 #b6b6b6 #dbdbdb #ffffff
#201515 #402a2a #604040 #805555 #a06a6a #c08080 #e09595 #ffaaaa
#201b15 #40372a #605240 #806e55 #a08a6a #c0a580 #e0c195 #ffdcaa
#1d2015 #3b402a #596040 #778055 #95a06a #b3c080 #d1e095 #edffaa
#172015 #2f402a #466040 #5e8055 #75a06a #8dc080 #a5e095 #bcffaa
#152019 #2a4033 #40604c #558066 #6aa080 #80c099 #95e0b3 #aaffcc
#15201f #2a403f #40605f #55807f #6aa09f #80c0bf #95e0df #aafffe
#151920 #2a3340 #404c60 #556680 #6a80a0 #8099c0 #95b3e0 #aaccff
#171520 #2e2a40 #464060 #5d5580 #746aa0 #8c80c0 #a395e0 #b9aaff
#1d1520 #3b2a40 #594060 #775580 #956aa0 #b380c0 #d195e0 #eeaaff
#20151b #402a37 #604053 #80556f #a06a8b #c080a7 #e095c3 #ffaadd
#200a0a #401515 #602020 #802a2a #a03535 #c04040 #e04a4a #ff5555
#20170a #402e15 #604520 #805c2a #a07435 #c08b40 #e0a24a #ffb955
#1b200a #374015 #536020 #6e802a #8aa035 #a6c040 #c2e04a #dcff55
#0f200a #1e4015 #2d6020 #3c802a #4ba035 #5bc040 #6ae04a #79ff55
#0a2013 #154026 #206039 #2a804c #35a060 #40c073 #4ae086 #55ff99
#0a201f #15403f #20605f #2a807e #35a09e #40c0be #4ae0de #55fffd
#0a1320 #152640 #203960 #2a4c80 #3560a0 #4073c0 #4a86e0 #5599ff
#0e0a20 #1d1540 #2c2060 #3a2a80 #4935a0 #5840c0 #664ae0 #7455ff
#1b0a20 #371540 #532060 #6e2a80 #8a35a0 #a640c0 #c24ae0 #dd55ff
#200a17 #40152f #602047 #802a5e #a03576 #c0408e #e04aa6 #ff55bc
#200000 #400000 #600000 #800000 #a00000 #c00000 #e00000 #ff0000
#201100 #402200 #603300 #804500 #a05600 #c06700 #e07900 #ff8a00
#1d2000 #3a4000 #586000 #758000 #92a000 #b0c000 #cde000 #eaff00
#0c2000 #184000 #246000 #308000 #3ca000 #48c000 #54e000 #60ff00
#200500 #400a00 #600f00 #801500 #a01a00 #c01f00 #e02400 #ff2900
#201600 #402d00 #604300 #805a00 #a07000 #c08700 #e09e00 #ffb400
#172000 #2e4000 #466000 #5d8000 #74a000 #8cc000 #a3e000 #baff00
#062000 #0c4000 #126000 #188000 #1ea000 #24c000 #2ae000 #30ff00
#0b0020 #160040 #210060 #2d0080 #3800a0 #4300c0 #4f00e0 #5900ff
#1c0020 #390040 #550060 #720080 #8f00a0 #ab00c0 #c800e0 #e400ff
#200012 #400024 #600036 #800048 #a0005a #c0006c #e0007e #ff008f
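
With the above palette, lightening or darkening a color is then just a clamped increment/decrement of the low 3 bits of the index (a rough C sketch similar in spirit to what Anarch does, not its exact code):

#include <stdio.h>

unsigned char addBrightness(unsigned char color, int amount)
{
  int value = (color & 0x07) + amount; /* brightness within the row of 8 */

  if (value < 0) value = 0; /* clamp to the same hue/saturation row */
  if (value > 7) value = 7;

  return (color & 0xf8) | value; /* same hue/saturation, new value */
}

int main(void)
{
  /* e.g. color 18 (3rd color of the 3rd row) darkened by 1 gives 17: */
  printf("%d\n", addBrightness(18, -1));
  return 0;
}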

Other common palettes include RGB332 (256 colors, one byte represents RGB with 3, 3 and 2 bits for R, G and B) and RGB565 (65536 colors, two bytes represent RGB with 5, 6 and 5 bits for R, G and B).

See Also


patent

Patent

Patent is a form of extreme "intellectual property" that allows owning useful ideas, oppressing and bullying people and preventing others from using ideas -- software patents are especially harmful to society and technology. Patents are currently along with copyright likely the most harmful kind of "intellectual property" in technology -- even though copyright is probably a more pressing issue at the moment because it is the most common form of IP oppression, patents can be just as harmful in individual cases. Of course we're not even talking about the whole gigantic bullshit bureaucracy and business connected to patents that just wastes man centuries of effort. Examples of patents in software are minigames on loading screens in games (this patent has already expired), the shadow volume algorithm for rendering shadows, the mp3 format (also expired), various compression techniques, even such broad ideas as public key encryption (yes, the whole idea that's the basis of cryptography was patented and not freely usable until the patent expired in 1997) etc.

There is an article on software patents at https://www.gnu.org/philosophy/software-patents.en.html. There is even a site and initiative dedicated to ending software patents at https://wiki.endsoftwarepatents.org/wiki/Main_Page.

Patents are kind of similar to but also very different from copyright (Richard Stallman stressed the differences and says it is dangerous to think of copyright and patents as similar): while copyright applies to art and is granted automatically, patents apply to ideas (which should ideally be new inventions but in practice can be just any trivially stupid ideas), have to be registered and are kept recorded somewhere. Patents also last a shorter time than copyright (generally 20 years as opposed to copyright's lifetime plus 70 years) and are territorial, i.e. not world-wide. These facts make patents a bit less disastrous than copyright, however they still cause a great deal of damage -- not only do they prevent technological progress (a new idea such as a new efficient algorithm is simply prohibited from being used by anyone but its "owner" and those to whom the owner sells a license), they also allow so called patent trolling (patent scams) -- patent trolling takes advantage of the fact that it is practically impossible to safely check if some idea is not patented, i.e. safe to use. There exist troll companies whose sole business is to register trivial patents and then sue random people who unknowingly implement this idea in their projects (there is e.g. a famous video about how this happened to the developer of X-plane, trolled by the Uniloc company that had patented the idea of using a "play store" to distribute programs) -- the companies often bully developers into an off-court settlement paying a lower fee, but this includes a contract that prevents the affected developers from talking about it.

Granting and checking patents is also becoming progressively more difficult, expensive and sometimes basically impossible, as any newly filed patent has to be checked for how "innovative" it is. This means someone has to literally go through all ideas ever invented in computer science (impossible even for the biggest brain on the planet) and check if the newly submitted idea is really new -- given that computer science progresses at lightning speed, every day it is becoming more and more difficult to check patents. As time for checking a patent is limited, the result is many false positives, errors and grants of patents on trivial or non-innovative ideas, which has disastrous consequences. And of course, we're not even talking about corruption -- patents are highly lucrative and it would be naive to believe there are no cases of someone just buying a patent grant.

Many (probably most) free software proponents, and just many programmers in general, including for example Richard Stallman, John Carmack or Donald Knuth, have harshly criticized the existence of software patents. Richard Stallman himself has been warning of the dangers and has likened the world of patents to a minefield: when you're programming, you have no idea whether an idea you get and implement in your program isn't in fact "owned" by someone, so programming itself poses the risk of stepping on mines (patents).

As a good free software developer you should use licenses/waivers to get rid of patents! Similarly to copyright, your software should come with a license or waiver that ensures patents won't prevent others from exercising the four essential freedoms, i.e. there should be a legal document that says you grant others rights to any of your patented ideas hiding in your source code so that others are safe from you suing them if they reuse your potentially patent-infected code (still, there may unfortunately be hidden patents held by third parties, which cannot be addressed). Some licenses, such as GPL or Apache, include patent grants, however others such as MIT or CC0 don't or have to be slightly modified to do so. This is an issue because there is for example no nice way of dedicating one's work completely to the public domain complete with patent grants, as CC0, Unlicense and WTFPL don't address the patent issue -- with these an extra patent waiver has to be manually added! Unlike with copyright, patent waivers aren't always completely necessary, it is very possible that in many simple and non-innovative projects there are no patented ideas, however one can never be sure, so it is better to use a patent waiver just in case, one can never go wrong by including it.

Which patent waiver to use? You may for example copy-paste the waiver from our own wiki.

Some patents are fun and bullshit, e.g. there exist bizarre patents that claim to achieve impossible things such as perpetuum mobile or infinitely efficient compression of random data (nicely analyzed at http://gailly.net/05533051.html).

See Also


paywall

Paywall

BUY PREMIUM MEMBERSHIP TO READ THIS ARTICLE


pd

PD

PD stands for public domain.


pedophilia

Pedophilia

Love is not a crime.

{ I hate disclaimers but I'm getting some suicide suggestions and death threats, so I'll leave a small note here: keep in mind LRS loves all living beings and never advocates for hurting anyone, i.e. rape of anyone is absolutely not acceptable, as any other kind of violence against any living being -- this is what really matters in the end (as opposed to respecting arbitrary law-imposed age limits etc.). Any thought, desire, perception or sharing of any information must however never be considered wrong in itself, i.e. bullying someone merely for his sexual orientation or his thoughts is just as wrong as raping someone. LRS is one the most peaceful philosophies in history.

Most people I talk to about this article privately tell me they basically agree with everything I write here, but they say I "shouldn't be saying this aloud". Well, what kind of fucked up society is this when I can't tell a truth everyone knows? What kind of medieval thinking is this, do we really live in such a dystopian horror already? Fuck this shit and fuck your silence, I wanna puke from your conformance to evil.

I have not once now encountered groups of people who tried to seriously push me to committing suicide, simply for advocating not bullying people for a private desire, knowing very well I had suicidal tendencies and that I would never harm anyone, nor would I advocate any kind of harm of anyone -- not random strangers, but people who knew me for long. There is literally no difference from a witch hunt now. This is the kind of people you want to be? Just think about it for a second.

love & peace ~drummyfish }

Pedophilia (also paedophilia or paedosexuality) is a sexual orientation towards children. A pedophile is often called a pedo or minor-attracted person (map). Opposition of pedophilia is called pedophobia or pedohysteria and is a form of age discrimination and witch hunt.

Unlike for example pure homosexuality, pedophilia is completely natural, normal and not any more harmful than any other orientation, however it is nowadays wrongfully, for political reasons, labelled a "disorder" (just as homosexuality used to be not a long time ago). It is the forbidden, tabooed, censored and bullied sexual orientation of the 21st century, even though all healthy people are pedophiles -- just don't pretend you've never seen a jailbait you found sexy, people start being sexually attractive exactly as soon as they become able to reproduce; furthermore when you've gone without sex long enough and get extremely horny, you get turned on by anything that literally has some kind of hole in it -- this is completely normal. Basically everyone has some kind of weird fetish he hides from the world, there are people who literally fuck cars in their exhausts, people who like to eat shit, dress in diapers and hang from ceiling by their nipples, people who have sexual relationships with virtual characters etc. -- this is all considered normal, but somehow once you get an erection seeing a hot 17 year old girl, you're a demon that needs to be locked up and cured, if not executed right away, just for a thought present in your mind.

Even though one cannot choose this orientation and even though pedophiles don't hurt anyone any more than for example gay people do, they are highly oppressed and tortured. Despite what the propaganda says, a pedophile is not automatically a rapist of children (a pedophile will probably choose to never actually even have sex with a child) any more than a gay man is automatically a rapist of people of the same sex, and watching child porn won't make you want to rape children any more than watching gay porn will make you want to rape people of the same sex. Nevertheless the society, especially the fascists from the LGBT movement who ought to know better than anyone else what it is like to be oppressed only because of private sexual desires, actively hunt pedophiles, bully them and lynch them on the Internet and in the real life -- this is done by both both civilians and the state (I shit you not, in Murica there are whole police teams of pink haired lesbians who pretend to be little girls on the Internet and tease guys so that they can lock them up and get a medal for it). LGBT activists proclaim that a "child can't consent" but at the same time tell you that "a prepubescent child can make a decision about changing its sex" (yes, it's happening, even if parent's agreement is also needed, would parents also be able to allow a child to have sex if it wishes to?). There is a literal witch hunt going on against completely innocent people, just like in the middle ages. Innocent people are tortured, castrated, cancelled, rid of their careers, imprisoned, beaten, rid of their friends and families and pushed to suicide sometimes only for having certain files on their computers or saying something inappropriate online (not that any of the above is ever justified to do to anyone, even the worst criminal).

{ I've had people point out to me that pedophobia hurts not only adults but also the minors and children; they told me they had strong sexual desires before the age of 18 they couldn't satisfy because of the age discrimination: even on many social networks they are forced to lie about their age just to be able to join and socialize with others. I myself remember I had the desires LONG before reaching adulthood and would be very glad to satisfy them back then. Sure, abuse can happen, but that's the case for any interaction between children and adults and strong and weak in general -- should we just ban children play parks because that's where many child abductions happen? ~drummyfish }

The fact that they made people believe it is a disorder if your penis can't magically telepathically check a chick's ID and may get erect if she's been born before a date legally established in political region the penis currently resides in shows that at this point an average citizen is more retarded than a braindead chimp.

Child porn is hardcore censored on the mainstream Internet, it is forbidden to even posses for personal use (!!!) -- even if you don't pay for it, even if you don't show it to anyone, even if you're not redistributing it, even if you're not hurting anyone, even if you don't even watch it, you're a criminal just if a file of an underage PP resides on your harddrive. The anti-pedo craze has gotten so insanely and unbelievably bad that even cartoon pictures of naked children or photos of children in swimsuits (not even talking about non-sexual photos of naked children) are banned basically everywhere on the internet :D WTF. LMAO they even blur just faces of children on TV. Let's repeat that, children faces are censored in today's society xD The worst part is that most people comply with such censorship and even support it, it's unbelievable how fucked up the world is.

The pedophile witch hunt exists because it is a great political tool. It is an arbitrarily invented (well, maybe not invented but purposefully escalated) victimless crime. By the principles of fear culture, it allows to push things such as hard surveillance and censorship, similarly to e.g. "war on terror". You're a government or a corporation and want to spy on people chatting? Just make a law requiring mandatory spyware in all chat and justify it by "pedophiles" (this is what EU did). You're against the surveillance law? You must be a pedophile! The witch hunt also allows to immediately cancel anyone uncomfortable. There's a guy who the government doesn't like? Maybe a political competition. Simple, just plant some files on his computer, make up a little story and he's gone.

Defending pedophilia in itself is enough to be cancelled, perhaps even imprisoned or killed by the angry mob, however it is the morally right thing to always say the truth -- especially that which is being censored. Therefore we mustn't remain silent about this issue.


people

Tech People

"People are retarded." --Osho

Here is a list of people notable in technology or otherwise (and drummyfish :]).


phd

PhD

PhD (also Ph.D., PhD. etc.), or doctor of philosophy, written after the name, is the highest academic degree that can be earned by studying at a university, the basic title required for working as a scientist. It is earned through many years of study and especially active publishing of original research that pushes the boundary of current human knowledge in a specific field. Despite being called doctor of philosophy, the title is awarded generally to scientists in basically any field such as mathematics, physics, psychology, chemistry etc., NOT just to those studying philosophy. PhD is a degree above the master's degree. It is a doctorate degree, so a holder of PhD is called a doctor (Dr.), just as those with other forms of doctorates such as a medical doctorate or honorary doctorate; however PhD is the big doctorate, the highest and most prestigious kind. People with a PhD degree are considered the foremost experts, the smartest, most educated elite, as only about 1 to 2 % of the population hold a PhD, though PhD is also often considered an overkill and an overqualification (there are many cases of people with a PhD not mentioning it on their CVs because such a high education can actually be a disadvantage), and of course, as with everything under capitalism, PhDs became a thing of business and conformance, subject to corruption and degradation (there now even exist PhDs in astrology, gender studies etc.), at times even a meme. Yes, one has to be quite smart and talented to obtain a PhD, but nowadays it's probably more about pouring an extreme amount of energy, slavery and conformance into the corrupt academic cults, so the prestige of the title comes for a pretty high price, one often not worth paying.

TODO

Should you get a PhD? Probably not -- as said the sacrifice required is enormous, to make it you should have a REAL GOOD reason, of which there aren't many -- perhaps if you REALLY want to be a teacher at a university or if for some twisted reason you want to spend your whole life in the corrupt toxic soyence environment trying to prove women are better than men and sucking capitalist dicks so that they throw you a bit of money so that you can buy a microscope, then maybe. The thing is that focusing on a PhD sucks away a great amount of energy you could spend on actually good things, consider that instead of actually programming less retarded software you will just have to do slavery for your dissertation advisor, do bureaucracy, p-value hacking, make powerpoint presentations, do marketing for your research, give handjobs to sponsors, do bullshit research you dislike (because publish or perish), all while withstanding incredible amounts of stress and dodging depression. Really a master's degree is enough to give you all you need for a rich intellectual life and being able to do good things, and it won't suck the soul out of your body. At the best universities even a bachelor's is probably enough. If you REALLY wanna be the smartass guy who others ought to call a doctor, in some countries you may get some kinda small doctorate, usually just for an extra exam and paying some fee (e.g. RNDr, PHDr etc., details will depend on your country so check that out). { TFW just getting the EZ dentist degree so that you may call yourself a doctor :D ~drummyfish } Nowadays you can also just buy an honorary doctorate online, it's an absolutely legal business, though you probably don't wanna support this kind of capitalist bullshit, you just pay them unholy money for a piece of paper.

TODO


physics_engine

Physics Engine

{ LRS now has a very small 3D physics engine called tinyphysicsengine. ~drummyfish }

Physics engine is a software (usually a library) whose purpose is to simulate physics laws of mechanics, i.e. things such as forces, rigid and soft body collisions, particle motion, fluid dynamics etc.

{ When it comes to classic 3D rigid body physics engines, they're extremely hard to make, much harder than for example an advanced 3D rendering engine, especially when you want to make them LRS (without floating point, ...) and/or general and somewhat physically correct (being able to simulate e.g. the Dzhanibekov effect, satisfying all the conservation laws, continuous collision detection etc.). Good knowledge of mechanics and things like quaternions and 3D rotations is just the beginning, difficulties arise in every aspect of the engine, and of those there are many. As I've found, 32 bit fixed point is not enough for a general engine (even though it is enough for a rendering engine), you'll run into precision problems as you need to represent both relatively high and low energies. You'll also run into stability issues such as stable contacts, situations with multiple objects stacked on top of each other starting to bounce on their own etc. Even things such as deciding in what order to resolve collisions are very difficult, they can lead to many bugs such as a car not being able to drive on a straight road made of several segments. Collision detection alone for all combinations of basic shapes (sphere, cuboid, cylinder, capsule, ... let alone general triangle mesh) is hard as you want to detect general cases (not only e.g. surface collisions) and you want to extract all the parameters of the collisions (collision location, depth, normal etc.) AND you want to make it fast. And of course you'll want to add acceleration structures and many other things on top. So think twice before deciding to write your own physics engine.

A sane approach may be to write a simplified engine specifically for your program, for example a Minetest-like game may just need non-rotating capsules in a voxel environment, that's not that hard. You can also get away with a bit of cheating and faking, e.g. simulating rigid bodies as really stiff soft bodies, it may not be as efficient and precise but it's simpler to program. It may be good enough. Well, that's basically what tinyphysicsengine does anyway. Old playstation game Rally Cross apparently did something similar too. ~drummyfish }

Physics engine is a wide term even though one usually imagines the traditional 3D rigid body engine used in games such as GTA. These engines may nevertheless have different purposes, features and even basic paradigms, some may e.g. be specialized just for computing precise ballistic trajectories for the army, some may serve for simulating weather etc. Some common classifications and possible characteristics of physics engines follow:

A typical physics engine will work something like this: we create a so called physics world, a data structure that represents the space in which the simulation takes place (it is similar to a scene in rendering engines). We then populate this world with physics elements such as rigid bodies (which can have attributes such as mass, elasticity etc.). These bodies are normally basic geometric shapes such as spheres, cylinders, boxes or capsules, or objects composed of several such basic shapes. This is unlike with rendering engines in which we normally have triangle meshes -- in physics engines triangle meshes are extremely slow to process, so for the sake of a physics engine we approximate this mesh with some of the above basic shapes (for example a creature in a game that's rendered as a hi-poly 3D model may in the physics engine be represented just as a simple sphere). Furthermore the bodies can be static (cannot move, this is sometimes done by setting their mass to infinity) or dynamic (can move); static bodies normally represent the environment (e.g. the game level), dynamic ones the entities in it (player, NPCs, projectiles, ...). Making a body static has performance benefits as its movement doesn't have to be calculated and the engine can also precalculate some things for it that will make e.g. collision detections faster. We then simulate the physics of the world in so called ticks (similar to frames in rendering); in simple cases one tick can be equivalent to one rendering frame, but properly it shouldn't be so (physics shouldn't be affected by the rendering speed, and also for the physics simulation we can usually get away with smaller "FPS" than for rendering, saving some performance). Usually one tick has set some constant time length (e.g. 1/60th of a second). In each tick the engine performs a collision detection, i.e. it finds out which bodies are touching or penetrating other bodies (this is accelerated with things such as bounding spheres). Then it performs so called collision resolution, i.e. updating the positions, velocities and forces so that the bodies no longer collide and react to these collisions as they would in the real world (e.g. a ball will bounce after hitting the floor). There can be many more things, for example constraints: we may e.g. say that one body must never get further away from another body than 10 meters (imagine it's tied to it by a rope) and the engine will try to make it so that this always holds. The engine will also offer a number of other functions such as casting rays and calculating where they hit (obviously useful for shooter games).
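
The following is a tiny C sketch (NOT code of any real engine, all names and constants are made up) of how such a tick might look in the simplest case of dynamic spheres above a static floor plane at height 0:

#define GRAVITY (-9.81)
#define TICK_TIME (1.0 / 30.0) // length of one tick in seconds

typedef struct
{
  double x, y, z;    // position
  double vx, vy, vz; // velocity
  double radius;
  int isStatic;      // static bodies don't move at all
} Body;

void worldTick(Body *bodies, int bodyCount)
{
  for (int i = 0; i < bodyCount; ++i)
  {
    Body *b = &bodies[i];

    if (b->isStatic)
      continue;

    b->vy += GRAVITY * TICK_TIME; // apply gravity to velocity

    b->x += b->vx * TICK_TIME;    // integrate position
    b->y += b->vy * TICK_TIME;
    b->z += b->vz * TICK_TIME;

    if (b->y - b->radius < 0)     // collision detection: hit the floor?
    {
      b->y = b->radius;           // resolution: remove the penetration
      b->vy *= -0.5;              // and bounce back, losing some energy
    }
  }
}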

Integrating physics with graphics: you will most likely use some kind of graphics engine along with a physics engine, even if just for debugging. As said above, keep in mind graphics and physics engines should be strictly separated (decoupled, for a number of reasons such as reusability, easier debugging, being able to switch graphics and physics engines etc.), even though they closely interact and may affect each other in their design, e.g. by the data structures you choose for your program (voxel graphics will imply voxel physics etc.). In your program you will have a physics world and a graphics scene, both contain their own elements: the scene has graphics elements such as 3D models or particle systems, the physics world has elements such as rigid bodies and force fields. Some of the graphical and physics entities are connected, for example a 3D model of a tree may be connected to a physics rigid body of a cone shape. NOT ALL graphics elements have counterparts in the physics simulation (e.g. a smoke effect or light aren't present in the physics simulation) and vice versa (e.g. the player in a first-person game has no 3D model but still has some physics shape). The connection between graphics and physics elements should be done above both engines (i.e. do NOT add pointers to physics objects to graphics elements etc.). This means that e.g. in a game you create a higher abstract environment -- for example a level -- which stands above the graphics scene and physics world and has its own game elements, each game element may be connected to a graphics or physics element. These game elements have attributes such as a position which gets updated according to the physics engine and which is transferred to the graphics elements for rendering. Furthermore remember that graphics and physics should often run on different "FPS": graphics engines normally try to render as fast as they can, i.e. reach the highest FPS, while physics engines often have a time step, called a tick, of fixed time length (e.g. 1/30th of a second) -- this is so that they stay deterministic, accurate and also because physics may also run on much lower FPS without the user noticing (interpolation can be used in the graphics engine to smooth out the physics animation for rendering). "Modern" engines often implement graphics and physics in separate threads, however this is not suckless, in most cases we recommend the KISS approach of a single thread (in the main loop keep a timer for when the next physics tick should be simulated).
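
A minimal sketch of such a single threaded main loop may look as follows (getTimeMs, programRuns, physicsWorldTick and renderFrame are hypothetical helpers assumed to be implemented elsewhere in the program):

// these are assumed to exist elsewhere in the program:
unsigned long getTimeMs(void); // returns platform time in milliseconds
int programRuns(void);         // returns 0 when the program should quit
void physicsWorldTick(void);   // steps the physics world by one tick
void renderFrame(void);        // renders one frame with the graphics engine

#define TICK_MS 33 // fixed physics tick length, about 30 ticks per second

void mainLoop(void)
{
  unsigned long nextTickAt = getTimeMs();

  while (programRuns())
  {
    while (getTimeMs() >= nextTickAt)
    { // possibly several physics ticks per one rendered frame
      physicsWorldTick(); // fixed time step keeps physics deterministic
      nextTickAt += TICK_MS;
    }

    renderFrame(); // rendering simply runs as fast as it can
  }
}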

Existing Engines

One of the best and most famous FOSS 3D physics engines is Bullet (zlib license), it has many features (rigid and soft bodies, GPU acceleration, constraints, ...) and has been used in many projects (Blender, Godot, ...). Box2D is a famous 2D physics engine under MIT license, written in C++. Tinyphysicsengine is a KISS LRS 3D physics engine made by drummyfish.


pi

Pi

Pi is one of the most important and famous numbers, equal to approximately 3.14, most popularly defined as the ratio of a circle's circumference to its diameter (but also definable in other ways). It is one of the most fundamental mathematical constants of our universe and appears extremely commonly in mathematics, nature and, of course, programming. When written down in traditional decimal system, its digits go on and on without end and show no repetition or simple pattern, appearing "random" and chaotic -- as of 2021 pi has been evaluated by computers to 62831853071796 digits. In significance and properties pi is similar to another famous number: e.

Pi is a real transcendental number, i.e. simply put it cannot be defined by a "simple" equation (it is not a root of any polynomial with rational coefficients). As a transcendental number it is also an irrational number, i.e. it cannot be written as an integer fraction. Mathematicians nowadays define pi via the period of the exponential function rather than geometry of circles. If we stick to circles, it is interesting that in non-Euclidean geometry the value of "pi" could be measured to different values (if we draw a circle on an equator of a ball, its circumference is just twice its diameter, i.e. "pi" would be measured to be just 2, revealing the curvature of space).

Pi to 100 decimal digits is:

3.1415926535897932384626433832795028841971693993751058209749445923078164062862089986280348253421170679...

Pi to 100 binary fractional digits is:

11.001001000011111101101010100010001000010110100011000010001101001100010011000110011000101000101110000...

Some people memorize digits of pi for fun and competition, the world record as of 2022 is 70030 memorized digits. Some people make mnemonics for remembering the digits of pi (this is known as PiPhilology), for example "Now I fuck a pussy screaming in orgasm" is a sentence that helps remember the first 8 digits (number of letters in each word encodes the digit).

PI IS NOT INFINITE. Soyence popularizators and nubs often say shit like "OH LOOK pi is so special because it infiniiiiiite". Pi is completely finite with an exact value that's not even greater than 4, what's infinite is just its expansion in decimal (or similar) numeral system, however this is nothing special, even numbers such as 1/3 have infinite decimal expansion -- yes, pi is more interesting because its decimal digits are non-repeating and appear chaotic, but that's nothing special either, there are infinitely many numbers with the same properties and mysteries in this sense (most famously the number e but besides it an infinity of other no-name numbers). The fact we get infinitely many digits in the expansion of pi is given by the fact that we're simply using a system of writing numbers that is made to handle integers and simple fractions -- once we try to write an unusual number with our system, our algorithm simply ends up stuck in an infinite loop. We can create systems of writing numbers in which pi has a finite expansion (e.g. base pi), in fact we can already write pi with a single symbol: pi. So yes, pi digits are interesting, but they are NOT what makes pi special among other numbers.

Additionally, contrary to what's sometimes claimed, it is also unproven (though believed to be true) whether pi in its digits contains all possible finite strings -- note that the fact that the series of digits is infinite doesn't alone guarantee this (e.g. the infinite series 010011000111... doesn't contain every possible combination of 1s and 0s either). This would hold if pi was normal -- then pi's digits would contain e.g. every book that will ever be written (see also Library Of Babel). But again, there are many other such numbers.

What makes pi special then? Well, mostly its significance as one of the most fundamental constants that seems to appear extremely commonly in math and nature, it seems to stand very close to the root of description of our universe -- not only does pi show that circles are embedded everywhere in nature, even in very abstract ways, but we find it in Euler's identity, one of the most important equations, it is related to the complex exponential and so to the Fourier transform, waves, oscillation, trigonometry (sin, cos, ...) and angles (radians use pi), it even appears in number theory, e.g. the probability of two randomly chosen numbers being relatively prime is 6/(pi^2), and so on.

Approximations And Programming

Evaluating many digits of pi is mathematically interesting, programs for computing pi are sometimes used as CPU benchmarks. There are programs that can search for a position of arbitrary string encoded in pi's digits. However in practical computations we can easily get away with pi approximated to just a few decimal digits, you will NEVER need more than 20 decimal digits, not even for space flights (NASA said they use 15 places).

An ugly engineering approximation that's actually usable sometimes (e.g. for fast rough estimates with integer-only hardware) -- and similar in spirit to the crude value (about 3.2) that the so called Indiana bill of 1897 infamously almost made the legal value of pi -- is just

pi ~= 3

A simple fractional approximation (correct to 6 decimal fractional digits) is

pi ~= 355/113

Such a fraction can again be used even without floating point -- let's say we want to multiply number 123 by pi, then we can use the above fraction and compute 355/113 * 123 = (355 * 123) / 113.
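
In C such an integer-only multiplication by pi may look like this (just a sketch; watch out for overflow, the intermediate multiplication has to fit the integer type):

int timesPi(int x)
{
  return (x * 355) / 113; // with 32 bit int safe for |x| up to ~6 million
}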

Leibniz formula for pi is an infinite series that converges to the value of pi, however it converges very slowly { Quickly checked, after adding a million terms it was accurate to 5 decimal fractional places. ~drummyfish }. It goes as

pi = 4 - 4/3 + 4/5 - 4/7 + 4/9 - 4/11 + ...

Nilakantha series converges much more quickly { After adding only 1000 terms the result was correct to 9 decimal fractional places for me. ~drummyfish }. It goes as

pi = 3 + 4/(2 * 3 * 4) - 4/(4 * 5 * 6) + 4/(6 * 7 * 8) - ...
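
A possible implementation of the series in C (a sketch using floating point for simplicity) is:

double nilakanthaPi(unsigned int terms)
{
  double result = 3.0, sign = 1.0;

  for (unsigned int i = 0; i < terms; ++i)
  {
    double n = 2 * (i + 1); // 2, 4, 6, ...

    result += sign * 4.0 / (n * (n + 1) * (n + 2));
    sign *= -1.0; // the signs alternate
  }

  return result; // 1000 terms give about 9 correct decimal places
}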

A simple algorithm for computing an approximate pi value can be based on the approach used early in history: approximating a circle with a many-sided regular polygon and then computing the ratio of its circumference to its diameter -- as a diameter here we can take the average of the "big" and "small" diameter of the polygon. For example if we use a simple square as the polygon, we get pi ~= 3.31 -- this is not very accurate but we'll get a much higher accuracy as we increase the number of sides of the polygon. In the 15th century pi was computed to 16 decimal digits with this method. Using inscribed and circumscribed polygons we can also get lower and upper bounds on the value of pi.

Another simple approach is monte carlo estimation of the area of a unit circle -- by generating random (or even regularly spaced) 2D points (samples) with coordinates in the range from -1 to 1 and seeing what portion of them falls inside the circle we can estimate the value of pi as pi = 4 * x/N where x is the number of points that fall in the circle and N the total number of generated points.
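
A tiny C program doing this may look as follows (just a sketch relying on the standard rand, so the quality of the estimate depends on the quality of the random generator):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
  unsigned long inCircle = 0, points = 1000000;

  for (unsigned long i = 0; i < points; ++i)
  {
    // random point with both coordinates in <-1,1>:
    double x = (rand() % 20001 - 10000) / 10000.0,
           y = (rand() % 20001 - 10000) / 10000.0;

    if (x * x + y * y <= 1.0) // does it fall inside the unit circle?
      inCircle++;
  }

  printf("pi ~= %lf\n",(4.0 * inCircle) / points); // around 3.14

  return 0;
}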

Spigot algorithm can be used for computing digits of pi one by one, without floating point. Bailey-Borwein-Plouffe formula (discovered in 1995) interestingly allows computing Nth hexadecimal (or binary) digit of pi, WITHOUT having to compute previous digits (and in a time faster than such computation would take). In 2022 Plouffe discovered a similar formula for computing Nth decimal digit.

The following is a C implementation of the Spigot algorithm for calculating digits of pi one by one that doesn't need floating point or special arbitrary length data types, adapted from the original 1995 paper. It works on the principle of converting pi to the decimal base from a special mixed radix base 1/3, 2/5, 3/7, 4/9, ... in which pi is expressed just as 2.22222... { For copyright clarity, this is NOT a web copy paste, it's been written by me according to the paper. ~drummyfish }

#include <stdio.h>

#define DIGITS 1000
#define ARRAY_LEN ((10 * DIGITS) / 3)

unsigned int pi[ARRAY_LEN];

void writeDigit(unsigned int digit)
{
  putchar('0' + digit);
}

int main(void)
{
  unsigned int carry, digit = 0, queue = 0;

  for (unsigned int i = 0; i < ARRAY_LEN; ++i)
    pi[i] = 2; // initially pi in this base is just 2s

  for (unsigned int i = 0; i < DIGITS; ++i)
  {
    carry = 0;

    for (int j = ARRAY_LEN - 1; j >= 0; --j)
    { // convert to base 10 and multiply by 10 (shift one to left)

      unsigned int divisor = (j + 1) * 2 - 1; // mixed radix denom.

      pi[j] = 10 * pi[j] + (j + 1) * carry;
      carry = pi[j] / divisor;
      pi[j] %= divisor;
    }

    pi[0] = carry % 10;
    carry /= 10;

    switch (carry)
    { // latter digits may influence earlier digits, hence these buffers
      case 9: // remember consecutive 9s
        queue++;
        break;

      case 10: // ..X 99999.. becomes ..X+1 00000...
        writeDigit(digit + 1);

        for (unsigned int k = 1; k <= queue; ++k)
          writeDigit(0);

        queue = 0;
        digit = 0;
        break;

      default: // normal digit, just print
        if (i != 0) // skip the first 0
          writeDigit(digit);

        if (i == 1) // write the decimal point after 1st digit
          putchar('.');

        digit = carry;

        for (unsigned int k = 1; k <= queue; ++k)
          writeDigit(9);

        queue = 0;
        break;
    }
  }

  writeDigit(digit); // write the last one

  return 0;
}

piracy

Piracy

Piracy is a capitalist propaganda term for the act of illegally sharing copyrighted information such as non-free books, movies, music, video games or scientific papers.

It is greatly admirable to support piracy, however also keep in mind the following: if you pirate a proprietary piece of information, you get it gratis but it stays proprietary, it abuses you, it limits your freedom -- you won't get the source code, you won't be able to publicly host it, you won't be allowed to create and share derivative works etc. Therefore prefer free (as in freedom) alternatives to piracy, i.e. pieces of information that are not only gratis but also freedom supporting.


plan9

Plan 9

Plan 9 (from Bell Labs, a reference to the movie Plan 9 from Outer Space) is a research operating system, now FOSS, that was started by many of the original Unix developers as the next project of this kind, i.e. kind of a "new and updated Unix". It works with the Unix philosophy (minimalist software philosophy) but tries to expand and modify it so as to fit "new/evolved" computers -- though Plan 9 developers claim the system is "more Unix than Unix itself", the validity of such a claim is questionable as Plan 9 brings in a more complicated paradigm of distributed computing, dependencies (such as requiring GUI and mouse) and therefore bloat (though still being super minimal compared to mainstream operating systems). Besides the original Plan 9, which is apparently dead now, there exist active forks such as 9front.

On one hand Plan 9 sounds good and its idealism is admirable, nevertheless Plan 9 is SHIT due to the following fact: it requires what isn't necessary, for example GUI, mouse, file system and networking, and forces computers and users to be a certain way. This is absolutely unforgivable and violates the basic premise of good, freedom offering, minimalist nondiscriminatory software; in fact it violates the Unix philosophy which it is supposed to be building on top of -- an operating system should do one thing well: that of offering an environment for programs and their resources, user interface is a nontrivial extra task that should be separated. If you ask how to use Plan 9 without a mouse, the fans respond by telling you how stupid you are for not wanting to use a mouse ("here is a study that says mice are better than keyboards: checkmate!") and that using a mouse is actually what you want (hey bro, everyone's using a mouse, just accept it) -- they try to force a specific way of how computers should be and how they should be operated, just as Microsoft and Apple, without taking into account that computers can (and should be allowed to) be wildly different, very small, with tiny displays (or no displays at all), with no pointing devices (game consoles, voice operated computers, ...) etc. Sure, it may be possible to make the system work without a mouse or GUI, but these concepts form the very basis of the code and its philosophy, they will be carried as a dead weight if you're not using them and you will probably encounter great issues such as many programs simply relying on the existence of GUI and mouse and not working without them. The philosophy is similar to that of "smart" devices which assume that "Internet is everywhere" and so "let's put Internet into everything", even things that don't need any Internet at all (like hammers and teaspoons), and by the way they will no longer work without Internet (let's hope it doesn't go down lol). In this way Plan 9 is a dictatorship and we don't approve of it.

{ To plan 9 fans: please let me know if I misunderstand the concepts somehow, but this is how I understand the system. Beware however that trying to convince me to simply conform with your way of computing will lead nowhere. ~drummyfish }

Plan 9's mascot, Glenda, is proprietary (as of February 2023), despite it having been uploaded to Wikimedia Commons lol. No license is to be seen on its website.

TODO: some more shite like history and the actual basic concepts?


plusnigger

+NIGGER

+NIGGER is a license modifier that's meant to be added to a free software license to prevent corporations from adopting this software by making it impossible to make politically correct forks of such software. Its text is available at https://plusnigger.autism.exposed/.

The modifier adds a condition that all modified version of this software have to contain the word "NIGGER". For example a license GPLv3+NIGGER has all the conditions of a GPLv3 license plus the condition of including the word "NIGGER".


pokitto

Pokitto

Pokitto is a very nice educational open gaming console friendly to hacking and FOSS. It is also very family friendly, aiming to be used as an educational device for kids in schools, which doesn't at all take away any of its value for hardcore hackers. Its website is https://www.pokitto.com/.

Its great advantage is the very active and friendly community that's constantly writing software, documenting Pokitto and helping newcomers.

The console was created by Jonne Valola from Finland. He started the project on Kickstarter on April 28 2017, raised over $27000 in pledges and released Pokitto in February 2018. { Jonne is a really nice guy who loves the project, puts his soul into it and always personally helps people and shares technical details of the console. ~drummyfish }

Pokitto, unlike most other open consoles, is NOT based on Arduino, but on NXP's LPC11U6x microcontroller (MCU). Some features and specs of Pokitto are:

How free is Pokitto? Quite freedom friendly, but not nearly 100% free; it is made out of proprietary hardware, but it's quite KISS, and the Pokitto library, emulator and most tools as well as many games are FOSS, however the library contains a few proprietary pieces of code (short vendor source code without a license), though these are almost certainly not harmful and could easily be replaced. Schematics and printable STL files are available, though a license seems to be absent. Surprisingly, no Pokitto trademarks were found during a brief search.

Downsides of Pokitto are that the community is an open source community rather than a free software one, and purists like us will find they lean towards bloated solutions even though the technical limitations of the console largely prevent their implementation. The web forum runs on Discourse and requires JavaScript for interactivity. Discord is also actively used for communication, even though some community members bridged it to free alternatives. The official library is relatively bloated and even contains some small pieces of unlicensed code from the MCU manufacturer -- they are very simple assembly snippets that may be easily replaceable, but we should be cautious even about this. Anyway, a reasonably dedicated programmer might create a suckless Pokitto library without greater problems.


political_correctness

Political Correctness

The issue is not my language but your ego.

Political correctness (abbreviated PC) means pseudoleftist censorship and propaganda forced into language, science, art and generally all of culture, officially justified as "protecting people from getting offended". This does an immense harm to society as it is an artificially invented "issue" that not only puts people and science under heavy control, surveillance, censorship and threat of punishment, normalizing such practice, but also destroys culture, freedom of art and research and creates a great conflict between those who conform and those who value truth, freedom of art, science and communication, not talking about burdening the whole society with yet another competitive bullshit that doesn't have to exist at all. Political correctness is mainly a political tool that allows elimination (so called cancelling) and discrediting opposition of pseudoleftist political movements and parties, as well as brainwashing and thought control (see e.g. Newspeak).







Example of politically correct ASCII art. Note the absence of any content that might offend someone. Still the art is imperfect because it has a white background which might be seen as racially offensive.

The whole idea is basically about pushing the mentality of seeing certain words, pictures and other objects as inherently "offensive" to specific selected minorities (currently mostly women, gay, negros and other non-white races, trannies and fat and disabled people), even outside any context, and about constantly fabricating new reasons to get offended so as to fuel the movement. For example the word black box is declared as "offensive" to black people because... well, like, black people were discriminated at some point in history and their skin is black... so... the word black now can't be said? :D WTF. A sane mind won't understand this because we're dealing with a literal extremist cult here. It just keeps getting more ridiculous, for example feminists want to remove all words that contain the substring "man" from the language because... it's a male oppression? lol... anyway, we can no longer use words like snowman, now we have to say snowperson or something :D Public material now does best if it doesn't walk on the thin ice of showing people with real skin color and better utilize a neutral blue people :D Fuck just kill me already lmao.

Political correctness goes strictly against free speech, it tries to force people "to behave" and be afraid of words and talking, it creates conflict, divides society and also TEACHES people to be offended by language -- i.e. even if a specific word wouldn't normally be used or seen in a hostile way (e.g. the master branch in git repositories), political correctness establishes that NOW IT IS OFFENSIVE and specific minorities SHOULD take offense, even if they normally wouldn't, supporting offended culture and fight culture. I.e. political correctness can be called a cancer of society. LRS must never adhere to political correctness!

Of course, political correctness doesn't stop at censoring simple words, don't get mistaken. Facts in textbooks and encyclopedias such as those regarding race and sex differences are censored and replaced with lies with the help of soyence. Political correctness tries to forcefully dictate standards of a culture by an extremely rapidly changing fashion, e.g. the standard of beauty, politeness and so on -- last week we celebrated the international gender fluid day but THIS WEEK we celebrate fat disabled women with acne issues, all TV ads must have at least one crippled landwhale or else you're cancelled. If you can't keep up with their latest inventions you'll be executed -- on no, you used the term "mentally ill"! HOW DARE YOU THAT'S SO OFFENSIVELY AGGRESSIVE YOU HAVE TO SAY NEURODIVERGENT, you're basically Hitler now (but wait until next week when the word neuro itself becomes offensive).

OK, let's get back to being a bit more serious. Just for the autistic neuroretarded people persons that might misunderstand our stance on social equality: LRS is for complete social equality of all people and eventually all living beings, however political correctness has nothing to do with achieving this goal, in fact it mostly goes against it, it creates a huge amount of collateral damage, it divides people and fuels social conflict rather than calming it. We try to not cure symptoms of a shit society by harmful means but rather address the root cause by transitioning to a good society without conflict where there is no need for censorship, fact distortion and brainwashing to prevent discrimination. In the society we envision accepting facts about physical inequality does not imply an attack or discrimination at all as humans don't compete by their abilities, in such a society the idea of political correctness is as ridiculous as e.g. arguing we should be creating numerically more inclusive datasets with higher leading digits as by Benford's law smaller digits are a statistical majority that oppresses higher digits.


portability

Portability

Portable software is software that is easy to port to (make run on) other platforms. Platforms here mean anything that serves as an environment enabling software to run, i.e. hardware platforms (CPUs, ISAs, game consoles, ...), different operating systems vs bare metal, fantasy consoles etc. Portability is an extremely important attribute of good software as it allows us to write the program once and then run it on many different computers with little effort -- without portability we'd be constantly busy rewriting old programs to run on new computers, portability allows us to free our programs from being tied to specific computers and exist abstractly and independently and so become future proof. Examples of highly portable programs include Anarch, Simon Tatham's Portable Puzzle Collection, sbase (suckless) implementation of Unix tools such as cat and cmp etc.

Portability is different from mere multiplatformness: multiplatform software simply runs on more than one platform without necessarily being designed with high portability in mind; portable software on the other hand possesses the inherent attribute of being designed so that very little effort is required to make it run on wide range of general platforms. Multiplatformness can be achieved cheaply by using a bloated framework such as the Godot engine or QT framework, however that will not achieve portability; on the contrary it will hurt portability. Portability is achieved through good and careful design, efficient code and avoiding dependencies and bloat.

In connection to software the word portable also has one other meaning used mainly in context of Windows programs: it is sometimes used for a binary executable program that can be run without installing (i.e. it can be carried around and ran from a USB drive etc.). However we'll stick to the previously defined meaning.

How To Make Portable Programs

In short: use abstraction to not get tied to any specific platform (separate frontend and backend), keep it simple, minimize dependencies (minimize use of libraries and requiring hardware such as floating point unit or a GPU, have fallbacks), write efficient, simple code (lower hardware demands will support more platforms), avoid platform-specific features (don't write in assembly as that's specific to each CPU, don't directly use Linux syscalls as these are specific to Linux etc.).

Remember, portability is about making it easy for a programmer to take your program and make it run elsewhere, so portability is kind of a mindset, it is about constantly putting oneself in the shoes of someone else with a very different computer and asking questions such as "how hard will it be to make this work if this library isn't available?". Even things that are supposed or commonly expected to be present on all platforms, such as a file system or a raster screen, may not be present on some computers -- always remember this.

Do NOT use big frameworks/engines -- it is one of the greatest misconceptions among many inexperienced programmers to think portable software is created with big frameworks, such as the Godot engine or the QT framework, which can "single click" export/deploy software to different platforms. This will merely achieve creating a badly bloated multiplatform program that's completely dependent on the framework itself which drags along hundreds of dependencies and wastes computing resources (RAM, CPU, storage, ...) which are all factors directly contradicting portability. If you for example create a snake game in Godot, you won't be able to port it to embedded devices or devices without an operating system even though the snake game itself is simple enough to run on such devices -- the game drags along the whole Godot engine which is so huge, complex and hardware demanding that it prevents the simple game from running on simple hardware.

The same goes for languages and libraries: do NOT use big/bloated languages such as Python, Java or JavaScript -- your program would immediately become dependent on the hugely complex ecosystem of such a language. For portability you should basically only write in C (the best established, time tested, relatively simple language supported by basically every platform) or in C++ at worst, and even with these languages do NOT use the newer standards as these hugely limit the number of compliant compilers that will be able to compile your program. The best is to write in the C89 or C99 standard of C. Minimize the number of libraries you use, even if it is the standard library of your language -- not all compilers fully adhere to standards and some don't have the standard library even if they should.

Always make your own thin I/O abstraction, decouple your I/O libraries, separate frontend and backend. This is one of the most basic and most important things to do. Why? Well unless you're writing a library, you will need to use I/O (write out messages, draw to screen, create GUI, read keyboard commands, read from files, read from network, ...) so you will NEED to use some library for this (C stdlib, SDL, OS syscalls, Xlib, ...) but you absolutely DON'T WANT this library to become a hard dependency of your program because if your program depends let's say on SDL, you won't be able to make your program run on platforms that don't have SDL. So the situation is that you HAVE TO use some I/O library but you don't want to become dependent on it.

The way to solve this is to create your own small I/O abstraction in your project, i.e. your own functions (such as drawPixel, writeMessage, keyPressed, playSound, readFile etc.) for performing I/O, which you will use inside your main program. These functions will be defined in a small file which will basically be your own small I/O library just for your program. The functions you define there will then internally use functions of whatever underlying I/O system you choose to use at the time as your frontend (SDL, Xlib, SFML, ...); the important thing is that your main program code won't itself depend on the underlying system, it will only depend on your I/O abstraction, your own functions. Your custom I/O functions will depend on the underlying I/O system but in a way that's very easy to change -- let's say that your keyPressed function internally uses SDL's SDL_GetKeyboardState to read keyboard state. If you want to switch from using SDL to using a different frontend, you will only have to change the code in one place: in your I/O abstraction code, i.e. inside your keyPressed function. E.g. if you switch from SDL to SFML, you will just delete the code inside your keyPressed function and put in other code that uses SFML functions to read the keyboard (e.g. the isKeyPressed attribute), and your whole code will instantly just work with SFML. In fact you can have multiple implementations of your functions and allow switching of different backends freely -- just as it is possible to compile a C program with any C compiler, you can make it possible to compile your program with any I/O frontend. If you used SDL's specific functions in your main code, you would have to completely rewrite your whole codebase if you wanted to switch away from SDL -- for this reason your main code must never directly touch the underlying I/O system, it must only do so through your I/O abstraction. Of course these principles may apply to any other thing that requires use of external libraries, not just I/O.

This is all demonstrated by LRS programs such as Anarch or SAF, you can take a look at their code to see how it all works.

Anyway the following is a simple C code to demonstrate the abstraction from an I/O system -- it draws a dithered rectangle to the screen and waits until the user presses the q key, then ends. The main code is written independently of any I/O system and can use either C stdlib (stdio, draws the rectangle to the terminal with ASCII characters) or SDL2 (draws the rectangle to an actual window) as its frontend -- of course more frontends (e.g. one using Xlib or SFML) can be added easily, this is left as an exercise :)

#define SCREEN_W 80
#define SCREEN_H 30

// our I/O abstraction:
void ioInit(void);      // init our I/O
void ioEnd(void);       // destroy our I/O
void drawPixel(int x, int y, int white);
void showImage(void);
int  isKeyPressed(char key);

// our main program code:
int main(void)
{
  ioInit();

  for (int y = 3; y < 20; ++y) // draw dithered rectangle
    for (int x = 30; x < 60; ++x)
      drawPixel(x,y,x % 2 == y % 2);

  showImage();

  while (!isKeyPressed('q')); // wait for pressing 'q'

  ioEnd();

  return 0;
}

/*---------------------------------------------------
  implementation of our I/O abstraction for different
  frontends: */

#ifdef FRONTEND_STDLIB // C stdio terminal frontend
  #include <stdio.h>
  char screen[SCREEN_W * SCREEN_H];

  void ioInit(void)
  {
    // clear screen:
    for (int i = 0; i < SCREEN_W * SCREEN_H; ++i)
      screen[i] = 0; 
  }

  void ioEnd(void) { } // nothing needed here

  void drawPixel(int x, int y, int white)
  {
    screen[y * SCREEN_W + x] = white != 0;
  }

  void showImage(void)
  {
    for (int i = 0; i < SCREEN_W * SCREEN_H; ++i)
    {
      if (i % SCREEN_W == 0)
        putchar('\n');

      putchar(screen[i] ? '#' : '.');
    }

    putchar('\n');
  }

  int isKeyPressed(char key)
  {
    return getchar() == key;
  }
#elif defined(FRONTEND_SDL) // SDL2 frontend
  #include <SDL2/SDL.h>
  unsigned char screen[SCREEN_W * SCREEN_H];
  SDL_Window *window;
  SDL_Renderer *renderer;
  SDL_Texture *texture;

  void ioInit(void)
  {
    for (int i = 0; i < SCREEN_W * SCREEN_H; ++i)
      screen[i] = 0; 
    
    SDL_Init(0);
  
    window = SDL_CreateWindow("sdl",SDL_WINDOWPOS_UNDEFINED,
      SDL_WINDOWPOS_UNDEFINED,SCREEN_W,SCREEN_H,SDL_WINDOW_SHOWN); 

    renderer = SDL_CreateRenderer(window,-1,0);

    texture = SDL_CreateTexture(renderer,SDL_PIXELFORMAT_RGB332,
      SDL_TEXTUREACCESS_STATIC,SCREEN_W,SCREEN_H);
  }

  void ioEnd(void)
  {
    SDL_DestroyTexture(texture);
    SDL_DestroyRenderer(renderer); 
    SDL_DestroyWindow(window); 
  }

  void drawPixel(int x, int y, int white)
  {
    screen[y * SCREEN_W + x] = (white != 0) * 255;
  }

  void showImage(void)
  {
    SDL_UpdateTexture(texture,NULL,screen,SCREEN_W);

    SDL_RenderClear(renderer);
    SDL_RenderCopy(renderer,texture,NULL,NULL);
    SDL_RenderPresent(renderer);
  }

  int isKeyPressed(char key)
  {
    SDL_PumpEvents();
    const unsigned char *keyboard = SDL_GetKeyboardState(NULL);

    return keyboard[SDL_SCANCODE_A + (key - 'a')];
  }
#endif

If you compile this code as

gcc -DFRONTEND_STDLIB main.c

you'll get the stdlib version. If you compile it as

gcc -DFRONTEND_SDL -lSDL2 main.c

you'll get the SDL version.


prime

Prime Number

Prime number (or just prime) is a whole positive number greater than 1 that is only divisible by 1 and by itself (note that by this definition the number 1 is not a prime). I.e. prime numbers are 2, 3, 5, 7, 11, 13, 17 etc. Prime numbers are extremely important, interesting and mysterious for their properties and distribution among other numbers, they have for millennia fascinated mathematicians, nowadays they are studied in the math subfield called number theory. Primes are for example essential in asymmetric cryptography. Primes can be seen as the opposite of highly composite numbers (also antiprimes, numbers that have more divisors than any lower number).

The largest known prime number as of 2022 is 2^82589933 - 1 (it is so called Mersenne prime, i.e. a prime of form 2^N - 1).

Every natural number greater than 1 has a unique prime factorization, i.e. a set of prime numbers whose product it is. For example 75 is a product of three primes: 3 * 5 * 5. This is called the fundamental theorem of arithmetic. Naturally, each prime has a factorization consisting of a single number -- itself -- while factorizations of non-primes consist of at least two primes. To mathematicians prime numbers are what chemical elements are to chemists -- a kind of basic building blocks.
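
Finding the factorization by simple trial division may in C look like this (a sketch; note that the divisor printed is always a prime, because all smaller prime factors have already been divided out by the time it divides x):

#include <stdio.h>

void printFactorization(unsigned int x)
{
  unsigned int divisor = 2;

  while (x > 1)
    if (x % divisor == 0)
    {
      printf("%u ",divisor); // divisor is necessarily a prime here
      x /= divisor;
    }
    else
      divisor++;

  putchar('\n');
}

int main(void)
{
  printFactorization(75); // prints 3 5 5

  return 0;
}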

Why is 1 not a prime? Out of convenience -- if 1 was a prime, the fundamental theorem of arithmetic would not hold because 75's factorization could be 3 * 5 * 5 but also 1 * 3 * 5 * 5, 1 * 1 * 3 * 5 * 5 etc.

The unique factorization can also nicely be used to encode multisets as numbers. We can assign each prime number its sequential number (2 is 0, 3 is 1, 5 is 2, 7 is 3 etc.), then any number encodes a multiset of numbers (i.e. recording their presence and count but not order) in its factorization. E.g. 75 = 3 * 5 * 5 encodes the multiset {1, 2, 2}. This can be exploited in cool ways in some ciphers etc.
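
A quick C sketch of such encoding follows (function names are made up; decoding is just the factorization shown above with primes translated back to their sequential numbers):

// returns the Nth prime, counting from 0 (0 => 2, 1 => 3, ...):
unsigned int nthPrime(unsigned int n)
{
  for (unsigned int candidate = 2; ; ++candidate)
  {
    int prime = 1;

    for (unsigned int i = 2; i * i <= candidate; ++i)
      if (candidate % i == 0)
      {
        prime = 0;
        break;
      }

    if (prime)
    {
      if (n == 0)
        return candidate;

      n--;
    }
  }
}

// encodes a multiset of numbers as a single number:
unsigned int encodeMultiset(const unsigned int *values, unsigned int count)
{
  unsigned int result = 1;

  for (unsigned int i = 0; i < count; ++i)
    result *= nthPrime(values[i]); // multiply in the prime for each value

  return result; // e.g. {1, 2, 2} gives 3 * 5 * 5 = 75
}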

When in 1974 the Arecibo radio message was sent to space to carry a message for aliens, the resolution of the bitmap image it carried was chosen to be 73 x 23 pixels -- two primes. This was cleverly done so that when aliens receive the 1679 sequential values, there are only two possible ways to interpret them as a 2D bitmap image: 23 x 73 (incorrect) and 73 x 23 (correct). This increased the probability of correct interpretation against the case of sending an arbitrary resolution image.

There are infinitely many prime numbers. The proof is pretty simple (shown below), however it's pretty interesting that it has still not been proven whether there are infinitely many twin primes (primes that differ by 2), that seems to be an extremely difficult question.

Euclid's proof shows there are infinitely many primes, it is done by contradiction and goes as follows: suppose there are finitely many primes p1, p2, ... pn. Now let's consider a number s = p1 * p2 * ... * pn + 1. This means s - 1 is divisible by each prime p1, p2, ... pn, but s itself is not divisible by any of them (as it is just 1 greater than s - 1 and multiples of some number q greater than 1 have to be spaced by q, i.e. more than 1). If s isn't divisible by any of the considered primes, it itself has to be a prime (as every number greater than 1 is divisible by at least one prime). However that is in contradiction with the original assumption that p1, p2, ... pn are all existing primes. Therefore a finite list of primes cannot exist, there have to be infinitely many of them.

Distribution and occurrence of primes: the occurrence of primes seems kind of """random""" (kind of like digits of decimal representation of pi), without a simple pattern, however hints of patterns appear such as the Ulam spiral -- if we plot natural numbers in a square spiral and mark the primes, we can visually distinguish dimly appearing 45 degree diagonals as well as horizontal and vertical lines. Furthermore the density of primes decreases the further away we go from 0. The prime number theorem states that a number randomly chosen between 0 and N (for large N) has approximately 1/log(N) probability of being a prime (log here being the natural logarithm). The prime counting function is a function which for N tells the number of primes smaller or equal to N. While there are 25 primes under 100 (25%), there are 9592 under 100000 (~9.5%) and only 50847534 under 1000000000 (~5%).

         ,   ,     ,    ',  '    ,'  , ,  '
      ',',          '      ,  ' ',  ' ',
,,   ,   ,'     '     ',   ,'      ,'  ,      ' '
 ,     ,    ', ,  '  ,' ', ,'             '  ,', ,
      '     ',',     ,   ,          ',    ',
,    , ,               ,'  ,' '   ',   , ,      '
     ,',   ,',  '  , ,     ,' '  ,  '        ,
, ',  ' ',     ,   ,'      ,    ',  '     ' '    ,
  ',', , ,'   '  ,     ,'   ' '  ,',' ',     , ,
            '     '   ' ', ,     ,   ,  ', ,  ', ,
      '   '   '   ' '   ', ,   ,'   ' '         ',
  ' ',  ', , , ,', ,',',   ,' ' '  ,  ' ' '  , ,
     ,             ,   ,',;,', , ,   , ,',     ,
  ' '       '       ',' ',
,,' '  ,  '  ,' ',  ','   ',  ' ', ,'  ,' '   ',
     , ,         , ,  '  ,   ,   ,'      , ,  '
      ' ',    ',  '  ,   , ,  '       '     '
,,       , , , ,    ' '  ,',    ',  '  , ,'  ,','
 ,     ,   ,',  '  ,     ,  ',',           ,     ,
  ',    ' '       ', ,   , ,'       ',    ',
,   ',       , ,'   ',    '     '      ,  ' '
 ,    '   '       '           '   ',    ',   ,
   ,  '   ' '   '    ,'    ,   , ,'             ',
 ,'     ',     ,   ,',   ,  '    ,     ,    '   '
     ,            ',     ,   , ,     ,   ,'   ',

Ulam spiral: the center of the image is the number 1, the number line continues counter clockwise, each point represents a prime.

Here are prime numbers under 1000: 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97, 101, 103, 107, 109, 113, 127, 131, 137, 139, 149, 151, 157, 163, 167, 173, 179, 181, 191, 193, 197, 199, 211, 223, 227, 229, 233, 239, 241, 251, 257, 263, 269, 271, 277, 281, 283, 293, 307, 311, 313, 317, 331, 337, 347, 349, 353, 359, 367, 373, 379, 383, 389, 397, 401, 409, 419, 421, 431, 433, 439, 443, 449, 457, 461, 463, 467, 479, 487, 491, 499, 503, 509, 521, 523, 541, 547, 557, 563, 569, 571, 577, 587, 593, 599, 601, 607, 613, 617, 619, 631, 641, 643, 647, 653, 659, 661, 673, 677, 683, 691, 701, 709, 719, 727, 733, 739, 743, 751, 757, 761, 769, 773, 787, 797, 809, 811, 821, 823, 827, 829, 839, 853, 857, 859, 863, 877, 881, 883, 887, 907, 911, 919, 929, 937, 941, 947, 953, 967, 971, 977, 983, 991, 997.

                                      ______/
                                     /     /
                        _____ ______/_    /
                 ____  /     X     /__\  /
          ___   /    \/__   / \   /__  \/
         /   \ /     /\  \ /   \ /   \ /\
--2--3--/--5--X--7--/--\--X--11-X--13-X--\--
   \__\/ \  \/ \__\/ \  \/ \__\/ \  \/ \__\/
       \__\_/\  \ /\  \ /\__\_/\  \ /\  \ /\
           \__\__X  \  X  \  X  \  X__\__X
               \__\__\/ \  \/ \  \/ \  \/ \
                   \__\__\_/\  \ /\  \ /\  \

There also exists a term pseudoprime -- it stands for a number which is not actually a prime but appears so because it passes some quick primality tests.

Algorithms

Primality test: testing whether a number is a prime is quite easy and not computationally difficult (unlike factoring the number). A naive algorithm is called trial division and it tests whether any number from 2 up to the tested number divides the tested number (if so, then the number is not a prime, otherwise it is). This can be optimized by only testing numbers up to the square root (including) of the tested number (if there is a factor greater than the square root, there is also another smaller than it which would already have been tested). A further simple optimization is to test division by 2, 3 and then only numbers of the form 6q +- 1 (other forms are divisible by either 2 or 3, e.g. 6q + 4 is always divisible by 2). Further optimizations exist and for maximum speed a look up table may be used for smaller primes. A simple C function for primality test may look e.g. like this:

int isPrime(int n)
{
  if (n < 4)
    return n > 1;
    
  if (n % 2 == 0 || n % 3 == 0)
    return 0;
    
  int test = 6;
  
  while (test <= n / 2) // replace n / 2 by sqrt(n) if available
  {
    if (n % (test + 1) == 0 || n % (test - 1) == 0)
      return 0;
      
    test += 6; // move on to the next 6q center
  }
  
  return 1;
}

Sieve of Eratosthenes is a simple algorithm to find prime numbers up to a certain bound N. The idea is the following: create a list of numbers up to N and then iteratively mark multiples of whole numbers as non-primes. At the end all remaining (non-marked) numbers are primes. If we need to find all primes under N, this algorithm is more efficient than testing each number under N for primality separately (we're making use of a kind of dynamic programming approach).
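A simple C implementation of the sieve may look e.g. like this (just a sketch, the bound N and the output format are of course arbitrary):

#include <stdio.h>

#define N 100 // find all primes under N

char nonPrime[N]; // nonPrime[i] being nonzero marks i as a non-prime

int main(void)
{
  for (int i = 2; i < N; ++i)
    if (!nonPrime[i]) // i wasn't marked, so it's a prime
      for (int j = 2 * i; j < N; j += i)
        nonPrime[j] = 1; // mark all multiples of i as non-primes

  for (int i = 2; i < N; ++i) // print what remained unmarked
    if (!nonPrime[i])
      printf("%d ",i);

  putchar('\n');
  return 0;
}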

Prime factorization: We can factor a number by repeatedly brute force checking its divisibility by individual primes and there exist many algorithms applying various optimizations (wheel factorization, Dixon's factorization, ...), however for factoring large (hundreds of bits) numbers, e.g. products of two large primes, there exists no known efficient algorithm, i.e. one that would run in polynomial time, and it is believed no such algorithm exists (see P vs NP). Many cryptographic algorithms, e.g. RSA, rely on factorization being inefficient. For quantum computers a polynomial ("fast") algorithm exists, it's called Shor's algorithm.
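A naive trial division factorization in C may look e.g. like this (a sketch that's fine for small numbers, obviously not for the huge numbers used in cryptography):

#include <stdio.h>

// prints the prime factorization of n, e.g. for 75 prints: 3 5 5
void factorize(unsigned int n)
{
  unsigned int d = 2;

  while (d * d <= n) // only need to try divisors up to sqrt(n)
  {
    while (n % d == 0) // divide out d as many times as possible
    {
      printf("%u ",d);
      n /= d;
    }

    d++;
  }

  if (n > 1) // what remains is the last (largest) prime factor
    printf("%u",n);

  putchar('\n');
}

int main(void)
{
  factorize(75); // prints: 3 5 5
  return 0;
}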

Prime generation: TODO

See Also


primitive_3d

Primitive 3D

See pseudo 3D.


privacy

Privacy

Digital privacy is the ability and freedom of an individual to hide "sensitive" information about himself; nowadays "privacy concerns" are a big part of capitalist fear culture. Of course, there are other forms of privacy than digital, for example the physical privacy in real life, however in this article we'll be implicitly dealing with digital privacy unless mentioned otherwise, i.e. privacy with respect to computers, e.g. on the Internet.

{ I have my personal data publicly online and under CC0 for anyone to download and do anything with, including my real name, date of birth, location where I live, medical info and even nude photos. Literally nothing bad ever happened due to this. ~drummyfish }

Society is becoming more and more obsessed with privacy and that is EXTREMELY BAD. It leads to hardcore censorship, people are hiding their email addresses so it's impossible to contact them, photos of children's faces are wiped from the Internet, more and more videos on the internet now just blur everything in the video that's not its main focus, "just in case", people are even afraid to credit other people by name even if they are e.g. legally obliged to by a license such as CC-BY-SA (lmao https://forum.freegamedev.net/viewtopic.php?f=7&t=19322). Such retardedness has probably never been seen before.

As of 2023 privacy is impossible to achieve unless you live in wilderness completely independently of the main "civilization". If you use any kind of computer (laptop, TV, phone, car, camera etc.), you are already being watched: basically all CPUs have proven hardware spyware in them capable of bypassing encryption, see Intel ME etc., no matter what operating system you use, and even if you use some obscure CPU without it, you are watched through your Internet activity (even if you use a "secure" browser, which you most likely don't even if you think you do), your browsing habits are watched and analyzed by highly advanced AI that can track you even without cookies etc., e.g. just from your writing style, patterns of repeated daily activity, mouse movement signature etc. -- all small fragments of information about your activity such as those mentioned above and your locations over time (known from your phone connecting to towers, someone else's phone detecting your voice, street or car camera detecting your face, credit card payments etc.) are connected with other fragments of information (even those of other people) and AI makes a complete picture of your life available to those who need it. You may think you're doing everything right and that they can't find you, but it's enough if e.g. someone from your family posted a picture with you on facebook 10 years ago or if you as a child played online games -- this is enough to know which people you are related to and them being tracked then leads to you also being tracked to a big degree despite you using 7 proxies and living underground. If the government furthermore decides to watch you more (which may happen just because you e.g. try to "protect" your privacy more and start using Tor, which is suspicious), they can just watch you in real time through satellites (even inside buildings) and so on. So you just have to accept you are being watched, and unless we end capitalism, it will only be getting worse (mind reading technology is already emerging).

We have to state that privacy concerns are a symptom of bad society. We shouldn't ultimately try to protect privacy more (cure symptoms) but rather make a society where need for privacy isn't an issue (cure the root cause). This sentiment is shared by many hackers, even Richard Stallman himself used to revolt against passwords when he was at MIT AI Labs; he intentionally used just the password "rms" to allow other people to use his account (this is mentioned in the book Free As In Freedom). Effort towards increasing and protecting privacy is in its essence unnecessary bullshit effort wasting human work, similarly to law, marketing etc. It is all about censorship and secrecy. Besides this, all effort towards protecting digital privacy will eventually fail, thanks to e.g. advanced AI that will identify individuals by patterns in their behavior, even if their explicit identity information is hidden perfectly. Things such as browser fingerprinting are already a standard and simple practice allowing highly successful uncovering of identity of anonymous people online, and research AI is taking this to the next level (e.g. the paper Detecting Individual Decision-Making Style: Exploring Behavioral Stylometry in Chess shows revealing chess players by their play style). With internet of stinks, cameras, microphones and smartphones everywhere, advanced AI will be able to identify and track an individual basically anywhere no matter the privacy precautions taken. Curing the root cause is the only option to prevent a catastrophic scenario.

By this viewpoint, LRS's stance towards privacy differs from that of many (if not most) free software, hacker and suckless communities: to us privacy is a form of censorship and as such is seen as inherently bad. We dream of a world without abuse where (digital) privacy is not needed because society has adopted our philosophy of information freedom, non-violence and non-competition and there is no threat of sensitive information abuse. Unlike some other people (so called pragmatics), not only do we dream of it, we actively try to make it a reality. Even though we know the ideally working society is unreachable, we try to at least get close to it by restricting ourselves to bare minimum privacy (so we are very open but won't e.g. publish our passwords). We believe that abuse of sensitive information is an issue of the basic principles of our society (e.g. capitalism) and should be addressed by fixing these issues rather than by harmful methods such as censorship.

Digital privacy can be further categorized. We can talk e.g. about communication privacy (emails, chat, ...), data privacy (cookies, tracking, medical data, ...), personal privacy (intimate photos, sexual orientation, ... ), individual privacy (identifying information, anonymity, spam, ...) etc.

Privacy is closely related to cryptography, as encryption is how information can be "protected" against reaching unauthorized entities, and to free software, as using safe tools with available source code is crucial to avoid malware. Still, to achieve high privacy additional appropriate behavior has to be adopted, e.g. protection against spyware, using proxies and/or onion routing, turning off browser cookies, avoiding fingerprinting, avoiding social networks, avoiding revealing potentially identifying information etc.


procgen

Procedural Generation

Procedural generation (procgen) refers to creation of data, such as art assets in games or test data for data processing software, by using algorithms and mathematical formulas rather than creating the data manually or measuring it in the real world (e.g. by taking photographs). This can be used for example for automatic generation of textures, texts, music, game levels or 3D models but also practically anything else, e.g. test databases, animations or even computer programs. Procedural art currently doesn't reach the quality and creativity of a skilled human artist, but it can be good enough or even necessary (e.g. for creating extremely large worlds), it may be preferred e.g. for its extreme saving of storage memory, it can help add detail to human work, be a good filler, a substitute, an addition to or a basis for manually created art. Procedural generation has many advantages such as saving space (instead of large data we only store the small code of the algorithm that generates it), saving time (once we have an algorithm we can generate a lot of data extremely quickly), increasing resolution practically to infinity or extending data to more dimensions (e.g. 3D textures). Procedural generation can also be used as a helper and guidance, e.g. an artist may use a procedurally generated game level as a starting point and fine tune it manually, or vice versa, a procedural algorithm may create a level by algorithmically assembling manually created building blocks.

As neural AI approaches human level of creativity, we may see computers actually replacing many artists in the near future, however it is debatable whether AI generated content should be called procedural generation as AI models are quite different from the traditional hand-made algorithms -- AI art is still seen as a separate approach from procedural generation. For this reason we'll only be considering the traditional approach from now on.

Minecraft (or Minetest) is a popular example of a game in which the world is generated procedurally, which allows it to have near-infinite worlds -- the size of such a world is in practice limited only by ranges of data types rather than available storage memory. Roguelikes also heavily utilize procgen. However this is nothing new, the old game Daggerfall was known for its extremely vast procedurally generated world. Some amount of procedural generation can be seen probably in most mainstream games, e.g. clouds, vegetation or NPCs are often made procedurally.

For its extreme saving of space procedural generation is extremely popular in the demoscene where programmers try to create as small programs as possible. German programmers made a full fledged 3D shooter called .kkrieger that fits into just 96 kB! This was thanks to heavy use of procedural generation for the whole game content. Bytebeat is a simple method of generating procedural "8bit" music, it is used e.g. in Anarch. Procedural generation is generally popular in indie game dev thanks to offering a way of generating huge amounts of content quickly and without having to pay artists.

We may see procgen as being similar to compression algorithms: we have large data and are looking for an algorithm that's much smaller while being able to reproduce the data (but here we normally go the other way around, we start with the algorithm and see what data it produces rather than searching for an algorithm that produces given data). John Carmack himself called procgen "basically a shitty compression".

Using fractals (e.g. those in the form of an L-system) is a popular technique in procgen because fractals fit the definition basically perfectly: a fractal is defined by a simple equation or a set of a few rules that yield an infinitely complex shape. Nature is also full of fractals such as clouds, mountains or trees, so fractals look organic.
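For illustration, here is a tiny C implementation of one of the simplest L-systems, Lindenmayer's "algae" system with the two rewriting rules A -> AB and B -> A (just a sketch; as a fun bonus, the lengths of the generated strings happen to be Fibonacci numbers):

#include <stdio.h>
#include <string.h>

#define MAX_LEN 256

// performs one rewriting step: every A becomes AB, every B becomes A
void step(char *s)
{
  char result[MAX_LEN] = "";

  for (int i = 0; s[i] != 0; ++i)
    strcat(result,s[i] == 'A' ? "AB" : "A");

  strcpy(s,result);
}

int main(void)
{
  char s[MAX_LEN] = "A"; // the axiom (starting string)

  for (int i = 0; i < 6; ++i) // prints: A, AB, ABA, ABAAB, ...
  {
    printf("%s\n",s);
    step(s);
  }

  return 0;
}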

There are also other techniques such as wave function collapse which is used especially in tile map generation. Here we basically have some constraints set (such as which tiles can be neighbors) and then consider the initial map a superposition of all possible maps that satisfy these constraints -- we then set a random tile (chosen from those with lowest entropy, i.e. fewest possible options) to a random specific value and propagate the consequences of it to other tiles causing a cascading effect of collapsing the whole map into one of the possible solutions.
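The following is a heavily simplified toy sketch of the idea in C, collapsing a one dimensional "map" of sea, beach and land tiles with the single constraint that sea can't directly touch land (real implementations work in 2D with many tiles and constraints, but the principle is the same):

#include <stdio.h>
#include <stdlib.h>

#define CELLS 40
#define TILES 3 // 0: sea (~), 1: beach (.), 2: land (#)

// compatible[t] is a bitmask of tiles allowed to neighbor tile t:
// sea can't directly touch land, everything else is allowed
const int compatible[TILES] = { 0x03, 0x07, 0x06 };

int cell[CELLS]; // bitmask of still possible tiles in each cell

int countBits(int x) { return (x & 1) + ((x >> 1) & 1) + ((x >> 2) & 1); }

// restricts cell i by its neighbor cell j, returns 1 if anything changed
int restrictCell(int i, int j)
{
  int allowed = 0;

  for (int t = 0; t < TILES; ++t)
    if (cell[j] & (1 << t))
      allowed |= compatible[t];

  int newMask = cell[i] & allowed,
      changed = newMask != cell[i];

  cell[i] = newMask;
  return changed;
}

int main(void)
{
  srand(123); // change the seed to get a different map

  for (int i = 0; i < CELLS; ++i)
    cell[i] = (1 << TILES) - 1; // start in superposition of all tiles

  while (1)
  {
    int best = -1;

    for (int i = 0; i < CELLS; ++i) // lowest entropy uncollapsed cell
      if (countBits(cell[i]) > 1 &&
        (best < 0 || countBits(cell[i]) < countBits(cell[best])))
        best = i;

    if (best < 0)
      break; // all cells collapsed, we have a map

    int options[TILES], count = 0;

    for (int t = 0; t < TILES; ++t)
      if (cell[best] & (1 << t))
        options[count++] = t;

    cell[best] = 1 << options[rand() % count]; // collapse to random option

    for (int i = best; i > 0; --i) // propagate the consequences left
      if (!restrictCell(i - 1,i))
        break;

    for (int i = best; i < CELLS - 1; ++i) // and right
      if (!restrictCell(i + 1,i))
        break;
  }

  for (int i = 0; i < CELLS; ++i) // print the result
  {
    int t = 0;

    while (!(cell[i] & (1 << t)))
      t++;

    putchar("~.#"[t]);
  }

  putchar('\n');
  return 0;
}

Note that in general wave function collapse may run into contradictions (a cell left with no possible tile), which implementations handle by backtracking or restarting; this tiny tileset happens to be unable to produce one.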

A good example to think of is generating procedural textures. This is generally done by first generating a base image or multiple images, e.g. with noise functions such as Perlin noise (it gives us a grayscale image that looks a bit like clouds). We then further process this base image (or images) and combine the results in various ways, for example we may use different transformations, modulations, blending, adding color using color ramps etc. The whole texture is therefore described by a graph in which nodes represent the operations we apply; this can literally be done visually in software like Blender (see its shader editor). The nice thing is that we can now for example generalize the texture to 3 dimensions, i.e. not only have a flat image, but have a whole volume of a texture that can extremely easily be mapped to 3D objects simply by intersecting it with their surfaces which will yield a completely smooth texturing without any seams; this is quite often used along with raytracing -- we can texture an object by simply taking the coordinates of the ray hit as the 3D texture coordinates, it's that simple. Or we can animate a 2D texture by doing a moving cross section of 3D texture. We can also write the algorithm so that the generated texture has no seams if repeated side-by-side (by using modular "wrap-around" coordinates). We can also generate the texture at any arbitrary resolution as we have a continuous mathematical description of it; we may perform an infinite zoom into it if we want. As if that's not enough, we can also generate almost infinitely many slightly different versions of this texture by simply changing the seed of the pseudorandom generator we use.
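As a small demonstration of the principle, the following C program generates and prints a simple procedural texture using value noise, a primitive relative of Perlin noise (pseudorandom values on a sparse grid, bilinearly interpolated); the function names, constants and the ASCII output are of course just choices made for this sketch -- note e.g. how changing the seed gives a new, slightly different texture:

#include <stdio.h>

// deterministic pseudorandom value in <0,255> for integer grid coords
unsigned char noiseAt(int x, int y, unsigned int seed)
{
  unsigned int n = x * 374761393u + y * 668265263u + seed * 951274213u;

  n = (n ^ (n >> 13)) * 1274126177u;
  return (n ^ (n >> 16)) & 0xff;
}

// value noise: bilinearly interpolate the grid values, cell size 8 pixels
unsigned char valueNoise(int x, int y, unsigned int seed)
{
  int cx = x / 8, cy = y / 8, // grid cell
      fx = x % 8, fy = y % 8; // position within the cell

  int a = noiseAt(cx,cy,seed),     b = noiseAt(cx + 1,cy,seed),
      c = noiseAt(cx,cy + 1,seed), d = noiseAt(cx + 1,cy + 1,seed);

  int top    = a + ((b - a) * fx) / 8,
      bottom = c + ((d - c) * fx) / 8;

  return top + ((bottom - top) * fy) / 8;
}

int main(void)
{
  for (int y = 0; y < 24; ++y) // print the texture as ASCII shades
  {
    for (int x = 0; x < 48; ++x)
      putchar(" .:+#"[valueNoise(x,y,123) / 52]);

    putchar('\n');
  }

  return 0;
}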

We use procedural generation mainly in two ways:

  1. offline (before run time): we pre-generate the data, store it in files and ship it with our program just like any manually created asset,
  2. realtime (at run time): the program itself contains the generating algorithm and creates the data on the fly as it's needed, which saves storage and allows e.g. practically infinite worlds.

Indeed we may also do something "in between", e.g. generate procedural assets into temporary files or RAM caches at run time depending on the situation, for example when purely realtime generation of such assets would be too slow.

Example

TODO


productivity_cult

Productivity Cult

"PRODUCE PRODUCE PRODUCE PRODUCE PRODUCE" --capitalism

Productivity cult is one of modern capitalist religions which praises human productivity above everything, even happiness, well being, sanity etc. Kids nowadays are all about "how to be more motivated and productive", they make daily checklists, analyze tables of their weekly performance, count how much time they spend taking a shit on the toilet, give up sleep to study some useless bullshit required by the current market fluctuation. Productivity cult is all about voluntarily making oneself a robot, a slave to the system that worships capital. It sometimes hides behind other terms such as self improvement, personal growth etc.

A human is a living being, not a machine, he should live a happy relaxed life, dedicated to spending time with his close ones, raising children, enjoying the beauties of nature, exploring secrets of the universe, without stress; he should create when inspiration or a truly great necessity comes to him and when he does, he should take his time to carefully make the art great, without rushing it or forcing it. Productivity cult goes all against this, it proclaims one should be constantly spitting out "something", torturing and forcing himself, maximizing quantity to the detriment of quality, undergoing constant stress while suppressing rest -- that one should all the time be preoccupied with competitive fight, deadlines, that art he creates is something that can be planned on schedule, made on deadline and assigned a calculated price tag to be an ideal consumerist product. If such a stance towards life doesn't make you wanna puke, you most likely lack a soul.

Do not produce. Create. Art takes time and can't be scheduled.

The name of the cult itself says a lot about it. While a name such as efficiency would probably be better, as efficiency means doing less work with the same result and therefore having more free time, it is not a surprise that capitalism has chosen the word productivity, i.e. producing more which means working more, e.g. for the price of free time and mental health.

Productivity obsessed people are mostly idiots without the ability to think for themselves, they have desktops with "motivational" wallpapers saying shit like "the word impossible doesn't exist in my dictionary" and when you tell them if it wouldn't be better to rather establish a society where people wouldn't have to work they start screeching "HAHAA THATS IMPOSSIBLE IT CANT WORK". Productivity maximalists bully people for taking rest or doing what they otherwise enjoy in moderation -- they invent words such as "procrastination" to create a feeling of ever present guilt induced by doing what one truly enjoys.

Productivity freaks are often the ones who despise consumers, i.e. brainless zombies that consume goods, but somehow don't seem to mind being producers, a similar kind of brainless zombies that just stands on the other end of this retarded system.

One of the funniest examples of productivity cult gone too far is so called "life coaching" in which the aspiring producer robots hire bullshit cult leaders, so called "life coaches", to shout at them to be more productive. At least in the past slaves were aware of being slaves and tried to free themselves. I literally want to kill myself.

Productivity is such a big deal because programmers are in fact getting exponentially less productive due to the time that nowadays has to be spent on bullshit, on overcomplicated buggy bloat and billions of frameworks needed to get basic things done -- this has been pointed out by Jonathan Blow in his talk Preventing the Collapse of Civilization in which he refers to the video of Ken Thompson talking about how he developed the Unix operating system in three weeks.

A considerable number of people are attracted to suckless software due to its positive effects on productivity thanks to the elimination of bullshit. These are mostly the kind of above mentioned dumbasses who just try to exploit anything they encounter for self interest without ever aiming for greater good, they don't care about Unix philosophy beyond its effects on increasing their salary. Beware of them, they poison society.

See Also


programming_language

Programming Language

Programming language is an artificial language created so that humans can relatively easily communicate algorithms to computers. Such a language often tries to mimic human language (practically always English) but is much MUCH simpler, so that a computer can actually analyze it and understand it precisely; for this reason it also partially looks like math expressions. A programming language can be seen as a middle ground between pure machine code (very hard to handle by humans) and natural language (very hard to handle by computers).

For beginners: a programming language is actually much easier to learn than a foreign language, it will typically have fewer than 100 "words" to learn (out of which you'll mostly use like 10) and once you know one programming language, learning another becomes a breeze because they're all (usually) pretty similar in basic concepts. The hard part may be learning some of the concepts.

A programming language is distinct from a general computer language by its purpose to express algorithms and be used for creation of programs. There are computer languages that are NOT programming languages (at least in the narrower sense), such as HTML, JSON and so on.

We divide programming languages into different groups. Perhaps the most common division is into two groups:

  1. compiled languages (such as C), whose source code is translated by a compiler ahead of time into a native executable that the computer then runs directly,
  2. interpreted languages (such as Python), whose source code is read and executed on the go by a program called an interpreter.

Sometimes the distinction here may not be completely clear, for example Python is normally considered an interpreted language but it can also be compiled into bytecode and even native code. Java is considered more of a compiled language but it doesn't compile to native code (it compiles to bytecode). C is traditionally a compiled language but there also exist C interpreters etc.

We can divide languages in many more ways, for example based on their paradigm (imperative, declarative, object-oriented, functional, logical, ...), purpose (general purpose, special purpose), computational power (Turing complete or weaker), level of abstraction (high, low), typing (strong, weak, dynamic, static) or function evaluation (strict, lazy).

Nice Languages

{ THIS IS NOT A COMPREHENSIVE LIST, I can only include languages that I am familiar with, please add more ~drummyfish }

See Also


programming

Programming

Not to be confused with coding.

Programming is the act, science and art of writing computer programs; it involves creation of algorithms and data structures and implementing them in programming languages.

You may also encounter the term coding which is used by noob wannabe programmers, so called "coders" or code monkeys. "Coding" doesn't reach the quality of programming, it is done in baby handholding languages like Python, JavaScript or Rust by people with very shallow knowledge of technology and its context, barely qualified to turn on a computer (like jewtubers), who have flooded the computer industry since it became lucrative. What they do is not real programming. Do not try to imitate them.

At high level programming becomes spiritual. Check out e.g. the famous Tao of Programming (yes, it's kind of a joke but it's based on reality, programming can truly be kind of a meditation and pursuit of enlightenment). Many people say that learning programming opens your eyes in a certain new way, you then see the world like never before (but that's probably kind of true of almost all skills so this may be a shit statement). Others say too much programming cripples you mentally and gives you autism. Anyway it's fun. Programming requires a good knowledge of advanced math. Also probably at least above average IQ, as well as below average social intelligence. Being a man is an advantage.

How To Learn Programming And Do It Well

See also programming tips.

The key thing to becoming a programmer is learning a programming language very well (and learning many of them), however this is not enough (it's only enough for becoming a coding monkey), you additionally have to have a wider knowledge such as general knowledge of computers (electronics, hardware, theory of computation, networks, ...), tech history and culture (free software, hacker culture, free culture, ...), math and science in general, possibly even society, philosophy etc. Programming is not an isolated topic (only coding is), a programmer has to see the big picture and have a number of other big brain interests such as chess, voting systems, linguistics, physics, music etc. Remember, becoming a good programmer takes a whole life, sometimes even longer.

Can you become a good programmer when you're old? Well, as with everything, to become a SERIOUSLY good programmer you should have probably started before the age of 20, the majority of the legend programmers started before 10, it's just like with sports or becoming an excellent musician. But with enough enthusiasm and endurance you can become a pretty good programmer at any age, just like you can learn to play an instrument or run a marathon basically at any age, it will just take longer and a lot of energy. You don't even have to aim to become very good, becoming just average is enough to write simple gaymes and have a bit of fun in life :) Just don't try to learn programming because it seems cool, because you want to look like a movie haxor, gain followers on youtube or because you need a job -- if you're not having genuine fun just thinking before sleep about how to swap two variables without using a temporary variable, programming is probably not for you.
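By the way, one classic solution to that little puzzle is the so called XOR swap, shown here in C just for illustration (it works on integers; beware it fails if both operands are the very same variable):

#include <stdio.h>

int main(void)
{
  int a = 3, b = 7;

  a ^= b; // a now holds a XOR b
  b ^= a; // b becomes the original a
  a ^= b; // a becomes the original b

  printf("%d %d\n",a,b); // prints: 7 3
  return 0;
}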

Which programming language to start with? This is the big question. Though languages such as Python or JavaScript are objectively really REALLY bad, they are nowadays possibly the easiest way to get into programming, so you may want to just pick one of these two, knowing you'll abandon it later to learn a true language such as C (and knowing the bad language will still serve you in the future in some ways, it's not a wasted time). Can you start with C right away? It's probably not impossible for a genius but it will be VERY hard and you'll most likely end up failing, overwhelmed, frustrated and never returning to programming again. Absolutely do NOT even consider C# (shit, unusable), Java (shit, slow, bloated, unusable), C++ (like C but shit and more complicated), Haskell (non-traditional, hard), Rust (shit, bad design, unusable), Go (prolly hard), Lisp (non-traditional), Prolog (lol) and similar languages -- you may explore these later. Whichever language you pick for the love of god avoid OOP -- no matter what anyone tells you, when you see a tutorial that uses "classes"/"objects" just move on, learn normal imperative programming. OOP is a huge pile of shit meme that you will learn anyway later (because everyone writes it nowadays) so that you see why it's shit and why you shouldn't use it.

{ I really started programming in Pascal at school, it was actually a good language as it worked very similarly to C and the transition later wasn't that hard, but nowadays learning Pascal doesn't make much sense anymore. ~drummyfish }

Games are an ideal start project because they're fun (having fun makes learning much faster and enjoyable), there are many noob tutorials all over the Internet etc. However keep in mind to start EXTREMELY simple -- this can't be stressed enough, most people are very impatient and eager and start making an RPG game or networking library without really knowing a programming language -- this is a GUARANTEED spectacular failure. At the beginning think in terms of "snake" and "minesweeper". Your very first project shouldn't even use any GUI, it should be a purely command-line text program, so a text-only tiny interactive story in Python is possibly the absolutely best choice as a first project. Once you're more comfortable you may consider starting to use graphics, e.g. Python + Pygame, but still KEEP IT SIMPLE, make a flappy bird clone or something. As you progress, consider perhaps buying a simple toy computer such as an open console -- these toys are closer to old computers that had no operating systems etc., they e.g. let you interact directly with hardware and teach you a LOT about good programming by teaching you how computers actually work under the hood. One day you will have to make the big step and learn C, the best and most important language as of yet, but be sure to only start learning it when you're at least intermediate in your start language (see our C tutorial). To learn C we recommend our SAF library which will save you all headaches of complex APIs and your games will be nice and compatible with your small toy computers.

TODO


programming_style

Programming Style

Here we discuss a good programming style (formatting, conventions etc.). Remember that nothing is set in stone, the most important thing is to be consistent and actually think about why you're doing things the way you're doing them. Think from the point of view of a programmer who gets just your source code without any way to communicate with you, make his life as easy as possible.

Recommended C Programming Style

This is our recommendation or perhaps just a suggestion/guide on the C programming style.

if (a == b)
{
  doSomething();
  doSomethingElse();
}

int a = x;
char b = y;

c += 3 * a;
d -= b;

if (c < d)
  a = b;

programming_tips

Programming Tips

This is a place for sharing some practical programming tips.


proprietary

Proprietary

The word proprietary (related to the word property) is used for intellectual works (such as texts, songs, computer programs, ...) that are someone's fully owned "intellectual property" (by means of copyright, patents, trademarks etc.), i.e. those that are not free as in freedom because they cannot be freely copied, shared, modified, studied etc. This word has a negative connotation because proprietary works serve capitalist overlords, are used to abuse others and go against freedom. The opposite of proprietary is free (as in freedom, NOT price) (also libre): free works are either those that are completely public domain or technically owned by someone but coming with a free (as in freedom) license that voluntarily waives all the harmful legal rights of the owner. There are two main kinds of proprietary works (and their free counterparts): proprietary software (as software was the first area where these issues arose) (versus free software) and proprietary art of other kinds (music, pictures, data, ...) (versus free cultural art).

As said, proprietary software is any software that is not free (as in freedom)/open source software. Such software denies users and creators their basic freedoms (freedom of unlimited use, studying, modifying and sharing) and is therefore evil; proprietary software is mostly capitalist software designed to abuse its user in some way. Proprietary code is often secret, not publicly accessible, but there are many programs whose source code is available but which is still proprietary because no one except the "owner" has any legal rights to fixing it, improving it or redistributing it.

Examples of proprietary software are MS Windows, MacOS, Adobe Photoshop and almost every game. Proprietary software is not only extremely harmful to culture, technology and society in general, it is downright dangerous and in some cases life-threatening; see for example cases of medical implants such as pacemakers running secret proprietary code whose creator and maintainer goes bankrupt and can no longer continue to maintain such devices already planted into bodies of people -- such cases have already appeared, see e.g. Autonomic Technologies nervous system implants.

Proprietary software licenses are usually called EULAs.

By extension besides proprietary software there also exist other proprietary works, for example proprietary art or databases -- these are all works that are not free cultural works. Even though for example a proprietary movie probably isn't IMMEDIATELY as dangerous as proprietary software, it may be just as dangerous to society in the long run. Examples of proprietary art are basically anything mainstream that's not older than let's say 50 years: Harry Potter, all Hollywood movies, basically all pop music, basically all AAA video game art and lore etc.


proprietary_software

Proprietary Software

Go here.


pseudo3d

Pseudo 3D

The term pseudo 3D, also 2.5D or primitive 3D, is used for computer graphics that tries to create the illusion of 3D rendering while in fact only utilizing simpler techniques; genuine 3D rendering is in this case called true 3D. On consumer computers it is nowadays mostly a thing of the past as everything including cell phones now has a powerful GPU capable of most advanced 3D rendering, nevertheless for suckless/KISS/LRS programming the techniques used in the past are very interesting and useful.

For example BSP rendering in early games such as Doom is generally called pseudo 3D in the mainstream, however it is pretty debatable what exactly should classify as true 3D and what not because any computer rendering technique will inevitably have some kind of simplification of the true 3D reality of real life. And so the debate of "was Doom really 3D" arises. One side argues that in Doom's BSP rendering it for example wasn't possible to look up and down or have rooms above other rooms, all due to the limitations of the rendering system which this side sees as "not real 3D". However even modern 3D renderers have limitations such as mostly being able to only render models made out of triangles (while reality can have completely smooth shapes) or having a limited resolution of textures. Where to draw the line for "true 3D" is subjective -- we see it as reasonable to say that if it looks 3D, it IS 3D, i.e. we think Doom's graphics WAS really 3D, albeit limited. For this reason we also advise using the term primitive 3D rather than pseudo 3D.

Techniques associated with primitive 3D are for example 2D raycasting, BSP rendering, mode7, parallax scrolling, voxel space terrain rendering or perspective-scaled sprites.

See Also


pseudoleft

Pseudoleft

See left vs right.


pseudominimalism

Pseudominimalism

Pseudominimalism is the kind of technology design which aims to appear minimalist on the outside while being bloated on the inside. Rather than trying to achieve a truly good, minimalist design from the ground up, with all its advantages, pseudominimalism just tries to hide the ugliness of its internals and appeal purely by the looks. A typical example could be a website that has a minimalist look -- a blank background with sans-serif font text and a few nice looking shapes -- which in the background uses dozens of JavaScript frameworks and libraries and requires a high end CPU to even appear responsive. Basically all modern "retro" video games are pseudominimalist in design, they use pixelated graphics but are created in huge frameworks such as Unity or Godot; even projects calling themselves "minimalist", such as many fantasy consoles, are in fact only pseudominimalist, written in extremely high level languages such as JavaScript. Apple is heavily practicing pseudominimalism.

Another example are many modern CLI programs that code monkeys use to impress their YouTube viewers or to feel like matrix haxors. Some people think that anything running in the command line is minimalist, which is less and less true as we progress into the future. A lot of capitalist software adds a CLI interface ex post on top of an already bloated program, often by simply disabling the GUI (but leaving all its dependencies in). An example may be the gomux chat client.

Yet another kind of pseudominimalism appearing among the new generation of pseudoprogrammers is all about writing very few LOC in some hugely bloated language and calling that "minimalism". Something like a Minecraft clone in 100 LOC of Python using only the Python standard library, the catch of course being that Python itself is hugely bloated and its standard library is enormous, therefore they just hide all the complexity out of view. Such effort is of course completely useless and only serves for flexing in front of beginners who can't spot the trick. Even if obvious, it has to be noted that minimalist software cannot be written in a bloated language.


public_domain_computer

Public Domain Computer

Public domain computer is yet nonexistent but planned and highly desired simple ethical computer (in the common meaning of the word) whose specification is completely in the public domain and which is made with completely selfless LRS-aligned goal of being absolutely non-malicious and maximally helpful to everyone. It should be the "people's computer", a simple, suckless, user-respecting hackable computer offering maximum freedom, a computer which anyone can study, improve, manufacture and repair without paying any "intellectual property" fees, a computer which people can buy (well, while money still exist) for extremely low price and use for any purpose without being abused or oppressed.

The project is basically about asking: what if computers were designed to serve us instead of corporations? Imagine a computer that wouldn't stand in your way in whatever you want to do.

In our ideal society, one of the versions of the public domain computer could be the less retarded watch.

Note that the computer has to be 100% from the ground up in the true, safe and worldwide public domain, i.e. not just "FOSS"-licensed, partially open etc. It should be created from scratch, so as to have no external dependencies and released safely to the public domain e.g. with CC0 + patent waivers. Why? In a good society there simply have to exist basic tools that aren't owned by anyone, tools simply available to everyone without any conditions, just as we have hammers, pencils, public domain mathematical formulas etc. -- computing has become an essential part of society and it certainly has to become a universal "human right", there HAS TO exist an ethical alternative to the oppressive capitalist technology so that people aren't forced to accept oppression by their computers simply by lack of an alternative. Creating a public domain computer would have similarly positive effects to those of e.g. universal basic income -- with the simple presence of an ethical option the oppressive technology would have a competition and would have to start to behave a bit -- oppressive capitalist technology nowadays exists possibly largely thanks to the conspiracy of big computer manufacturers that rely on people being de facto obliged to buy one of their expensive, proprietary, spyware littered, non-repairable consumerist computers with secret internals.

The computer can (and should) be very simple. It doesn't -- and shouldn't -- try to be the way capitalist computers are, i.e. it would NOT be a typical computer "just in the public domain", it would be different by basic design philosophy because its goals would completely differ from those of capitalists. It would follow the LRS philosophy and be more similar to the very first personal computers rather than to the "modern" HD/bloated/superfast/fashion computers. Let us realize that even a very simple computer can help tremendously as a great number of tasks people need can actually be handled by pretty primitive computers -- see what communities do e.g. with open consoles.

Even a pretty simple computer without an operating system is able to:

Details

The project wouldn't aim to create a specific single "model" of a computer but rather blueprints that may be easily adjusted and mapped to any specific existing technology -- the goal would be to create an abstract hardware specification as well as basic software for the computer.

Abstract hardware specification means e.g. description on the logic gate level so that the computer isn't dependent on any contemporary and potentially proprietary lower level technology such as CMOS. The project would simply create a big logic circuit of the computer and this description could be compiled/synthesized to a lower level circuit board description. The hardware description could also be parametrized so that certain features could be adjusted -- for example it might be possible to choose the amount of RAM or disable specific CPU instructions to make a simpler, cheaper circuit board.
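For a toy illustration of what a logic gate level description means, here is a C simulation of a 1 bit half adder composed purely of NAND gates (of course the real specification would describe a whole computer this way, but the principle of building everything from a single primitive gate is the same):

#include <stdio.h>

int nand(int a, int b) { return !(a && b); } // the only primitive we use

// 1 bit half adder: sum is XOR (made of 4 NANDs), carry is AND (2 NANDs)
void halfAdder(int a, int b, int *sum, int *carry)
{
  int t = nand(a,b);

  *sum = nand(nand(a,t),nand(b,t));
  *carry = nand(t,t);
}

int main(void)
{
  for (int a = 0; a <= 1; ++a) // print the whole truth table
    for (int b = 0; b <= 1; ++b)
    {
      int s, c;

      halfAdder(a,b,&s,&c);
      printf("%d + %d = carry %d, sum %d\n",a,b,c,s);
    }

  return 0;
}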

The computer would have to be created from the ground up, with every design aspect following the ultimate goal. The project roadmap could look similarly to this one:

  1. Design a simple instruction set architecture (ISA). This isn't that hard.
  2. With current free technology, e.g. C and GNU, create custom tools for designing, simulating and testing logic gate circuits. Also not extremely difficult if we keep it simple.
  3. With these tools design a simple MCU computer based on the above mentioned ISA. This is doable, there are hobbyists that have designed their own 8bit CPUs, a few collaborating people could definitely create a nice MCU if they keep it simple (no caching, no floating point, no GPUs, ...).
  4. Design a simple (e.g. FORTH-like) self-hosted programming language, create its compiler with support for the above mentioned ISA so that it is possible to write software for the computer. This may be preceded or succeeded by adding support for the ISA to an existing language such as C, e.g. by adding a new backend to gcc. Again, pretty doable.
  5. Write basic software for the computer. EZ.
  6. Compile the MCU logic-level description to an actual circuit board, possibly even with proprietary tools if others aren't available -- this may be fixed later.
  7. Manufacture the first physical computer, test it, debug it, improve it, give it to people, ...
  8. Now the main goal has been touched for the first time, however the real fun only begins -- now it is needed to spread the project, keep improving it, write advanced software such as an operating system etc.

See Also


public_domain

Public Domain

If an "intellectual work" (song, book, computer program, ...) is in the public domain (PD), it has no "owner", meaning no one has any exclusive rights (such as copyright or patent) over the work, no one can dictate how and by whom such work can be used and so anyone can basically do anything with such work (anything that's not otherwise illegal of course).

LRS highly supports public domain and recommends programmers and artists put their works in the public domain using waivers such as CC0.

Public domain is the ultimate form of freedom in the creative world. In public domain the creativity of people is not restricted. Anyone can study, remix, share and improve public domain works in any way, without a fear of being legally bullied by someone else.

Public domain is NOT the same thing as free (as in freedom) software, free culture or freeware (gratis, free as in beer) software. The differences are these:

  1. free software and free culture works are mostly NOT public domain: they typically remain copyrighted but come with a license that grants everyone the basic freedoms, and such a license may still impose conditions such as attribution or copyleft, which public domain works carry none of,
  2. freeware is merely gratis (costs no money) but usually remains fully proprietary, giving you no freedom to study, modify or share it.

Which Works Are In The Public Domain?

This is not a trivial question, firstly because the term public domain is not clearly defined: the definition varies by each country's laws, and secondly because it is non-trivial and sometimes very difficult to assess the legal status of a work.

Corporations and capitalism are highly hostile towards public domain and try to destroy it, make it effectively non-existing, as to eliminate "free" works competing with the consumerist creations of the industry. Over many years they have pushed towards creating laws that make it extremely difficult and rare for works to fall into public domain.

Sadly due to these shitty laws most works created in the latest decades are NOT in the public domain because of the copyright cancer: copyright is granted automatically, without any registration or fee, to the author of any shitty artistic creation, and its term lasts mostly for the whole life of the author plus 70 years! In some countries this is life + 100 years. In the US, copyright lasts 95 years from the publication of the work (every January 1st there is so called public domain day celebrating new works entering the US public domain). In some countries it is not even possible to legally waive (give up) one's copyright. And to make matters worse, copyright isn't the only possible restriction of an intellectual work, there are also patents, trademarks, personality rights and other kinds of intellectual property.

Another piece of bad news is that works in a "weak" public domain, i.e. most recent PD works or works that entered PD by some obscure little law, might well stop being PD by introduction of some shitty retroactive law (which has happened). So one may not feel completely safe going wild with utilizing some recent PD works.

We therefore devise the term safe/strong public domain. Under this we include works that are pretty safely PD more or less world-wide, even considering possible changes in laws etc. Let us include these works:

Creative commons has created a public domain mark that helps mark and find works that should be in a world-wide public domain (this is not a waiver though, it is basically only used as a metadata for very old works to be better searchable).

There are a number of places on the internet to look for public domain works, for a list see below.

Should you release your own works to the public domain? Definitely yes! From our point of view public domain is the only option as we deem any "intellectual property" immoral, however even if you disagree with us, you may want to release at least some of your works into public domain, if only out of altruism, no longer caring about your old works, out of curiosity or to make yourself a bit popular in the free culture community (though this is a motivation we don't entirely embrace). Are you afraid to do so? It is natural, letting go of something you spent part of your life on can raise a bit of anxiety, but this is just a fear of making the first step to the unknown, a fear almost entirely artificial, created by capitalist propaganda; making this decision will really most likely only have positive effects unless you actually had SERIOUS plans to make a business of your proprietary art. Practically the worst that can happen is that your work goes unnoticed and unappreciated. If you are still hesitant, try to go slowly, first release one thing, something small, and see what happens.

{ I remember myself how anxious I was about making the decision to release all my work into public domain, despite knowing it was the right thing to do and that I wanted to do it. I felt emotional about giving away rights to art I put so much love and energy into, fearing the evil vultures of the Internet would immediately "steal" it all as soon as I release it. I overcame the fear and now, many years later, I can say that not once have I regretted it, literally not a single case of abuse of my work happened (that I know of anyway), despite some of it becoming kind of popular. I only received love of many people who found my work useful, and even received donations from people. I've seen others put my work to use, improve it, I get mail from people thanking me for I've done. Of course this all is not why I did it, but it's nice, I write about it to share a personal experience that will maybe give you the courage to do the right thing as well. ~drummyfish }

How To Create Public Domain Works

To create a public domain work you must ensure that after you release it, no one will hold exclusive intellectual property rights to it -- most notably we will be trying to remove copyright from the work (which arises automatically, lasts extremely long and is most annoying), but know that there are potentially also other rights to take into account, e.g. patents, trade marks, trade dress, personality rights, etc. (in usual cases you don't have to deal with these as they apply only to some things in some situations, but for things like program source code you may need to look into them). We will remove such rights with licenses or waivers, i.e. a legal text which we attach to our works and which says we just give up our rights. Sadly this is not trivial to do.

If you want to create a PD work, then generally in that work you must not reuse any non-public domain work. So, for example, you can NOT create a public domain fan fiction story about Harry Potter because Harry Potter and his universe is copyrighted (your fan fiction here would be so called derivative work or a copyrighted work). Similarly you can't just use randomly googled images in a game you created because the images are most likely copyrighted. Small and obscure exceptions (trivial bitmap fonts, freedom of panorama, ...) to this may exist in laws but it's never good to rely on such quirky laws (they may differ between countries etc.), it's best to keep it safe and simply avoid utilizing anything non-PD within your works. If you can, create everything yourself, that's the safest bet.

Note that even things such as music/sound samples, text fonts or paint brushes may sometimes be copyrighted. Just be careful, try to make everything from scratch -- yes, it sucks, because copyright sucks, but this is simply how we bypass it. Making everything yourself from the ground up also teaches you a lot and makes your art truly original, it's not a wasted time.

Also you must NOT use anything under fair use! Even though you could lawfully use someone else's copyrighted work under fair use, inclusion of such material would, by the fair use rules, limit what others would be able to do with your work, making it restricted and therefore not public domain. Example: you can probably write a noncommercial Harry Potter fan fiction and share it with friends on the internet because that's fair use, however this fan fiction can never be public domain because it can't e.g. be used commercially, that would no longer fall under fair use, i.e. there is a non-commercial-use-only restriction burdening your work. It doesn't even help if you get an explicit permission to use a copyrighted work in your work unless such permission grants all the rights to everyone (not just your work). { I got a mascot removed from SuperTuxKart by this argument, mere author's permission to use his work isn't enough to make it free as in freedom. ~drummyfish }

So you can only use your own original creations and other public domain works within your PD work. Here you should highly prefer your own creations because that is legally the safest, no one can ever challenge your right to reuse your own creation, but there is a low yet non-negligible chance that someone else's PD work isn't actually PD or will cease to be PD by some retroactive law change. So when it only takes a small effort to e.g. photograph your own textures for a game instead of using someone else's PD textures, choose to use your own.

{ NOTE: The above is kind of arguing for reinventing wheels which goes a little bit against our philosophy of remixing and information sharing, but we are forced to do this by the system. We are forced to reinvent wheels to ensure that users of our works can't be legally bullied. ~drummyfish }

In cases where you DO reuse other PD works, try to minimize their number and try to make sure they belong to the actual safe public domain (see above). This again minimizes legal risk and additionally makes it easy to document and prove the sources.

As a next step make sure you clearly document your work and the sources you use. This means you write down where all the works contained in your work come from, e.g. in your readme. Explicitly mention which things you have created yourself ("I, ..., have created everything myself except for X, Y and Z") and which things come from other people and where you have found them. It is great to also archive the proofs of the third party source being public domain (e.g. use the Internet Archive to snapshot the page with a PD texture you've found). For works that allow it (e.g. source code, text, websites, ...) it is good to use version control systems such as git that record WHAT, WHEN and by WHO was contributed. This can all help prove that your work is actually safe and/or remove contributions that caused some legal trouble.

If you collaborate with someone on the work, it must be clear that ALL contributors to the work follow what we describe here (e.g. that they all agree to the license/waiver you have chosen etc.). It is safer if there are fewer contributors as with more people involved the chance of someone starting to "make trouble" increases.

Finally you need to actually release your work into the public domain. Remember that you want to achieve a safe, world-wide public domain (so again you shouldn't try to rely on some weird/obscure laws of your own small country). It must be stressed that it is NOT enough to write "my work is public domain", this is simply legally insufficient (and in many countries you can't even put your work into public domain which is why you need a more sophisticated tool). You need to use a public domain waiver (similar to a license) which you just put alongside your work (e.g. into the LICENSE file), plus it is also good to explicitly write (e.g. in your readme) a sentence such as "I, ..., release this work into public domain under CC0 1.0 (link), public domain". Keep in mind that the WORDING may be very important here, so try to write this well: we mention the license name AND its version (CC0 1.0, it may even be better to fully state Creative Commons Zero 1.0) as well as a link to its exact text and also mention the words public domain afterwards to make the intent of public domain yet clearer to any doubters. Here we used what's currently probably the best waiver you can use: Creative Commons Zero (CC0) -- this is what we recommend. However note that CC0 only waives copyright and not other things like trademarks or patents, so e.g. for software you might need to add an extra waiver of these things as well.

{ I personally use the following waiver IN ADDITION to CC0 with my software to attempt waiving of patents, trademarks etc. I made it by taking some standard waiver companies use to steal "rights" of their employees and modifying it to make it a public domain waiver. If you want to use it, make sure you mention it is an EXTRA, additional waiver alongside CC0. The waiver text follows. ~drummyfish

Each contributor to this work agrees that they waive any exclusive rights, including but not limited to copyright, patents, trademark, trade dress, industrial design, plant varieties and trade secrets, to any and all ideas, concepts, processes, discoveries, improvements and inventions conceived, discovered, made, designed, researched or developed by the contributor either solely or jointly with others, which relate to this work or result from this work. Should any waiver of such right be judged legally invalid or ineffective under applicable law, the contributor hereby grants to each affected person a royalty-free, non transferable, non sublicensable, non exclusive, irrevocable and unconditional license to this right. }

NOTE: You may be thinking that it doesn't really matter if you waive your rights properly and very clearly if you know you simply won't sue anyone, you may think it's enough to just write "do whatever you want with my creation". But you have to remember others, and even you yourself, can't know if you won't change your mind in the future. A clear waiver is a legal guarantee you provide to others, not just a vague promise of someone on the Internet, and this guarantee is very valuable, so valuable that whether someone uses your work or not will often come down to this. So waiving your "rights" properly may increase the popularity and reusability of your work almost as much as the quality of the work itself.

For an example of a project properly released into the public domain see the repository of our LRS game Anarch.

Where To Find Public Domain Works

There are quite a few places on the Internet where you may find public domain works. But firstly let there be a warning: you always have to check the public domain status of works you find, it is extremely common for people on the Internet to not know what public domain is or how it works, so you will find many false positives that are called public domain but are, in fact, not. This article should have given you a basic how-to on recognizing and checking public domain works. With this said, here is a list of some places to search (of course, this list will rot with time):


p_vs_np

P vs NP

P vs NP is one of the greatest and most important yet unsolved problems in computer science: it is the question of whether the computational class P is equal to class NP or, in simple terms, whether certain problems for which no "fast" solution is known can in fact be solved "fast". This is very important e.g. for algorithms used in cryptography. This problem is in fact so important that it's one of the seven Millennium Prize Problems. There is a million dollar reward for solving this problem.

It is believed and sometimes relied on that P != NP (in which case P would be a proper subset of NP), but a mathematical proof doesn't exist yet. If it was surprisingly proven that P = NP, there might be practical consequences for cryptography in which most algorithms rely on the problems in question being difficult (slow) to solve -- a proof of P = NP could lead to fast algorithms for breaking encryption, but that is not a certainty, only one of possible scenarios. However any solution to this problem would be revolutionary and ground breaking.

Explanation

In the context of computational complexity of algorithms we talk about different types of algorithm time complexities, i.e. different "speeds" of algorithms. This "speed" doesn't mean actual running time of the algorithm in real life but rather how quickly the running time grows depending on the amount of input data to it (so rather something akin to "scalability"), i.e. we are interested only in the shape of the function that describes how the amount of input data affects the running time of the algorithm. The types of time complexity are named after mathematical functions that grow as quickly as this dependence, so we have a constant time complexity, logarithmic time complexity, linear time complexity etc.
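
For illustration, here is a tiny C sketch of our own (a toy example, not tied to any specific algorithm mentioned here): both functions compute 1 + 2 + ... + n, the first in linear time (the number of steps grows proportionally to n), the second in constant time (a closed formula, no loop at all):

#include <stdio.h>

int sumLinear(int n)   // linear time complexity: performs n additions
{
  int s = 0;

  for (int i = 1; i <= n; ++i)
    s += i;

  return s;
}

int sumConstant(int n) // constant time complexity: same result, no loop
{
  return (n * (n + 1)) / 2;
}

int main(void)
{
  printf("%d %d\n",sumLinear(100),sumConstant(100)); // prints 5050 5050
  return 0;
}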

Then we have classes of computational problems. The classes divide problems based on how "fast" they can be solved.

The class P stands for polynomial and is defined as all problems that can be solved by an algorithm run on a deterministic Turing machine (a theoretical computer) with a polynomial time complexity.

The class NP stands for non-deterministic polynomial and is defined as all problems that can be solved by an algorithm run on a non-deterministic Turing machine with a polynomial time complexity. I.e. the definition is the same as for the P class with the difference that the Turing machine is non-deterministic -- such a machine is faster because it can make kind of "random correct guesses" that lead to the solution more quickly. Non-deterministic computers are only theoretical (at least for now), computers we have in real life cannot perform such randomly correct guesses. It is known that the solution to all NP problems can be verified in polynomial time even by a deterministic Turing machine, we just don't know if the solution can also be found this quickly.

Basically P means "problems that can be solved quickly" and NP means "problems whose solutions can be verified quickly but we don't know if they can also be solved quickly".

The question is whether all NP problems are in fact P problems, i.e. whether all problems that can be verified quickly can also be solved quickly. It is believed this is not the case.
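
To make the "verified quickly" part concrete, here is a small C sketch (the chosen problem, subset sum, and all names here are just our own illustration): finding a subset of a set of integers that sums to a given target is an NP problem for which no polynomial algorithm is known, yet verifying a proposed solution takes a single linear pass:

#include <stdio.h>

// checks in linear time whether the picked elements sum to target
int verifySubsetSum(const int *set, const int *pick, int n, int target)
{
  int sum = 0;

  for (int i = 0; i < n; ++i)
    if (pick[i])
      sum += set[i];

  return sum == target;
}

int main(void)
{
  int set[]  = {3, 34, 4, 12, 5, 2},
      pick[] = {0, 0,  1, 0,  1, 1}; // proposed solution: 4 + 5 + 2

  puts(verifySubsetSum(set,pick,6,11) ? "valid" : "invalid"); // valid
  return 0;
}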


python

Python

What if pseudocode was actually code?

TODO


quantum_gate

Quantum Gate

{ Currently studying this, there may be errors. ~drummyfish }

Quantum (logic) gate is a quantum computing equivalent of a traditional logic gate. A quantum gate takes N qubits as its input and transforms their states to new states, outputting the same N qubits (this is different from classical logic gates, which may have different numbers of input and output values).

Quantum gates are represented by complex matrices that transform the qubit states (which can be seen as points in multidimensional space, see Bloch sphere). A gate operating on N qubits is represented by a 2^N x 2^N matrix. These matrices have to be unitary, which also means the operations performed by quantum gates can be reversed, unlike those of classical logic gates.

We normally represent a single qubit state with a column vector |a> = a0 * |0> + a1 * |1> => [a0, a1] (look up bra-ket notation). Multiple qubit states are represented as a tensor product of the individual states, e.g. |a,b> = [a0 * b0, a0 * b1, a1 * b0, a1 * b1]. Applying a quantum gate G to such a qubit state vector q is performed by simple matrix multiplication: G * q.

Basic gates

Here are some of the most common quantum gates.

Identity

Acts on 1 qubit, leaves the qubit state unchanged.

1 0
0 1

Pauli Gates

Act on 1 qubit. There are three types of Pauli gates: X, Y and Z, each one rotates the qubit about the respective axis by pi radians.

The X gate is:

0 1
1 0

The Y gate is:

0 -i
i  0

The Z gate is:

1  0
0 -1

NOT

The NOT gate is identical to the Pauli X gate. It acts on 1 qubit and switches the probabilities of measuring 0 vs 1.

CNOT

Controlled NOT, acts on 2 qubits. Performs NOT on the second qubit if the first qubit is |1>.

1 0 0 0
0 1 0 0
0 0 0 1
0 0 1 0
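
For demonstration, here is a minimal C sketch (all the names are ours) that applies the above matrix to the state |1,0> by plain matrix-vector multiplication, as described earlier; for simplicity we use real amplitudes only (in general they are complex):

#include <stdio.h>

// multiplies n x n matrix by vector: out = gate * in
void applyGate(const double *gate, const double *in, double *out, int n)
{
  for (int i = 0; i < n; ++i)
  {
    out[i] = 0;

    for (int j = 0; j < n; ++j)
      out[i] += gate[i * n + j] * in[j];
  }
}

int main(void)
{
  double cnot[16] =   // the CNOT matrix above
  { 1, 0, 0, 0,
    0, 1, 0, 0,
    0, 0, 0, 1,
    0, 0, 1, 0 };

  double state[4] = {0, 0, 1, 0}, // |1,0>: first qubit 1, second 0
         result[4];

  applyGate(cnot,state,result,4);

  for (int i = 0; i < 4; ++i)
    printf("%g ",result[i]);      // prints 0 0 0 1, i.e. |1,1>

  putchar('\n');
  return 0;
}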

TODO


quaternion

Quaternion

Quaternion is a type of number, just like there are integer numbers, real numbers or imaginary numbers. They are very useful for certain things such as 3D rotations (they have some advantages over using e.g. Euler angles, for example they avoid Gimbal lock, they are also faster than transform matrices etc.). Quaternions are not so easy to understand but you don't actually need to fully grasp and visualize how they work in order to use them if that's not your thing, there are simple formulas you can copy-paste to your code and it will "just work".

Quaternions are an extension of complex numbers (you should first check out complex numbers before tackling quaternions); while complex numbers can be seen as two dimensional -- having the real and imaginary part -- quaternions would be seen as four dimensional. A quaternion can be written as:

a + bi + cj + dk

where a, b, c and d are real numbers and i, j and k are the basic quaternion units. For the basic units it holds that

i^2 = j^2 = k^2 = ijk = -1

Why four components and not three? Simply put, numbers with three components don't have such nice properties, it just so happens that with four dimensions we get this nice system that's useful.

Operations with quaternions such as their multiplication can simply be derived using basic algebra and the above given axioms. Note that quaternion multiplication is non-commutative (q1 * q2 != q2 * q1), but it is still associative (q1 * (q2 * q3) = (q1 * q2) * q3).

A unit quaternion is a quaternion in which a^2 + b^2 + c^2 + d^2 = 1.

A quaternion conjugate (for unit quaternions identical to the inverse q^-1) is obtained by multiplying b, c and d by -1.

Rotations

Only unit quaternions represent rotations.

Rotating point p by quaternion q is done as

q^-1 * (0 + p.x i + p.y j + p.z k) * q

Rotation quaternion can be obtained from axis (v) and angle (a) as

q = cos(a/2) + sin(a/2) * (v.x i + v.y j + v.z k)
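
The following C sketch (the type and names are our own) puts the above together: multiplication derived from the axioms and the rotation formula, rotating the point [1,0,0] by pi/2 radians about the z axis:

#include <stdio.h>
#include <math.h>

typedef struct { double a, b, c, d; } Quat; // a + b i + c j + d k

Quat qMul(Quat p, Quat q) // multiplication, follows from the axioms
{
  Quat r;

  r.a = p.a * q.a - p.b * q.b - p.c * q.c - p.d * q.d;
  r.b = p.a * q.b + p.b * q.a + p.c * q.d - p.d * q.c;
  r.c = p.a * q.c - p.b * q.d + p.c * q.a + p.d * q.b;
  r.d = p.a * q.d + p.b * q.c - p.c * q.b + p.d * q.a;

  return r;
}

int main(void)
{
  double angle = 3.141592 / 2; // rotate 90 degrees about the z axis

  Quat q    = {cos(angle / 2), 0, 0, sin(angle / 2)}, // rotation quat.
       qInv = {q.a, -q.b, -q.c, -q.d},                // its conjugate
       p    = {0, 1, 0, 0};                           // point [1,0,0]

  p = qMul(qMul(qInv,p),q); // q^-1 * p * q

  printf("%f %f %f\n",p.b,p.c,p.d); // prints approximately 0 -1 0
  return 0;
}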

TODO: compare to euler angles, examples


qubit

Qubit

Qubit is a quantum computing equivalent of a bit. While bits in classical computers can have one of two states -- either 0 or 1 -- a qubit can additionally have infinitely many states "in between" 0 and 1 (so called superposition). Physically qubits can be realized thanks to quantum states of particles, e.g. the polarization of a photon or the spin of an electron. Qubits are processed with quantum gates.

Whenever we measure a qubit, we get either 1 or 0, just like with a normal bit. However during quantum computations the internal state of a qubit is more complex. This state determines the probabilities of measuring either 1 or 0. When the measurement is performed (which is basically any observation of its state), the qubit state collapses into one of those two states.

Now we will be dealing with so called pure states -- these are the states that can be expressed by the following representation. We will get to the more complex (mixed) states later.

The state of a qubit can be written as

A * |0> + B * |1>

where A and B are complex numbers such that |A|^2 + |B|^2 = 1, |0> is the vector [1, 0] and |1> is the vector [0, 1]. |A|^2 gives the probability of measuring the qubit in the state 0, |B|^2 gives the probability of measuring 1.

The vectors |0> and |1> use so called bra-ket notation and represent a vector basis of a two dimensional state. So the qubit state is a point in a space with two axes, but since A and B are complex, the whole space is four dimensional (there are 4 variables: A real, A imaginary, B real and B imaginary). However, since |A|^2 + |B|^2 must be equal to 1 (normalization) and since a global phase factor has no measurable effect, two degrees of freedom are taken away. Using logic^TM we can figure out that the final state of a qubit really IS a point in two dimensions: a point on a sphere (Bloch sphere). A point on the sphere can be specified with two coordinates: phase (yaw, 0 to 2 pi, can be computed from many repeated measurements) and p (pitch, says the probability of measuring 1). It holds that:

A = sqrt(1 - p)

B = e^(i * phase) * sqrt(p)

The sphere has the state |0> at the top (north pole) and |1> at the bottom (south pole); these are the only points a normal bit can occupy. The equator is an area of states where the probability of measuring 0 and 1 are equal (above the equator gives a higher probability to 0, below the equator to 1).
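
To demonstrate the relation between the amplitudes and measurement, here is a toy C simulation of our own (of course no real quantum physics happens here): we prepare many identically initialized qubits with |B|^2 = 0.25 and count how often a measurement gives 1:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

// pure state A * |0> + B * |1>; A and B are generally complex, for
// this demo real values suffice
typedef struct { double a, b; } Qubit;

int measure(const Qubit *q) // simulates the collapse, returns 0 or 1
{
  return (rand() / (double) RAND_MAX) < q->b * q->b; // prob. |B|^2
}

int main(void)
{
  Qubit q = {sqrt(0.75), sqrt(0.25)}; // |B|^2 = 0.25

  int ones = 0;

  for (int i = 0; i < 10000; ++i) // 10000 identically prepared qubits
    ones += measure(&q);

  printf("measured 1 in %d of 10000 cases\n",ones); // approx. 2500
  return 0;
}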

Now a qubit may actually be in a more complex state than the pure states we've been dealing with until now. Pure states can be expressed with the state vector described above. Such a state is achieved when we start with a qubit of known value, e.g. if we cool down the qubit, we know it has the value |0>, and transforming this state with quantum gates keeps the state pure. However there are also so called mixed states which are more complex and appear e.g. when the qubit may have randomly been modified by an external event, or if we start with a qubit of unknown state. Imagine we e.g. start with a qubit that we know is either |0> or |1>. In such a case we have to consider all those states separately. A mixed state is composed of multiple pure states. Mixed states can be expressed with so called density matrices, an alternative state representation which is able to encode these states.


quine

Quine

Quine is a nonempty program which prints its own source code. It takes no input, just prints out the source code when run (without cheating such as reading the source code file). A quine is basically a self-replicating program, just as in the real world we may construct robots capable of creating copies of themselves (after all, we humans are such robots). The name quine refers to the philosopher Willard Quine and his paradox that shows a structure similar to self-replicating programs. Quine is one of the standard/fun/interesting programs such as hello world, 99 bottles of beer or fizzbuzz.

A quine can be written in any Turing complete language (according to Wikipedia), the challenge is in the self reference -- normally we cannot just print a string literal containing the whole source because that string literal would have to contain itself, making it infinite in length. The idea commonly used to solve this problem is the following:

  1. On the first line start the definition of a string S whose content is a copy of the second line (you copy-paste it there once the second line is written).
  2. On the second line put a command that prints the first line (inserting the content of S as the string being assigned to S) and then prints S itself, reproducing the second line.

This is a quine in C:

#include <stdio.h>
char s[] = "#include <stdio.h>%cchar s[] = %c%s%c;%cint main(void) { printf(s,10,34,s,34,10,10); return 0; }%c";
int main(void) { printf(s,10,34,s,34,10,10); return 0; }

This is a quine in Python:

s="print(str().join([chr(115),chr(61),chr(34)]) + s + str().join([chr(34),chr(10)]) + s)"
print(str().join([chr(115),chr(61),chr(34)]) + s + str().join([chr(34),chr(10)]) + s)

This is a quine in comun:

0 46 32 34 S 34 32 58 83 S --> S: "0 46 32 34 S 34 32 58 83 S --> " .

TODO: more langs?

Yet stronger is the so called radiation hardened quine, a quine that remains a quine even after any one character of the program has been deleted (one written in Ruby can be found here: https://github.com/mame/radiation-hardened-quine).

In the Text esoteric programming language every program is a quine (and so also a radiation hardened one).


race

Race

All races of men are to coexist in love and peace, despite their differences.

Races of people are very large, loosely defined groups of genetically similar (related) people. Races usually significantly differ by their look and in physical, mental and cultural aspects. The topic of human race is nowadays forbidden to be critically discussed and researched, however there at least exists a number of older research, information hidden in the underground and some things about race are completely obvious to those with an open mind. Good society, unlike for example our current capitalist society, acknowledges the differences between human races and lets them coexist peacefully in social equality despite their differences and without any need for bullshit such as political correctness.

Instead of the word race the politically correct camp uses words such as ethnicity -- it's funny, sometimes they say no such thing as race exists but other times they simply have to operate with the fact that people are genetically diverse, e.g. when they accuse others of racism, as existence of discrimination based on genetic differences between people necessarily implies the existence of genetic differences between people -- so here they try to substitute the word race for a different word so as to make their self-contradiction less obvious. Anyway, it doesn't work :)

Race can be told from the shape of the skull and one's DNA, which finds use e.g. in forensics to help solve crimes. It is officially called the ancestry estimation. Some idiots say this should be forbidden to do because it's "racist" lmao. Besides the obvious visual difference such as skin color races also have completely measurable differences acknowledged even by modern "science", for example unlike other races about 90% of Asians have dry earwax. Similar absolutely measurable differences exist in height, body odor, alcohol and lactose tolerance, high altitude tolerance, vulnerability to specific diseases, hair structure, cold tolerance, risk of obesity, behavior (see e.g. the infamous chimp out behavior of black people) and others. While dryness of earwax is really a minor curiosity, it is completely unreasonable to believe that race differences stop at traits we humans find unimportant and that genetics somehow magically avoids affecting traits that are harder to measure and which our current society deems politically incorrect to exist. In fact differences in important areas such as intelligence were measured very well -- these are however either censored or declared incorrect and "debunked" by unquestionable "science" authorities, because politics.

Pseudoleft uses cheap, logically faulty arguments to deny the existence of race; for example that there are no clear objective boundaries between races -- of course there are not, but how does that imply nonexistence of race? The same argument could also be given even e.g. for the term species (see e.g. ring species in which the boundaries are not clear) so as to invalidate it; yet we see no one doubting the existence of various species of animals. That's like saying that color doesn't exist because given any two distinct colors there exists a gradual transition, or that music and noise are the same thing because objectively no clear line can be drawn between them.

The politically correct camp further argues that there wasn't enough time for human races to develop significant differences as evolution operates on scales of millions of years while the evolution of modern humans was taking part about in an order of magnitude smaller time scale. However it has been shown that evolution can be extremely fast and make great changes in mere DECADES, e.g. in cases of rapid environment change (shown e.g. in a documentary Laws of the Lizard on anoles that show signs of evolutionary change only after 14 years, also see e.g. the book The 10,000 Year Explosion talking about actual acceleration of human evolution) and interbreeding with other species (e.g. Neanderthals, which European population bred with but African population didn't), which did occur when humans spread around the world and had to live in vastly different conditions -- successful civilizations themselves actually furthermore started to rapidly change their environment to something that favors very different traits. It has for example been found that average male brain increased from 1372 gram in 1860 to 1424 grams in 1940, a very significant change in LESS THAN A CENTURY. We can take a look at the enormous differences between dog breeds which have been bred mostly during only the last 200 years and whose differences are enormous and not only physical, but also that of intelligence and temperament -- yes, the breeding of dogs has been selective, but a rapid change in environment may have a similar accelerating effect, and the process in humans still took many tens of thousands of years. For example races of slaves were probably selectively bred, even if unintentionally, as physically fit slaves were more likely to survive than those who were smart; similarly in prospering civilizations, e.g. that of Europe, where trade, business and development of technology (e.g. military) became more crucial for survival than in primitive desert or jungle civilizations, different traits such as intelligence became preferred by evolution.

Another pseudoleftist argument is that "the DNA of any two individuals is 99.6 % identical so the differences are really insignificant". Now consider that DNA of a pig is 98 % identical to human. We see the argument is like saying a strawberry and beer is practically the same thing as they are both about 93 % water. It is known that only a minuscule part of DNA has any actual biological effect, only a small part is important and therefore including all the unimportant junk in judging similarity is just purposeful attempt at misleading statistics.

Denying the facts regarding human race is called race denialism, the acceptance of these facts is called race realism. Race denialism is part of the basis of today's pseudoleftist political ideology, theories such as polygenism (multiregional hypothesis) are forbidden to be supported and they're ridiculed and demonized by mainstream information sources like Wikipedia who only promote the politically correct "out of Africa" theory. SJWs reject any idea of a race with the same religious fanaticism with which Christian fanatics opposed Darwin's evolution theory.

What races are there? That depends on definitions, the boundaries between races are fuzzy and the lines can be drawn differently. The traditional, most general division still found in the greatest 1990s encyclopedias is to three large groups: Caucasoid (white), Negroid (black) and Mongoloid (yellow). These can be further subdivided. Some go as far as calling different nations separate races (e.g. the Norwegian race, Russian race etc.), thought that may be a bit of a stretch. One of the first scientific divisions of people into races was done by Francois Bernier in New Division of the Earth by the Different Species or "Races" of Man that Inhabit It into Europeans, Asians, Africans and Sami (north Europe), based on skin color, hair color, height and shape of face, nose and eyes.

There is a controversial 1994 book called The Bell Curve that deals with differences in intelligence between races (later followed by other books such as The Global Bell Curve trying to examine the situation world-wide). SJWs indeed tried to attack it, however international experts on intelligence agree the book is correct in saying average intelligence between races differs (see e.g. The Wall Street Journal's Mainstream Science on Intelligence). An online resource with a lot of information on racial differences is e.g. http://www.humanbiologicaldiversity.com/. See also e.g. https://en.metapedia.org/wiki/Race_and_morphology. Note that even though the mentioned sites may be fascist, biased and contain propaganda of their own, they provide links to resources which the pseudoleftist mainstream such as Wikipedia and Google simply censor -- while we may not promote the politics and opinions of mentioned sites, we link to them to provide access to censored information so that one can seek truth and form his own opinions.

If you want a relatively objective view on races, read old (pre 1950) books. See for example the article on NEGRO in 11th edition of Encyclopedia Britannica (1911), which clearly states on page 344 of the 19th volume that "Mentally the negro is inferior to the white" and continues to cite thorough study of this, finding that black children were quite intelligent but with adulthood the intellect always went down, however it states that negro has e.g. better sense of vision and hearing. Even in the 90s still the uncensored information on race was still available in the mainstream sources, e.g. the 1995 Desk Reference Encyclopedia and 1993 Columbia Encyclopedia still have articles on races and their differences.

{ Lol, the 1917 book The Circle of Knowledge has a detailed table comparing various races physically and mentally, stating things like "negro: slight mental development after puberty" etc. Encyclopedia Americana (1918) also mentions a detailed description of the negro, mentioning things such as much lower brain weight, prolonged arms, distinct odor and a lower face angle. ~drummyfish }

In relation to technology/math/science it is useful to know the differences in intellect between different races, though cultural and other traits linked to races may also play a big role. It is important to keep in mind intelligence isn't one dimensional, it's one of the most complex and complicated concepts we can be dealing with (remember the famous test that revealed that chimpanzees greatly outperform humans at certain intellectual tasks such as remembering the order of numbers seen for a very short period of time). We can't simplify to a single measure such as IQ score. Let intelligence here mean simply the ability to perform well in the area of our art. And of course, there are smart and stupid people in any race, the general statements we make are just about statistics and probabilities.

The smartest races in this regard seem to be Jews and Asians (also found so by the book Bell Curve), closely followed by the general white race. There is no question about the intelligence of Jews, the greatest thinkers of all times were Jewish (Richard Stallman, Einstein, Marx, Chomsky, even Jesus and others). Jews seem to have a very creative intelligence while Asians are more mechanically inclined, they can learn a skill and bring it to perfection with an extremely deep study and dedication. The African black race (in older literature known as the negro) is decisively the least intelligent -- this makes a lot of sense, the race has been oppressed and living in harsh conditions for centuries and didn't get much chance to evolve towards good performance in intellectual tasks, quite the opposite, those who were physically fit rather than smart were probably more likely to survive and reproduce as slaves or jungle people (even if white people split from the blacks relatively recently, a rapid change in environment also leads to a rapid change in evolution, even that of intelligence). 1892 book Hereditary Genius says that the black race is about two grades below the white race (nowadays the gap will most likely be lower). Hispanics were found to perform in between the white and black people. There isn't so much info about other races such as the red race or Eskimos, but they're probably similarly intelligent to the black race (The above mentioned book Hereditary Genius gives an intelligence of the Australian race at least one grade below that of the negro). The brown races are kind of complicated, the Indian people showed a great intellectual potential, e.g. in chess, math, philosophy (nonviolence inherently connected to India is the most intellectually advanced philosophy), and lately also computer science (even though many would argue that "pajeets" are just trained coding monkeys).

Increasing multiculturalism and mixing of the races will likely make all of this less and less relevant. But for now the differences still stand.

LRS philosophy is of course FOR multiculturalism and mixing of races. Biodiversity is good and it would probably also help reduce racial fascism/nationalism.

See Also


racism

Racism

The term racism has nowadays two main definitions, due to the onset of newspeak:

See Also


randomness

Randomness

Not to be confused with pseudorandomness.

TODO

Randomness Tests

TODO

One of the most basic is the chi-squared test whose description can be found e.g. in the Art of Computer Programming book. TODO

{ The following is a method I wrote about here (includes some code): https://codeberg.org/drummyfish/my_writings/src/branch/master/randomness.md, I am almost certainly not the first to invent this, but I haven't found what this is called, so for now I'm calling it "my" test, not implying any ownership of course :) If you know what this method is called, please send me a mail. ~drummyfish }

Drummyfish's randomness test: this test tries to measure the unpredictability, the inability to predict what binary digit will follow. As an input to the test we suppose a binary sequence S of length N bits that's repeating forever (for example for N = 2 a possible sequence is 10 meaning we are really considering an infinite sequence 1010101010...). We suppose an observer knows the sequence and that it's repeating (consider he has for example been watching us broadcast it for a long time and he noticed we are just repeating the same sequence over and over), then we ask: if the observer is given a random (and randomly long) subsequence S2 of the main sequence S, what's the average probability he can correctly predict the bit that will follow? This average probability is our measured randomness r -- the lower the r, the "more random" the sequence S is according to this test. For different N there are different minimum possible values of r, it is for example not possible to achieve r < 0.7 for N = 3 etc. The following table shows this test's most random sequences for given N, along with their count and r.

seq. len.  count  min. r  most random looking sequences
1          2      1.00    0, 1
2          2      0.50    01, 10
3          6      ~0.72   001, 010, 011, 100, 101, 110
4          4      ~0.78   0011, 0110, 1001, 1100
5          10     ~0.82   00101, 01001, 01010, 01011, 01101, 10010, 10100, 10101, 10110, 11010
6          12     ~0.86   000101, 001010, 010001, 010100, 010111, 011101, 100010, 101000, 101011, 101110, 110101, 111010
7          14     ~0.88   0001001, 0010001, 0010010, 0100010, 0100100, 0110111, 0111011, 1000100, 1001000, 1011011, ...
8          16     ~0.89   00100101, 00101001, 01001001, 01001010, 01010010, 01011011, 01101011, 01101101, 10010010, ...
9          18     ~0.90   000010001, 000100001, 000100010, 001000010, 001000100, 010000100, 010001000, 011101111, ...
10         20     ~0.91   0010010101, 0010101001, 0100100101, 0100101010, 0101001001, 0101010010, 0101011011, ...
11         22     ~0.92   00010001001, 00010010001, 00100010001, 00100010010, 00100100010, 01000100010, 01000100100, ...
12         24     ~0.92   001010010101, 001010100101, 010010100101, 010010101001, 010100101001, 010100101010, ...
13         26     ~0.93   0010010100101, 0010100100101, 0010100101001, 0100100101001, 0100101001001, 0100101001010, ...
...        ...    ...     ...

Truly Random Sequence Example

WORK IN PROGRESS { Also I'm not too good at statistics lol. ~drummyfish }

Here is a sequence of 1000 bits which we most definitely could consider truly random as it was generated by physical coin tosses:

{ The method I used to generate this: I took a plastic bowl and 10 coins, then for each round I threw the coins into the bowl, shook them (without looking, just in case), then rapidly turned it around and smashed it against the ground. I took the bowl up and wrote the ten generated bits by reading the coins kind of from "top left to bottom right" (heads being 1, tails 0). ~drummyfish }

00001110011101000000100001011101111101010011100011
01001101110100010011000101101001000010111111101110
10110110100010011011010001000111011010100100010011
11111000111011110111100001000000001101001101010000
11111111001000111100100011010110001011000001001000
10001010111110100111110010010101001101010000101101
10110000001101001010111100100100000110000000011000
11000001001111000011011101111110101101111011110111
11010001100100100110001111000111111001101111010010
10001001001010111000010101000100000111010110011000
00001010011100000110011010110101011100101110110010
01010010101111101000000110100011011101100100101001
00101101100100100101101100111101001101001110111100
11001001100110001110000000110000010101000101000100
00110111000100001100111000111100011010111100011011
11101111100010111000111001010110011001000011101000
01001111100101001100011100001111100011111101110101
01000101101100010000010110110000001101001100100110
11101000010101101111100111011011010100110011110000
10111100010100000101111001111011010110111000010101

Let's now take a look at how random the sequence looks, i.e. basically how likely it is that by generating random numbers by tossing a coin will give us a sequence with statistical properties (such as the ratio of 1s and 0s) that our obtained sequence has.

There are 494 1s and 506 0s, i.e. the ratio is approximately 0.976, deviating from 1.0 (the value that infinitely many coin tosses should converge to) by only 0.024. We can use the binomial distribution to calculate the "rarity" of getting this deviation or higher one; here we get about 0.728, i.e. a pretty high probability, meaning that if we perform 1000 coin tosses like the one we did, we may expect to get the deviation we got or higher in more than 70% of cases (if on the other hand we only got e.g. 460 1s, this probability would be only 0.005, suggesting the coins we used weren't fair). If we take a look at how the ratio (rounded to two fractional digits) evolves after each round of performing additional 10 coin tosses, we see it gets pretty close to 1 after only about 60 tosses and stabilizes quite nicely after about 100 tosses: 0.67, 0.54, 0.67, 0.90, 0.92, 1.00, 0.94, 0.90, 0.88, 1.00, 1.04, 1.03, 0.97, 1.00, 0.97, 1.03, 1.10, 1.02, 0.98, 0.96, 1.02, 1.02, 1.02, 1.00, 0.95, 0.95, 0.99, 0.99, 0.99, 0.97, 0.95, 0.95, 0.96, 0.93, 0.90, 0.88, 0.90, 0.93, 0.95, 0.98, 0.98, 0.97, 0.97, 0.99, 1.00, 0.98, 0.98, 0.98, 0.97, 0.96, 0.95, 0.94, 0.95, 0.95, 0.96, 0.95, 0.96, 0.95, 0.96, 0.95, 0.96, 0.95, 0.96, 0.96, 0.97, 0.97, 0.97, 0.95, 0.94, 0.93, 0.93, 0.93, 0.94, 0.94, 0.94, 0.96, 0.95, 0.96, 0.96, 0.95, 0.96, 0.95, 0.95, 0.96, 0.97, 0.97, 0.96, 0.96, 0.95, 0.95, 0.95, 0.96, 0.97, 0.97, 0.97, 0.97, 0.96, 0.97, 0.98, 0.98.

Let's try the chi-squared test (the kind of basic "randomness" test): D = (494 - 500)^2 / 500 + (506 - 500)^2 / 500 = 0.144; now in the table for the chi square distribution for 1 degree of freedom (i.e. two categories, 0 and 1, minus one) we see this value of D falls somewhere around 30%, which is not super low but not very high either, so we can see the test doesn't invalidate the hypothesis that we got numbers from a uniform random number generator. { I did this according to Knuth's Art of Computer Programming where he performed a test with dice and arrived at a number between 25% and 50% which he interpreted in the same way. For a scientific paper such confidence would of course be unacceptable because there we try to "prove" the validity of our hypothesis. Here we put a much lower confidence level as we're only trying not to fail the test. To get a better confidence we'd probably have to perform many more than 1000 tosses. ~drummyfish }
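
The computation above in code, as a minimal C sketch (the function name is our own): the statistic is D = sum((observed - expected)^2 / expected) over the categories, here the two bit values:

#include <stdio.h>

double chiSquared(int count0, int count1)
{
  double expected = (count0 + count1) / 2.0; // fair coin expectation
  double d0 = count0 - expected,
         d1 = count1 - expected;

  return d0 * d0 / expected + d1 * d1 / expected;
}

int main(void)
{
  printf("D = %f\n",chiSquared(506,494)); // prints D = 0.144000
  return 0;
}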

We can try to convert this to a sequence of integers of different binary sizes and just "intuitively" see if the sequences still look random, i.e. if there are no patterns such as the numbers only being odd or the histograms of the sequences being too unbalanced; we could also possibly repeat the chi-squared test etc.

The sequence as 100 10 bit integers (numbers from 0 to 1023) is:

57 832 535 501 227 311 275 90 267 1006
730 155 273 874 275 995 759 528 52 848
1020 572 565 556 72 555 935 805 309 45
704 842 969 24 24 772 963 479 695 759
838 294 241 998 978 548 696 337 29 408
41 774 429 370 946 330 1000 104 886 297
182 293 719 308 956 806 398 12 84 324
220 268 911 107 795 958 184 917 612 232
318 332 451 911 885 278 784 364 52 806
929 367 630 851 240 753 261 926 859 533

As 200 5 bit integers (numbers from 0 to 31):

1 25 26 0 16 23 15 21 7 3 9 23 8 19 2 26 8 11 31 14
22 26 4 27 8 17 27 10 8 19 31 3 23 23 16 16 1 20 26 16
31 28 17 28 17 21 17 12 2 8 17 11 29 7 25 5 9 21 1 13
22 0 26 10 30 9 0 24 0 24 24 4 30 3 14 31 21 23 23 23
26 6 9 6 7 17 31 6 30 18 17 4 21 24 10 17 0 29 12 24
1 9 24 6 13 13 11 18 29 18 10 10 31 8 3 8 27 22 9 9
5 22 9 5 22 15 9 20 29 28 25 6 12 14 0 12 2 20 10 4
6 28 8 12 28 15 3 11 24 27 29 30 5 24 28 21 19 4 7 8
9 30 10 12 14 3 28 15 27 21 8 22 24 16 11 12 1 20 25 6
29 1 11 15 19 22 26 19 7 16 23 17 8 5 28 30 26 27 16 21

Which has the following histogram:

number: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14 15
count:  6  6  3  6  5  5  7  5  11 10 7  6  7  3  4  5 

number: 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31
count:  7  9  3  5  4  8  7  8  9  4  8  6  8  6  6  6

And as 250 4 bit integers (numbers from 0 to 15):

0 14 7 4 0 8 5 13 15 5 3 8 13 3 7 4 4 12 5 10 4 2 15 14 14
11 6 8 9 11 4 4 7 6 10 4 4 15 14 3 11 13 14 1 0 0 13 3 5 0
15 15 2 3 12 8 13 6 2 12 1 2 2 2 11 14 9 15 2 5 4 13 4 2 13
11 0 3 4 10 15 2 4 1 8 0 6 3 0 4 15 0 13 13 15 10 13 14 15 7
13 1 9 2 6 3 12 7 14 6 15 4 10 2 4 10 14 1 5 1 0 7 5 9 8
0 10 7 0 6 6 11 5 7 2 14 12 9 4 10 15 10 0 6 8 13 13 9 2 9
2 13 9 2 5 11 3 13 3 4 14 15 3 2 6 6 3 8 0 12 1 5 1 4 4
3 7 1 0 12 14 3 12 6 11 12 6 15 11 14 2 14 3 9 5 9 9 0 14 8
4 15 9 4 12 7 0 15 8 15 13 13 5 1 6 12 4 1 6 12 0 13 3 2 6
14 8 5 6 15 9 13 11 5 3 3 12 2 15 1 4 1 7 9 14 13 6 14 1 5

This has the following histogram:

number: 0  1  2  3  4  5  6  7  8  9  10 11 12 13 14 15
count:  18 14 19 18 23 15 18 11 11 14 9  10 13 20 18 19

Another way to test data randomness may be by trying to compress it, since compression is basically based on removing regularities, redundancy, leaving only randomness. A compression algorithm exploits correlations in input data and removes that which can later be reasoned out from what's left, but with completely random data nothing should be correlated, it shouldn't be possible to reason out parts of such data from other parts of that data, hence compression can remove nothing and it shouldn't generally be possible to compress completely random data (though of course there exists a non-zero probability that in rare cases random data will have regular structure and we will be able to compress it). Let us try to perform this test with the lz4 compression utility -- we convert our 1000 random bits to 125 random bytes and try to compress them. Then we will try to compress another sequence of 125 bytes, this time a non-random one -- a repeated alphabet in ASCII (abcdefghijklmnopqrstuvwxyzabcdef...). Here are the results:

sequence (125 bytes)  compressed size
our random bits       144 (115.20%)
abcdef...             56 (44.80%)

We see that while the algorithm was able to compress the non-random sequence to less than half of the original size, it wasn't able to compress our data, it actually made it bigger! This suggests the data is truly random. Of course it would be good to test multiple compression algorithms and see if any one of them finds some regularity in the data, but the general idea has been presented.
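
To illustrate the principle (this is just our own toy example, NOT the lz4 algorithm used above), here is a primitive run-length "compressor" in C; it exploits only a single kind of regularity, runs of identical bytes, which random data mostly lacks:

#include <stdio.h>

// returns the size (in bytes) of data compressed with trivial run-length
// encoding: each run of identical bytes becomes a [count,value] pair
int rleSize(const unsigned char *data, int len)
{
  int size = 0, i = 0;

  while (i < len)
  {
    int runStart = i;

    while (i < len && data[i] == data[runStart] && i - runStart < 255)
      ++i;

    size += 2; // count byte plus value byte
  }

  return size;
}

int main(void)
{
  unsigned char regular[125], noisy[125];

  for (int i = 0; i < 125; ++i)
  {
    regular[i] = 'a' + i / 5;           // aaaaabbbbbccccc...
    noisy[i] = (i * 2654435761u) >> 13; // stand-in for random bytes
  }

  // the regular data shrinks (50 bytes), the noisy data expands (250)
  printf("regular: %d, noisy: %d\n",rleSize(regular,125),rleSize(noisy,125));

  return 0;
}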


rapeware

Rapeware

Rapeware is aggressive malware, i.e. mostly proprietary software such as Windows, consumer games etc.


rationalwiki

"Rational"Wiki

RationalWiki (https://rationalwiki.org/) is a toxic child atheist pseudorationalist/pseudoskeptic SJW wiki website that specializes in attacking rational views on controversial topics on the Internet. It is recommended you delete this website from your bookmarks or it will probably give you brain cancer.

Typically for a pseudoleftist site, a tactic of using misleading names is widely used as one of the main means of operation, e.g. racial realism is called racism, politically inconvenient science is called pseudoscience and, of course, soyence is promoted as the "only true unquestionable science". The name of the wiki itself seems to suggest it has something to do with rationality, which is of course very misleading, if not a downright lie -- the purpose of the wiki seems to be solely promotion of modern harmful religion and cults such as capitalism and political correctness, while bashing anything slightly off the mainstream.

See Also


raycasting

Raycasting

In computer graphics raycasting refers to a rendering technique in which we determine which parts of the scene should be drawn according to which parts of the scene are hit by rays cast from the camera. This is based on the idea that we can trace rays of light that enter the camera by going backwards, i.e. starting from the camera towards the parts of the scene that reflected the light. The term raycasting specifically has two main meanings:

2D Raycasting

{ We have an official LRS library for advanced 2D raycasting: raycastlib! And also a game built on top of it: Anarch. ~drummyfish }

2D raycasting can be used to relatively easily render "3Dish" looking environments (commonly labeled "pseudo 3D"), mostly some kind of right-angled labyrinth. There are limitations such as the inability for the camera to tilt up and down (which can nevertheless be faked with shearing). It used to be popular in very old games but can still be used nowadays for "retro" looking games, games for very weak hardware (e.g. embedded), in demos etc. It is a pretty cool, very suckless rendering method.

....................................................................................................
....................................................................................................
###.................................................................................................
#########...........................................................................................
#########...........................................................................................
#########...........................................................................................
#########...........................................................................................
#########.................///######..................................../#...........................
#########..............//////############.....................//////////###.........................
#########..............//////############............///////////////////####............////////////
#########......///#####//////############.........//////////////////////####////////////////////////
###############///#####//////############.........//////////////////////####////////////////////////
###############///#####//////############//#####////////////////////////####////////////////////////
###############///#####//////############//#####////////////////////////####////////////////////////
###############///#####//////############//#####////////////////////////####////////////////////////
###############///#####//////############//#####////////////////////////####////////////////////////
###############///#####//////############//#####////////////////////////####////////////////////////
###############///#####//////############//#####////////////////////////####////////////////////////
###############///#####//////############//#####////////////////////////####////////////////////////
###############///#####//////############.........//////////////////////####////////////////////////
#########......///#####//////############.........//////////////////////####////////////////////////
#########..............//////############............///////////////////####............////////////
#########..............//////############.....................//////////###.........................
#########.................///######..................................../#...........................
#########...........................................................................................
#########...........................................................................................
#########...........................................................................................
#########...........................................................................................
###.................................................................................................
....................................................................................................
....................................................................................................

raycasted view, rendered by the example below

The method is called 2D because even though the rendered picture looks like a 3D view, the representation of the world we are rendering is 2 dimensional (usually a grid, a top-down plan of the environment with cells of either empty space or walls) and the casting of the rays is performed in this 2D space -- unlike with the 3D raycasting which really does cast rays in fully 3D environments. Also unlike the 3D version, which casts one ray per each rendered pixel (x * y rays per frame), 2D raycasting only casts one ray per rendered column (x rays per frame), which drastically reduces the number of rays cast and makes this method fast enough for real time rendering even with software rendering (without a GPU).

The principle is the following: for each column we want to render we cast a ray from the camera and find out which wall in our 2D world it hits first and at what distance -- according to the distance we use perspective to calculate how tall the wall column should look from the camera's point of view, and we render the column. Tracing the ray through the 2D grid representing the environment can be done relatively efficiently with algorithms normally used for line rasterization. There is another advantage for weak-hardware computers: we can easily use 2D raycasting without a framebuffer (without double buffering) because we can render each frame top-to-bottom left-to-right without overwriting any pixels (as we simply cast the rays from left to right and then draw each column top-to-bottom). And of course, it can be implemented using fixed point (integers only).

The classic version of 2D raycasting -- as seen in the early 90s games -- only renders walls with textures; floors and ceilings are untextured and have a solid color. The walls all have the same height, the floor and ceiling also have the same height in the whole environment. In the walls there can be sliding doors. 2D sprites (billboards) can be used with raycasting to add items or characters in the environment -- for correct rendering here we usually need a 1 dimensional z-buffer in which we write distances to walls to correctly draw sprites that are e.g. partially behind a corner. However we can extend raycasting to allow levels with different heights of walls, floor and ceiling, we can add floor and ceiling texturing and, in theory, probably also use different level geometry than a square grid (however at this point it would be worth considering if e.g. BSP rendering wouldn't be better).

Implementation

The core element to implement is the code for casting rays, i.e. given the square plan of the environment (e.g. game level), in which each square is either empty or a wall (which can possibly be of different types, to allow e.g. different textures), we want to write a function that for any ray (defined by its start position and direction) returns the information about the first wall it hits. This information most importantly includes the distance of the hit, but can also include additional things such as the type of the wall, texturing coordinate or its direction (so that we can shade differently facing walls with different brightness for better realism). The environment is normally represented as a 2 dimensional array, but instead of explicit data we can also use e.g. a function that procedurally generates infinite levels (i.e. we have a function that for given square coordinates computes what kind of square it is). As for the algorithm for tracing the ray in the grid we may actually use some kind of line rasterization algorithm, e.g. the DDA algorithm (tracing a line through a grid is analogous to drawing a line in a pixel grid). This can all be implemented with fixed point, i.e. integer only! No need for floating point.

Note on distance calculation and distortion: When computing the distance of ray hit from the camera, we usually DO NOT want to use the Euclidean distance of that point from the camera position (as is tempting) -- that would create a so called fish eye effect, i.e. looking straight into a perpendicular wall would make the wall look warped/bowled (as the part of the wall in the middle of the screen is actually closer to the camera position so it would, by perspective, look bigger). For non-distorted rendering we have to compute a distance that's perpendicular to the camera plane -- we can see the camera plane as a "canvas" onto which we project the scene, in 2D it is a line (unlike in 3D where it really is a plane) at a certain distance from the camera (usually conveniently chosen to be e.g. 1) whose direction is perpendicular to the direction the camera is facing. The good news is that with a little trick this distance can be computed even more efficiently than Euclidean distance, as we don't need to compute a square root! Instead we can utilize the similarity of triangles. Consider the following situation:

                I-_ 
               /   '-X
              /  r.'/|
      '-._   /  ,' / |
      p   '-C_.'  /  |
          1/,'|-./   |
          /'  | /    |
         V-._-A/-----B
             'J
              

In the above V is the position of the camera (viewer) which is facing towards the point I, p is the camera plane perpendicular to VI at the distance 1 from V. Ray r is cast from the camera and hits the point X. The length of the line r is the Euclidean distance, however we want to find out the distance JX = VI, which is perpendicular to p. There are two similar triangles: VCA and VIB; from this it follows that 1 / VA = VI / VB, from which we derive that JX = VB / VA. We can therefore calculate the perpendicular distance just from the ratio of the distances along one principal axis (X or Y). However watch out for the case when VA = VB = 0 to not divide by zero! In such case use the other principal axis (Y).

Here is a complete C example that uses only fixed point with the exception of the stdlib sin/cos functions, for simplicity's sake (these can easily be replaced by custom fixed point implementation):

#include <stdio.h>
#include <math.h>     // for simplicity we'll use float sin, cos from stdlib

#define U 1024        // fixed-point unit
#define LEVEL_SIZE 16 // level resolution
#define SCREEN_W 100
#define SCREEN_H 31

int wallHeight[SCREEN_W];
int wallDir[SCREEN_W];

int perspective(int distance)
{
  if (distance <= 0)
    distance = 1;

  return (SCREEN_H * U) / distance;
}

unsigned char level[LEVEL_SIZE * LEVEL_SIZE] =
{
#define E 1, // wall
#define l 0, // floor
 l l l l E l l l l l l l l l E E 
 l E l l E E E l l l l l E l l E 
 l l l l l l l l l l l l l l l l 
 l E l l E l E l E l E l E l l l 
 l l l l E l l l l l l l l l E l 
 l l l l E l l l l l l l l l E l 
 l E E l E l l l l l l l l l l l 
 l E E l E l l l l l l l l l l l 
 l E l l l l l l l l l l l l l E 
 l E l l E l l l l l l l l E l l 
 l E l l E l l l l l l l l E l l 
 l E l l l l E E E l l l l l l l 
 l E E l E l l l l l E E E l l E 
 l E E l E l l l l l E l l l E E 
 l l l l l l E E E E E l l E E E 
 l l E l l l l l l l l l E E E E 
#undef E
#undef l
};

unsigned char getTile(int x, int y)
{
  if (x < 0 || y < 0 || x >= LEVEL_SIZE || y >= LEVEL_SIZE)
    return 1;

  return level[y * LEVEL_SIZE + x];
}

// returns perpend. distance to hit and wall direction (0 or 1) in dir
int castRay(int rayX, int rayY, int rayDx, int rayDy, int *dir)
{
  int tileX = rayX / U, 
      tileY = rayY / U,
      addX = 1, addY = 1;

  // we'll convert all cases to tracing in +x, +y direction

  *dir = 0;

  if (rayDx == 0)
    rayDx = 1;
  else if (rayDx < 0)
  {
    rayDx *= -1;
    addX = -1;
    rayX = (tileX + 1) * U - rayX % U;
  }

  if (rayDy == 0)
    rayDy = 1;
  else if (rayDy < 0)
  {
    rayDy *= -1;
    addY = -1;
    rayY = (tileY + 1) * U - rayY % U;
  } 

  int origX = rayX, 
      origY = rayY;

  for (int i = 0; i < 20; ++i) // trace at most 20 squares
  {
    int px = rayX % U, // x pos. within current square
        py = rayY % U,
        tmp;

    if (py > ((rayDy * (px - U)) / rayDx) + U)
    {
      tileY += addY; // step up
      rayY = ((rayY / U) + 1) * U;

      tmp = rayX / U;
      rayX += (rayDx * (U - py)) / rayDy;

      if (rayX / U != tmp) // don't cross the border due to round. error
        rayX = (tmp + 1) * U - 1;

      *dir = 0;
    }
    else
    {
      tileX += addX; // step right
      rayX = ((rayX / U) + 1) * U;

      tmp = rayY / U;
      rayY += (rayDy * (U - px)) / rayDx;

      if (rayY / U != tmp)
        rayY = (tmp + 1) * U - 1;

      *dir = 1;
    }

    if (getTile(tileX,tileY)) // hit?
    {
      px = rayX - origX;
      py = rayY - origY;

      // get the perpend dist. to camera plane:
      return (px > py) ? ((px * U) / rayDx) : ((py * U) / rayDy);
      
      // the following would give the fish eye effect instead
      // return sqrt(px * px + py * py);
    }
  }

  return 100 * U; // no hit found
}

void drawScreen(void)
{
  for (int y = 0; y < SCREEN_H; ++y)
  {
    int lineY = y - SCREEN_H / 2;

    lineY = lineY >= 0 ? lineY : (-1 * lineY);

    for (int x = 0; x < SCREEN_W; ++x)
      putchar((lineY >= wallHeight[x]) ? '.' : (wallDir[x] ? '/' : '#'));

    putchar('\n');
  }
}

int main(void)
{
  int camX = 10 * U + U / 4,
      camY = 9 * U + U / 2,
      camAngle = 600, // U => full angle (2 * pi)
      quit = 0;

  while (!quit)
  {
    int forwX = cos((2 * 3.14 * camAngle) / U) * U, // angle: U = full turn
        forwY = sin((2 * 3.14 * camAngle) / U) * U,
        vecFromX = forwX + forwY, // leftmost ray
        vecFromY = forwY - forwX,
        vecToX = forwX - forwY,   // rightmost ray
        vecToY = forwY + forwX;

    for (int i = 0; i < SCREEN_W; ++i) // process each screen column
    {
      // interpolate rays between vecFrom and vecTo
      int rayDx = (SCREEN_W - 1 - i) * vecFromX / SCREEN_W + (vecToX * i) / SCREEN_W,
          rayDy = (SCREEN_W - 1 - i) * vecFromY / SCREEN_W + (vecToY * i) / SCREEN_W,
          dir,
          dist = castRay(camX,camY,rayDx,rayDy,&dir);

      wallHeight[i] = perspective(dist);
      wallDir[i] = dir;
    }

    for (int i = 0; i < 10; ++i)
      putchar('\n');

    drawScreen();

    char c = getchar();

    switch (c) // movement
    {
      case 'a': camAngle += 30; break;
      case 'd': camAngle -= 30; break;
      case 'w': camX += forwX / 2; camY += forwY / 2; break;
      case 's': camX -= forwX / 2; camY -= forwY / 2; break;
      case 'q': quit = 1; break;
      default: break;
    }
  }

  return 0;
}

How to make this more advanced? Here are some hints and tips:


raycastlib

Raycastlib

TODO

The repository is available at https://codeberg.org/drummyfish/raycastlib.


raylib

Raylib

Raylib is a free, relatively KISS, portable C (C99) library intended mainly for game development, offering IO handling, 2D and 3D graphics, audio, loading of different image and 3D formats etc., while refraining from a lot of bullshit of "modern" bloated engines/frameworks such as having tons of dependencies and targeting only very fast computing platforms. Raylib is pretty cool and employs many of the LRS/suckless ideas, even though from our strict point of view it is still a bit more complex than it really needs to be, e.g. by using floating point and relying on GPU accelerated 3D graphics. In terms of bloat it can be seen as a mid way between the mainstream (e.g. Godot) and LRS/suckless (e.g. small3dlib).

The following are some features of raylib as of writing this. The good and neutral features seem to be:

And some of the bad features are:


reactionary_software

Reactionary Software

{ The "founder", fschmidt, sent me a link to his website on saidit after I posted about LRS. Here is how I interpret his take on technology -- as always I may misinterpret or distort something, for safety refer to the original website. ~drummyfish }

Reactionary software (reactionary meaning opposing the modern, favoring the old) is a kind of software/technology philosophy opposing modern technology and advocating simplicity as a basis of good technology (and possibly whole society); it is similar e.g. to suckless and our own less retarded software, though it's not as "hardcore" minimalist (e.g. it's okay with old versions of Java which we still consider kind of bloated and therefore bad). Just as suckless and LRS, reactionary software notices the unbelievably degenerated state of "modern" technology (reflecting the degenerate state of whole society) manifested in bloat, overengineering, overcomplicating, user abuse, ugliness, DRM, bullshit features, planned obsolescence, fragility etc., and advocates for rejecting it, for taking a step back to when technology was still sane (before 2000s). The website of reactionary software is at http://www.reactionary.software (on top it reads Make software great again!). There is also a nice forum at http://www.mikraite.org/Reactionary-Software-f1999.html (tho requires JS to register? WTF).

The biggest difference compared to suckless/LRS is that reactionary software focuses on the simplicity from user's point of view (as stated on their forums). This is not in conflict with our views, we just additionally see the simplicity of internals as just as important.

The founder of reactionary software is fschmidt and he still seems to be the one who mostly defines it (just like drummyfish is at the moment basically solo controlling LRS), though there is a forum of people who follow him. The philosophy can potentially be extended beyond just software, to other fields of endeavor and potentially whole society -- the discussion of reactionary software revolves around wide context, e.g. things like philosophy, religion and collapse of society (fschmidt made a post where he applies Old Testament ideas to programming).

fschmidt seems to be a lot into religion and also has some related side projects with wider scope, e.g. Arkians which deals with society and eugenics. It seems to be trying to establish a community of "chosen people" (those who pass certain tests) who selectively breed to renew good genes in society. { PLEASE DON'T JUMP TO CONCLUSIONS, I just quickly skimmed through it -- people will probably freak out and start calling that guy a Nazi -- please don't, read his site first. I can't really say more about it as I didn't research it well, but he doesn't seem to be proposing violent solutions. Peace. ~drummyfish }

What do we think about reactionary software? The vibes are good, it basically seems like "lightweight suckless" -- we agree with what they identify as causes of decline of modern technology, we like that they discuss wide context and the big picture and our solutions are aligned, in the same direction -- theirs are just not as radical, or maybe we just disagree on minor points. We may e.g. disagree on specific cases of software, for example they approve of old Python, Java and lightweight JavaScript used on the web -- we see such software as unacceptable, it's too complex, unnecessary and from the ground up designed badly. { As clarified on the forums, reactionary software focuses on the simplicity from user's perspective, not necessarily the simplicity of internals. ~drummyfish } Nevertheless we definitely see it as a good thing that this philosophy exists, it contributes to improving technology and it may provide an alternative to people who suffer from modern tech but for whom suckless or LRS is too difficult to get into. The fact that more and more smaller communities with ideas similar to LRS come to life indicates the ideas themselves are alive and start to flourish, in a decentralized way -- this is good.

Examples of reactionary software include (examples from the site itself):

See Also


README

Less Retarded Wiki

This is online at https://www.tastyfish.cz/lrs/main.html.

Wiki about less retarded software and related topics.

By contributing you agree to release your contribution under CC0 1.0, public domain (https://creativecommons.org/publicdomain/zero/1.0/). Please do not add anything copyrighted to this Wiki (such as copy pasted texts from elsewhere, images etc.).

Start reading at the main page.


recursion

Recursion

See recursion.

Recursion (from Latin recursio, "running back") in general is a situation in which a definition refers to itself; for example the definition of a human's ancestor as "the human's parents and the ancestors of his parents" (fractals are also a very nice example of what a simple recursive definition can achieve). In programming recursion takes on the meaning of a function that calls itself; this is the meaning we'll suppose in this article, unless noted otherwise.

We divide recursion into direct and indirect. In direct recursion the function calls itself directly; in indirect recursion a function A calls a function B which ends up (possibly through calls to further functions) calling A again. Indirect recursion is tricky because it may appear by mistake and cause a bug (which is nevertheless easily noticed as the program will most likely run out of memory and crash).
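
For illustration, indirect recursion may in C look e.g. like this (the classic, if inefficient, even/odd pair; the function names are of course made up):

int isOdd(unsigned int x); // forward declaration

int isEven(unsigned int x)
{
  return (x == 0) ? 1 : isOdd(x - 1); // isEven calls isOdd ...
}

int isOdd(unsigned int x)
{
  return (x == 0) ? 0 : isEven(x - 1); // ... which calls isEven again
}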

When a function calls itself, it starts "diving" deeper and deeper and in most situations we want this to stop at some point, so in most cases a recursion has to contain a terminating condition. Without this condition the recursion will keep recurring and end up in an equivalent of an infinite loop (which in case of recursion will however crash the program with a stack overflow exception). Let's see this on perhaps the most typical example of using recursion, a factorial function:

unsigned int factorial(unsigned int x)
{
  if (x > 1)
    return x * factorial(x - 1); // recursive call
  else
    return 1; // terminating condition
}

See that as long as x > 1, recursive calls are being made; with each call x is decremented so that inevitably x will at some point come to equal 1. Then the else branch of the condition will be taken -- the terminating condition has been met -- and in this branch no further recursive call is made, i.e. the recursion is stopped here and the code starts to return back up from the recursion.

Note that even in computing we can use an infinite recursion sometimes. For example in Haskell it is possible to define infinite data structures with a recursive definition; however this kind of recursion is intentionally allowed, it is treated as a mathematical definition and with correct use it won't crash the program.

Every recursion can be replaced by iteration and vice versa (iteration meaning a loop such as while). In fact some languages (e.g. functional ones) do not have loops and handle repetition solely by recursion. This means that you, a programmer, always have a choice between recursion and iteration, and here you should know that recursion is typically slower than iteration. This is because recursion has a lot of overhead: remember that every level of recursion is a function call that involves things such as pushing and popping values on stack, handling return addresses etc. The usual advice is therefore to prefer iteration, even though recursion can sometimes be more elegant/simple and if you don't mind the overhead, it's not necessarily wrong to go for it. Typically this is the case when you have multiple branches to dive into, e.g. in case of quicksort. In the above example of factorial we only have one recurring branch, so it's much better to implement the function with iteration:

unsigned int factorial(unsigned int x)
{
  unsigned int result = 1;
  
  while (x > 1)
  {
    result *= x;
    x--;
  }

  return result;
}

How do the computers practically make recursion happen? Basically they use a stack to remember states on each level of the recursion. In programming languages that support recursive function calls this is hidden behind the scenes in the form of call stack. This is why an infinite recursion causes stack overflow.

Another important type of recursion is tail recursion which happens when the recursive call in a function is the very last command. It is utilized in functional languages that use recursion instead of loops. This kind of recursion can be optimized by the compiler into basically the same code a loop would produce, so that e.g. stack won't grow tremendously.
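
For example a tail recursive version of the above factorial might in C look like the following sketch (using a helper function with an accumulator parameter; the names are made up):

unsigned int factorialHelper(unsigned int x, unsigned int acc)
{
  if (x <= 1)
    return acc; // the accumulator already holds the result

  return factorialHelper(x - 1, acc * x); // tail call: nothing left to do after it
}

unsigned int factorial(unsigned int x)
{
  return factorialHelper(x,1);
}

Because nothing remains to be done after the recursive call returns, a compiler performing tail call optimization can reuse the current stack frame and effectively compile this into a loop.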


reddit

Reddit

TODO

Typical reddit thread after SJW takeover looks like this:


rgb332

RGB332

RGB332 is a general 256 color palette that encodes one color with 1 byte (i.e. 8 bits): 3 bits (highest) for red, 3 bits for green and 2 bits (lowest) for blue (as human eye is least sensitive to blue). RGB332 is an implicit palette -- it doesn't have to be stored in memory because the color index itself determines the color and vice versa. Compared to the classic 24 bit RGB (which assigns 8 bits to each of the RGB components), RGB332 is very "KISS/suckless" and often good enough (especially with dithering) as it saves memory, avoids headaches with endianness and represents each color with just a single number (as opposed to 3), so it is often used in simple and limited computers such as embedded systems. It is also in the public domain, unlike some other palettes, so it's additionally a legally safe choice. RGB332 also has a "sister palette" called RGB565 which uses two bytes instead of one and so offers many more colors.

A disadvantage of plain 332 palette lies in the linearity of each component's intensity, i.e. lack of gamma correction, so there are too many almost indistinguishable bright colors while too few darker ones { TODO: does a gamma corrected 332 exist? make it? ~drummyfish }. Another disadvantage is the non-alignment of the blue component with red and green components, i.e. while R/G components have 8 levels of intensity and so step from 0 to 255 by 36.4, the B component only has 4 levels and steps by exactly 85, which makes it impossible to create exact shades of grey (which of course have to have all R, G and B components equal).

The RGB values of the 332 palette are following:

#000000 #000055 #0000aa #0000ff #002400 #002455 #0024aa #0024ff
#004800 #004855 #0048aa #0048ff #006d00 #006d55 #006daa #006dff
#009100 #009155 #0091aa #0091ff #00b600 #00b655 #00b6aa #00b6ff
#00da00 #00da55 #00daaa #00daff #00ff00 #00ff55 #00ffaa #00ffff
#240000 #240055 #2400aa #2400ff #242400 #242455 #2424aa #2424ff
#244800 #244855 #2448aa #2448ff #246d00 #246d55 #246daa #246dff
#249100 #249155 #2491aa #2491ff #24b600 #24b655 #24b6aa #24b6ff
#24da00 #24da55 #24daaa #24daff #24ff00 #24ff55 #24ffaa #24ffff
#480000 #480055 #4800aa #4800ff #482400 #482455 #4824aa #4824ff
#484800 #484855 #4848aa #4848ff #486d00 #486d55 #486daa #486dff
#489100 #489155 #4891aa #4891ff #48b600 #48b655 #48b6aa #48b6ff
#48da00 #48da55 #48daaa #48daff #48ff00 #48ff55 #48ffaa #48ffff
#6d0000 #6d0055 #6d00aa #6d00ff #6d2400 #6d2455 #6d24aa #6d24ff
#6d4800 #6d4855 #6d48aa #6d48ff #6d6d00 #6d6d55 #6d6daa #6d6dff
#6d9100 #6d9155 #6d91aa #6d91ff #6db600 #6db655 #6db6aa #6db6ff
#6dda00 #6dda55 #6ddaaa #6ddaff #6dff00 #6dff55 #6dffaa #6dffff
#910000 #910055 #9100aa #9100ff #912400 #912455 #9124aa #9124ff
#914800 #914855 #9148aa #9148ff #916d00 #916d55 #916daa #916dff
#919100 #919155 #9191aa #9191ff #91b600 #91b655 #91b6aa #91b6ff
#91da00 #91da55 #91daaa #91daff #91ff00 #91ff55 #91ffaa #91ffff
#b60000 #b60055 #b600aa #b600ff #b62400 #b62455 #b624aa #b624ff
#b64800 #b64855 #b648aa #b648ff #b66d00 #b66d55 #b66daa #b66dff
#b69100 #b69155 #b691aa #b691ff #b6b600 #b6b655 #b6b6aa #b6b6ff
#b6da00 #b6da55 #b6daaa #b6daff #b6ff00 #b6ff55 #b6ffaa #b6ffff
#da0000 #da0055 #da00aa #da00ff #da2400 #da2455 #da24aa #da24ff
#da4800 #da4855 #da48aa #da48ff #da6d00 #da6d55 #da6daa #da6dff
#da9100 #da9155 #da91aa #da91ff #dab600 #dab655 #dab6aa #dab6ff
#dada00 #dada55 #dadaaa #dadaff #daff00 #daff55 #daffaa #daffff
#ff0000 #ff0055 #ff00aa #ff00ff #ff2400 #ff2455 #ff24aa #ff24ff
#ff4800 #ff4855 #ff48aa #ff48ff #ff6d00 #ff6d55 #ff6daa #ff6dff
#ff9100 #ff9155 #ff91aa #ff91ff #ffb600 #ffb655 #ffb6aa #ffb6ff
#ffda00 #ffda55 #ffdaaa #ffdaff #ffff00 #ffff55 #ffffaa #ffffff

Operations

Here are C functions for converting RGB332 to RGB24 and back:

unsigned char rgbTo332(unsigned char red, unsigned char green, unsigned char blue)
{
  return ((red / 32) << 5) | ((green / 32) << 2) | (blue / 64);
}

void rgbFrom332(unsigned char colorIndex, unsigned char *red, unsigned char *green, unsigned char *blue)
{
  unsigned char value = (colorIndex >> 5) & 0x07;
  *red = (value * 255) / 7; // 8 levels: 0, 36, 72, 109, 145, 182, 218, 255

  value = (colorIndex >> 2) & 0x07;
  *green = (value * 255) / 7;

  value = colorIndex & 0x03;
  *blue = (value * 255) / 3; // 4 levels: 0, 85, 170, 255 (steps of exactly 85)
}
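
For example pure red, i.e. RGB (255,0,0), converts to index 0xe0 (binary 111 000 00) and back to exactly (255,0,0), while e.g. RGB (100,150,200) converts to 0x73 (011 100 11) and back to only the approximation (109,145,255).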

Addition/subtraction of two RGB332 colors can be performed by simply adding/subtracting the two color values as long as no over/underflow occurs in either component -- by adding the values we basically perform a parallel addition/subtraction of all three components with only one operation. Unfortunately checking for when exactly such overflow occurs is not easy to do quickly { Or is it? ~drummyfish }, but to rule out e.g. an overflow with addition we may for example check whether the highest bit of each component in both colors to be added is 0 (i.e. if (((color1 & 0x92) | (color2 & 0x92)) == 0) newColor = color1 + color2;). { Code untested. ~drummyfish }
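
If an exact result is needed, a simple (though slower) fallback is to just add the components separately and clamp them -- a small sketch (the function name is made up):

unsigned char add332(unsigned char c1, unsigned char c2)
{
  unsigned char r = (c1 >> 5) + (c2 >> 5),                   // 3 bit red
                g = ((c1 >> 2) & 0x07) + ((c2 >> 2) & 0x07), // 3 bit green
                b = (c1 & 0x03) + (c2 & 0x03);               // 2 bit blue

  // clamp the components so they don't overflow into each other:
  return ((r > 7 ? 7 : r) << 5) | ((g > 7 ? 7 : g) << 2) | (b > 3 ? 3 : b);
}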

Addition/subtraction of colors can also be approximated in a very fast way using the OR/AND operation instead of arithmetic addition/subtraction -- however this only works sometimes (check visually). For example if you need to quickly brighten/darken all pixels in a 332 image, you can just OR/AND each pixel with these values:

brighten by more:  doesn't really work anymore
brighten by 3:     | 0x6d  (011 011 01)
brighten by 2:     | 0x49  (010 010 01)
brighten by 1:     | 0x24  (001 001 00)
darken by 1:       & 0xdb  (110 110 11)
darken by 2:       & 0xb6  (101 101 10)
darken by 3:       & 0x92  (100 100 10)
darken by more:    doesn't really work anymore

{ TODO: Would it be possible to accurately add two 332 colors by adding all components in parallel using bitwise operators somehow? I briefly tried but the result seemed too complex to be worth it though. ~drummyfish }

Inverting a 332 color is done simply by inverting all bits in the color value (i.e. invertedColor = ~color, or equivalently color ^ 0xff).

TODO: blending?

See Also


rgb565

RGB565

RGB565 is a way of representing a total of 65536 colors in just 2 bytes, i.e. 16 bits, by using 5 bits (highest) for red, 6 bits for green (to which human eye is most sensitive) and 5 bits for blue; it can also be seen as a color palette.

TODO


right

Right

See left vs right.


rms

Richard Stallman

{ RMS is a legend and overall a great human, but let's be reminded we shouldn't be creating any heroes or celebrities. ~drummyfish }

The great doctor Richard Matthew Stallman (RMS, also GNU/Stallman and saint IGNUcius, born 1953 in New York) is one of the biggest figures in software history, inventor of free software, founder of the GNU project, Free Software Foundation, a great hacker and the author of a famous text editor Emacs. He is a non-religious Jew and an atheist (though he is the highest saint of Church Of Emacs), a man who firmly stands behind his beliefs and who's been advocating for ethics and user freedom in the computing world. He has also been called the king of software cloning, for he started the wave of making free, ethical clones of proprietary programs.

   _..._
  /   \ \
 (= =  )))
 (.-._ ) ))
 /    \\  \
 ".,,,;,''' 

ASCII art of Richard Stallman

Stallman's life along with free software's history is documented by a free-licensed book named Free as in Freedom: Richard Stallman's Crusade for Free Software on which he collaborated. You can get it gratis e.g. at Project Gutenberg. You should read this!

tl;dr: At 27 as an employee at MIT AI labs Stallman had a bad experience when trying to fix a Xerox printer whose proprietary software source code was made inaccessible; he also started spotting the betrayal of hacker principles by others who decided to write proprietary software -- he realized proprietary software was inherently wrong as it prevented studying, improvement and sharing of software and enabled abuse of users. From 1982 he was involved in a "fight" against the Symbolics company that pushed aggressive proprietary software; he was rewriting their software from scratch to allow Lisp Machine users more freedom -- here he proved his superior programming skills as he was keeping up with the whole team of Symbolics programmers. By 1983 his frustration reached its peak and he announced his GNU project on the Usenet -- this was a project to create a completely free as in freedom operating system, an alternative to the proprietary Unix system that would offer its users freedom to use, study, modify and share the whole software, in the hacker spirit. He followed by publishing a manifesto and establishing the Free Software Foundation. GNU and FSF popularized and standardized the term free (as in freedom) software, copyleft and free licensing, mainly with the GPL license. In the 90s GNU adopted the Linux operating system kernel and released a complete version of the GNU operating system -- these are nowadays known mostly as "Linux" distros. As a head of FSF and GNU Stallman more or less stopped programming and started traveling around the world to give talks about free software and has earned his status of one of the most important people in software history.

Regarding software Stallman has for his whole life strongly and tirelessly promoted free software and copyleft and has himself only used free software; he has always practiced what he preached and set the best example of how to live without proprietary software. This is amazing. Nevertheless he isn't too concerned about bloat (judging by the GNU software and his own creation, Emacs) and he also doesn't care that much about free culture (some of his written works prohibit modification and his GNU project allows proprietary non-functional data). Sadly he has also shown signs of being a type A fail personality by writing about some kind of newspeak "gender neutral language" and by seeming to be caught in a fight culture. On his website he also has an American flag and claims to be a patriot, i.e. leaning to nationalism and therefore fascism. Nevertheless he definitely can't be accused of populism or hypocrisy as he basically tells what he considers to be the truth no matter what, and he is very consistent in this. Some of his unpopular opinions brought him a lot of trouble, e.g. the wrath of SJWs in 2019 for his criticism of the pedo witch hunt.

He is a weird guy, having been recorded on video eating dirt from his feet before giving a lecture. In the book Free as in Freedom he admits he might be slightly autistic. Nevertheless he's pretty smart, has a magna cum laude degree in physics from Harvard, 10+ honorary doctorates, fluently speaks English, Spanish and French and a little bit of Indonesian and has many times proven his superior programming skills (even though he later stopped programming to fully work on promoting the FSF).

Stallman has a beautifully minimalist website http://www.stallman.org where he actively comments on current news and issues. He also made the famous free software song (well, only the lyrics, the melody is taken from a Bulgarian folk song Sadi Moma).

In 2019 Stallman was cancelled by SJW fascists for merely commenting rationally on the topic of child sexuality following the Epstein scandal. He resigned from the position of president of the FSF but continues to support it.

Stallman has been critical of capitalism though he probably isn't a hardcore anticapitalist (he's an American after all). Wikidata states he's a proponent of alter-globalization (not completely against globalization in certain areas but not supporting the current form of it).

In the book Free As In Freedom it is also mentioned that Stallman had an aversion to passwords and secrecy in general -- at MIT he used the username RMS with the same password so that other people could easily log in through his account and access ARPANET (the predecessor of Internet). Indeed, we applaud this.

As anarchists we of course despise the idea of worshiping people, creating heroes and cults of personalities, but the enormous historical significance of Stallman has to be stressed as a plain and simple fact. Even though in our days his name is overshadowed in the mainstream by rich businessmen and creators of commercially successful technology and even though we ourselves disagree with Stallman on some points, in the future history may well see Stallman as perhaps the greatest man of the software era, and rightfully so. Stallman isn't a mere creator of a commercially successful software product, he is literally as important as the great philosophers of ancient Greece -- he brilliantly foresaw the course of history and quickly defined ethics needed for the new era of mass available programmable computers, and not only that, he also basically alone established this ethics as a standard IN SPITE of all the world's corporations fighting back. He is also extremely unique in not pursuing self interest, in TRULY living his own philosophy, dedicating his whole life to his cause and refusing to give in even partially. All of this is at much higher level than simply becoming successful and famous within the contemporary capitalist system, his life effort is pure, true and timeless, unlike things achieved by pieces of shit such as Steve Jobs.


rock

Rock

Rocks and stones are natural formations of minerals that can be used to create the most primitive technology. Stone age was the first stage of our civilization; it was characterized by use of stone tools. Rock nerds are called geologists.

Rocks are pretty suckless and LRS because they are simple, they are everywhere, free and can be used in a number of ways such as:

See Also


ronja

Ronja

{ I was informed about this by a friend over email <3 I basically paraphrase here what he told me. See also http://www.modulatedlight.org/. ~drummyfish }

Ronja (reasonable optical near joint access) is a free/open KISS device for wireless connection of two devices using light (i.e. optical communication) and the ethernet protocol; it can be made at home (for about $100), doesn't require any MCUs and as such can be considered a LRS/suckless alternative to traditional WiFi routers that are de-facto owned and controlled by corporations. It works full duplex up to the distance of 1400 meters with a speed of 10 Mbps, which is pretty amazing. One can also imagine Ronja as a kind of ethernet cable, just implemented with light instead of electricity. The design is released under GFDL. The project website is at http://ronja.twibright.com/.

There are many advantages in Ronja -- besides the mentioned KISS design and all its implications (freedom, repairability, cheap price, compatibility, ...), Ronja doesn't use radio so there are no bullshit issues with legal bands etc., it also works with just an ethernet card, offers a stable and constant transmission speed with very low latency, can be potentially harder to block with jammers and to spy on: besides visible light the transmission can also use infrared spectrum and narrow direction of transmission, as opposed to radiating to all directions like wi-fis, also the fast flickering of the LED is unnoticeable by humans or even normal cameras, therefore Ronja transmission is expensive to detect. Also note that some kind of protocol-level encryption can be used above Ronja, if one so desires. This makes it a nice communication tool for people under oppressive regimes like those in China or USA.

See Also


rsa

RSA

TODO

generating keys:

  1. p := large random prime
  2. q := large random prime
  3. n := p * q
  4. f := (p - 1) * (q - 1) (this step may differ in other versions)
  5. e := 65537 (most common, other constants exist)
  6. d := solve for x: x * e = 1 mod f
  7. public key := (n,e)
  8. private key := d

message encryption:

  1. m := message encoded as a number < n
  2. encrypted := m^e mod n

message decryption:

  1. m := encrypted^d mod n
  2. decrypted := decode message from number m
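
The following is a toy demonstration of the above math in C, using the well known textbook values p = 61, q = 53 (of course real RSA uses primes hundreds of digits long and bignum arithmetic, this only shows the principle; the small e = 17 is used here instead of 65537 so the numbers stay tiny):

#include <stdio.h>

// computes (base^exp) mod m by repeated squaring
unsigned long modPow(unsigned long base, unsigned long exp, unsigned long m)
{
  unsigned long result = 1;
  base %= m;

  while (exp > 0)
  {
    if (exp & 1)
      result = (result * base) % m;

    base = (base * base) % m;
    exp >>= 1;
  }

  return result;
}

int main(void)
{
  unsigned long p = 61, q = 53,
                n = p * q,             // n = 3233
                f = (p - 1) * (q - 1), // f = 3120
                e = 17,                // coprime to f (toy value)
                d = 2753,              // (d * e) mod f = 1
                m = 65;                // message, must be < n

  unsigned long encrypted = modPow(m,e,n),           // = 2790
                decrypted = modPow(encrypted,d,n);   // = 65 again

  printf("%lu -> %lu -> %lu\n",m,encrypted,decrypted);

  return 0;
}
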

rule110

Rule 110

Not to be confused with rule 34 xD

Rule 110 is a specific cellular automaton (similar to e.g. Game of Life) which shows a very interesting behavior -- it is one of the simplest Turing complete (computationally most powerful) systems with a balance of stable and chaotic behavior. In other words it is a system in which very complex and interesting properties emerge from extremely simple rules. The name rule 110 comes from the truth table that defines the automaton's behavior.

Rule 110 is one of 256 so called elementary cellular automata which are special kinds of cellular automata that are one dimensional (unlike the mentioned Game Of Life which is two dimensional), in which cells have 1 bit state (1 or 0) and each cell's next state is determined by its current state and the state of its two immediate neighboring cells (left and right). Most of the 256 possible elementary cellular automata are "boring" but rule 110 is special and interesting. Probably the most interesting thing is that rule 110 is Turing complete, i.e. it can in theory compute anything any other computer can, while basically having just 8 rules. 110 (along with its equivalents) is the only elementary automaton for which Turing completeness has been proven.

For rule 110 the following is a table determining the next value of a cell given its current value (center) and the values of its left and right neighbor.

left center right next
1 1 1 0
1 1 0 1
1 0 1 1
1 0 0 0
0 1 1 1
0 1 0 1
0 0 1 1
0 0 0 0

The rightmost column is where elementary cellular automata differ from each other -- here reading the column from top to bottom we get the binary number 01101110 which is 110 in decimal, hence we call the automaton rule 110. Some automata behave as "flipped" versions of rule 110, e.g. rule 137 (the black-white complement of rule 110) and rule 124 (horizontal reflection of rule 110) -- these are in terms of properties equivalent to rule 110.

The following is an output of 32 steps of rule 110 from an initial tape with one cell set to 1. Horizontal dimension represents the tape, vertical dimension represents steps/time (from top to bottom).

#                                                               
##                                                              
###                                                             
# ##                                                            
#####                                                           
#   ##                                                          
##  ###                                                         
### # ##                                                        
# #######                                                       
###     ##                                                      
# ##    ###                                                     
#####   # ##                                                    
#   ##  #####                                                   
##  ### #   ##                                                  
### # ####  ###                                                 
# #####  ## # ##                                                
###   ## ########                                               
# ##  ####      ##                                              
##### #  ##     ###                                             
#   #### ###    # ##                                            
##  #  ### ##   #####                                           
### ## # #####  #   ##                                          
# ########   ## ##  ###                                         
###      ##  ###### # ##                                        
# ##     ### #    #######                                       
#####    # ####   #     ##                                      
#   ##   ###  ##  ##    ###                                     
##  ###  # ## ### ###   # ##                                    
### # ## ###### ### ##  #####                                   
# ########    ### ##### #   ##                                  
###      ##   # ###   ####  ###                                 
# ##     ###  ### ##  #  ## # ##

The output was generated by the following C code.

#include <stdio.h>

#define RULE 110 // 01101110 in binary
#define TAPE_SIZE 64
#define STEPS 32

unsigned char tape[TAPE_SIZE];

int main(void)
{
  // init the tape:
  for (int i = 0; i < TAPE_SIZE; ++i)
    tape[i] = i == 0;
    
  // simulate:
  for (int i = 0; i < STEPS; ++i)
  {
    for (int j = 0; j < TAPE_SIZE; ++j)
      putchar(tape[j] ? '#' : ' ');  
      
    putchar('\n');

    unsigned char state = // three cell state
      (tape[1] << 2) | 
      (tape[0] << 1) |
      tape[TAPE_SIZE - 1];

    for (int j = 0; j < TAPE_SIZE; ++j)
    {
      tape[j] = (RULE >> state) & 0x01;
      state = (tape[(j + 2) % TAPE_SIZE] << 2) | (state >> 1);
    }
  }

  return 0;
}

Discovery of rule 110 is attributed to Stephen Wolfram who introduced elementary cellular automata in 1983 and conjectured Turing completeness of rule 110 in 1986 which was proven by Matthew Cook in 2004.

See Also


rust

Rust

Rust is an extremely poor attempt at a politically motivated capitalist programming language and one of the prime examples of badly designed software in general. It is extremely harmful not just because of its awful design and implementation and motivation, it also promotes toxic politics, tries to replace relatively good languages such as C and, worst of all, is gaining popularity among highly unqualified coding monkeys, i.e. the majority of people creating technology nowadays, so it is infecting everything and contributing to the downfall of technology. FOR THE LOVE OF GOD STAY AS FAR AWAY AS POSSIBLE FROM RUST.

LMAO https://github.com/mTvare6/hello-world.rs

It should be made clear that rust is shit AND CANNOT BE FIXED, it is awful from the ground up and the only way to deal with it is to delete it. To mention just a few issues:


saf

SAF

SAF is a LRS C library for small, very portable games (and possibly other kinds of programs); it can also be seen as a fantasy console. It was made by drummyfish. The repository is available e.g. at https://codeberg.org/drummyfish/SAF. MicroTD is an example of a LRS game made with SAF, but even more complex games such as Anarch have been ported to it.

The whole SAF library is implemented in a single header file and offers a simple API for programming games, i.e. functions for drawing pixels to the screen, playing sounds, reading buttons, computing the sine function etc., and it handles boring/annoying issues such as the game main loop. It also has a number of built-in frontends that allow compiling a correctly made SAF game for any supported platform among which are SDL, SFML, X11, ncurses (terminal), even open consoles such as Pokitto, Arduboy or Gamebuino META. There is an option to auto-convert a color game to black-and-white only displays. Some PC frontends add emulator-like features such as time manipulation, pixelart upscaling, TAS support and so on.

Games made with SAF run in 64x64 resolution, with 256 colors (332 palette) and 25 FPS. They can use 7 input buttons (arrows, A, B and C), play 4 different sound effects and use 32 bytes of persistent memory, e.g. for savegames or highscores. These relatively low specifications are set on purpose so as to help portability, reduce bloat, frustration and friction. Many times good, entertaining games can be very simple, as is the case e.g. with Tetris, chess, various shmups, roguelikes and so on -- these are the kinds of games SAF is ideal for.
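
For a rough idea, a minimal SAF program may look something like the following sketch (NOTE: the macro and function names here are written from memory and may be inaccurate, consult saf.h for the actual API):

#define SAF_PROGRAM_NAME "demo" // assumed setup macros, see saf.h
#define SAF_PLATFORM_SDL2 1     // select a frontend
#include "saf.h"

void SAF_init(void) // called once on startup
{
}

uint8_t SAF_loop(void) // called every frame, 25 times per second
{
  SAF_drawPixel(32,32,0xff); // white pixel in the middle of the 64x64 screen
  return 1; // non-zero means keep running
}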

      []  [][][][][]
      [][][]      [][]
      [][]          []
      []    XX    XX[]
      []      XXXX  []
      [][]          []
      [][][]      [][]
      []  [][][][][] 

SAF logo

 microTD
  ________________________________________________________________
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;' ';; ' ' ;;' ';;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;' ';,   ,;,   ,;,   ,;;';;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;,;''   '''   '''   '', ,;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;',''''''''''''''''''',';;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;; ' ,;';;';, ;';'; ;';, ; ;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;; ' ;; ;; ;;   ;   ;  ; ; ;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;; ' ;; ,,;;;   ;   ; ,; ; ;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;,', ''''''    '   ''' ,',;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;,' ' ' ' ' ' ' ' ' ',;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 | ';;;' , ';;;' , ';;;' , ';;;' , ';;;' , ';;;' , ';;;' , ';;;' ,|
 |;, ' ,;;;, ' ,;;;, ' ,;;;, ' ,;;;, ' ,;;;, ' ,;;;, ' ,;;;, ' ,;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |                  ,, ,  ,,,   ,,,          ,,                   |
 |                  ;'';  ;,;   ;,;           ;                   |
 |                  '  '  ' '   '            '''                  |
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;|
 |;';;;;;;'';;;;;;;;;;';;;;;;;;;;;;;;;;''';;;;;;;;;;;;;;;;;;;;;;;;|
 |; ,';;; ',,;;; ,;;;, ,;;;;,;;;;;;;;;;,, ;;;;;;;;;;;;;;;;;;;;;;;;|
 |;,,;;;;;,,;;;,,;;;;;,,;;;;,;;;;;;;;;;,,,;;;;;;;;;;;;;;;;;;;;;;;;|
  ----------------------------------------------------------------

screenshot of a SAF game (uTD) running in terminal with ncurses


sanism

Sanism

Sanism is an absolutely crazy idea made up by the most insane SJWs that says that words like "crazy" and "insane" are "offensive" or even "discriminatory" against mentally ill people and that we should censor such words so as to stay politically correct. Yes, this is pretty fucked up, but give it a year or two and it's gonna become mainstream.

LMAO imagine future news be like "Mentally divergent age fluid human people person of unspecified non-hexadecimal gender and afro american ethno-social-construct was arrested after an incident involving guns and liquor stores. No harm was intended during saying this sentence and we apologize in advance for any mental harm that may have been caused to mentally sensitive people persons by hearing this sentence."


science

Science

Not to be confused with soyence.

Science (from Latin scientia, knowledge or understanding) in a wide sense means systematic gathering, inference and organization of knowledge, in a more strict, "western" sense this process has to be kept rational by obeying some specific strict rules and adhering to whatever principles of objectivity are currently set, nowadays for example the scientific method and peer censorship or mathematical proof. Sciences in the strict sense include mathematics (so called formal science), physics, biology, chemistry, computer science, as well as "soft sciences" such as psychology, sociology etc. Science is not to be confused with pseudoscience (such as gender studies or astrology) and soyence (political propaganda masked as "science", e.g. gender studies, sponsored "science" of big pharma etc.) -- it must be remembered that when science can no longer be questioned, it ceases to be science, as asking questions and examining EVERYTHING are the very basic premises of a true science: this means that anything prohibited to be questioned, by law or otherwise (e.g. by cancel culture), such as the Holocaust (forbidden to be denied in many countries such as Germany), COVID vaccines, racial differences (prohibited on grounds of "hate speech") and similar topics CANNOT be seen as scientifically established, but rather politically established. In the wider sense science may include anything that involves systematic intellectual research, e.g. Buddhists often say their teaching is science rather than religion, that it is searching for objective truths, and it really is true -- a western fedora atheist will shit himself in rage hearing such claim, however that's all he can really do.

TODO: some noice tree of sciences or smth

There is no simple objective definition of a strict science -- the definition of science is hugely arbitrary, political and changes with development of society, technology, culture, changes in government and so on. Science should basically stand for the most rational and objective knowledge we're able to practically obtain about something, however the specific criteria for this are unclear and have to be agreed on. The scientific method is evolving and there are many debates over it, with some even stating that there can be no universal method of science. The p-value used to determine whether measurements are statistically significant has basically just an arbitrarily set value for what's considered a "safe enough" result. Some say that if a research is to be trusted, it has to be peer reviewed, i.e. that what's scientific has to be approved by chosen experts -- this may be not just because people can make mistakes but also because in current highly competitive society there appears science bloat, obscurity and tendencies to push fake research and purposeful deception, i.e. our politics and culture are already defining what science is. However the stricter the criteria for science, the more monopolized, centralized, controlled and censored it becomes.

Science is not almighty as brainwashed internet euphoric kids like to think, that's a completely false idea fed to them by the overlords who abuse "science" (soyence) for control of the masses, as religion was and is still used -- soyence is the new religion nowadays. Yes, (true) science is great, it is an awesome tool, but it is just that -- a tool, usable for SOME tasks, not a silver bullet that could be used for everything. What can be discovered by science is in fact quite limited, exactly because it purposefully LIMITS itself only to accept what CAN be proven and so remains silent about everything else (which however doesn't mean there lies no knowledge or value in the everything else or in other approaches to learning) -- see e.g. Godel's incompleteness theorems that state it is mathematically impossible to really prove validity of mathematics, or the nice compendium of all knowability limitations at http://humanknowledge.net/Thoughts.html. For many (if not most) things we deal in life science is either highly impractical (do you need to fund a peer reviewed research to decide what movie you'll watch today?) or absolutely useless (setting one's meaning of life, establishing one's basic moral axioms, placing completely random bets, deciding to trust or distrust someone while lacking scientifically relevant indicators for either, answering metaphysical questions such as "Why is there ultimately something rather than nothing?" etc.). So don't be Neil de Grass puppet and stop treating science as your omnipotent pimplord, it's just a hammer useful for bashing some specific nails.

What should we accept as "legit" science? We, in the context of our ideal society, argue for NOT creating a strict definition of science, just as we are for example against "formalizing morality" with laws etc. There are no hard lines between good and evil, fun and boring, useful and useless, bloated and minimal, and so also there is no strict line between science and non-science. What is and is not science is to be judged on a case-by-case basis and can be disagreed on without any issue, science cannot be a mass produced stream of papers that can automatically be marked OK or NOT OK. We might define the term less retarded science so as to distinguish today's many times twisted and corrupted "science/soyence" from the real, good and truly useful science. Less retarded science should follow similar principles as our technology, it should be completely free as in freedom, selfless, suckless as much as possible, unobscured etc. -- especially stressed should be the idea of many people being able to reproduce less retarded science; e.g. Newton's law of gravitation is less retarded because it can easily be verified by anyone, while the existence of Higgs boson is not.

Never confuse trusting scientists with trusting science (especially in capitalism and other dystopias), the former is literally faith (soyence), no different from blindly trusting religious preachers and political propaganda, the latter means only trusting that which you yourself can test and verify at home and therefore having real confidence. Also do NOT confuse or equate science with academia. As with everything, under capitalism academia has become rotten to the core, research is motivated by profit and what's produced is mostly utter bullshit shat out by wannabe PhDs who need to mass produce "something" as a part of the crazy academia publish-or-perish game. As with everything in capitalism, the closer you look, the more corruption you find.


sdf

Signed Distance Function

Signed distance function (SDF, also signed distance field) is a function that for any point in space returns its distance to the closest point of some geometric shape, and also the information whether that point is outside or inside that shape (if inside, the distance is negative, outside it's positive and exactly on the surface it is zero -- hence signed distance function). SDFs find use in elegantly representing some surfaces and solving some problems, most notably in computer graphics, e.g. for smoothly rendering upscaled fonts, in raymarching, implementing global illumination, as a 3D model storage representation, for collision detection etc. SDFs can exist anywhere where distances exist, i.e. in 2D, 3D, even non-Euclidean spaces etc. (and also note the distance doesn't always have to be Euclidean distance, it can be anything satisfying the axioms of a distance metric, e.g. taxicab distance).

Sometimes SDF is extended to also return additional information, e.g. the whole vector to the closest surface point (i.e. not only the distance but also a direction towards it) which may be useful for specific algorithms.

What is it all good for? We can for example implement quite fast raytracing-like rendering of environments for which we have a fast SDF. While traditional raytracing has to somehow test each ray for potential intersection against all 3D elements in the scene, which can be slow (and complicated), with SDF we can perform so called raymarching, i.e. iteratively stepping along the ray according to the distance function (which hints us on how big of a step we can make so that we can potentially quickly jump over big empty areas) until we get close enough to a surface which we interpret as an intersection -- if the SDF is fast, this approach may be pretty efficient (Godot implemented this algorithm to render real-time global illumination and reflections even in GPUs that don't support accelerated raytracing). Programs for rendering 3D fractals (such as Mandelbulber) work on this principle as well. SDFs can also be used as a format for representing shapes such as fonts -- there exists a method (called multi-channel SDF) that stores font glyphs in bitmaps of quite low resolution that can be rendered at arbitrary scale in a quality almost matching that of the traditional vector font representation -- the advantage over the traditional vector format is obviously greater simplicity and better compatibility with GPU hardware that's optimized for storing and handling bitmaps. Furthermore we can trivially increase or decrease weight (boldness) of a font represented by SDFs simply by adjusting the rendering distance threshold. SDFs can also be used for collision detection and many other things. One advantage of using SDFs is their generality -- if we have an SDF raymarching algorithm, we can plug in any shape and environment just by constructing its SDF, while with traditional raytracing we normally have to write many specialized algorithms for detecting intersections of rays with different kinds of shapes, i.e. we have many special cases to handle.

How is an SDF implemented? Well, it's a function, it can be implemented however we wish and need, it depends on each case, but we probably want it to be fast because algorithms that work with SDFs commonly call it often. SDF of simple mathematical shapes (and their possible combinations such as unions, see e.g. CSG), e.g. spheres, can be implemented very easily (SDF of a sphere = distance to the sphere center minus its radius); even the already mentioned 3D fractals have functions that can be used to quickly estimate the distance towards their surface. Other times -- e.g. where arbitrary shapes may appear -- the function may be precomputed into some kind of N dimensional array, we might say we use a precomputed look up table. This can be done in a number of ways, but as a simple example: when raymarching mirror reflections we may subdivide the 3D scene into a grid and into each cell store the SDF value at its center point (which here may be computed by even a relatively slow algorithm), which will allow relatively fast search of intersections of rays with the surface (at any point along the ray we may check the SDF value of the current cell which will likely provide information on how big a step we can make next).

. . . . . . . .     3 2 2 2 2 2 2 2
. . . . . . . .     3 2 1 1 1 1 1 1
. . . X X X X X     2 2 1 0 0 0 0 0
. . . X X X X X     2 1 1 0-1-1-1 0
. . X X X X X X     2 1 0 0-1-2-1 0
. . X X X X X X     2 1 0-1-1-2-1 0
. . X X X X X X     2 1 0-1-1-1-1 0
. . X X X X X X     2 1 0 0 0-1-1 0
. . . . X X X X     2 1 1 1 0 0 0 0
. . . . . . . .     2 2 2 1 1 1 1 1

Shape (left) and its SDF (right, distances rounded to integers).
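
To make the raymarching principle from above concrete, here is a tiny self-contained sketch in C with the simplest possible SDF, that of a unit sphere (all names and constants here are just illustrative):

#include <math.h>
#include <stdio.h>

float sphereSDF(float x, float y, float z) // unit sphere at origin
{
  return sqrtf(x * x + y * y + z * z) - 1; // dist. to center minus radius
}

// marches along a ray, returns the distance to a hit or -1
float march(float oX, float oY, float oZ,  // ray origin
            float dX, float dY, float dZ)  // ray direction (normalized)
{
  float t = 0;

  for (int i = 0; i < 64; ++i) // at most 64 steps
  {
    float d = sphereSDF(oX + t * dX,oY + t * dY,oZ + t * dZ);

    if (d < 0.001f) // close enough to the surface, count it as a hit
      return t;

    t += d;         // safe step: nothing can be closer than d

    if (t > 100)    // marched too far, give up
      break;
  }

  return -1;
}

int main(void)
{
  printf("%f\n",march(0,0,-3,0,0,1)); // prints approx. 2
  return 0;
}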

SDFs in computer graphics were being explored a long time ago but seem to have started to become popular around the year 2000 when Frisken et al used adaptive SDFs as an efficient representation for 3D models preserving fine details. In 2007 Valve published a paper at SIGGRAPH showing the bitmap representation of SDF shapes that they integrated into their Source engine.


security

Security

"As passwords first appeared at the MIT AI Lab I decided to follow my belief that there should be no passwords... I don't believe it's desirable to have security on a computer." -- Richard Stallman (from the book Free As In Freedom)

Computer security (also cybersecurity) is the study of designing computer systems so as to make them hard to "attack" (which usually means accessing "sensitive" information, manipulating it or destabilizing the system itself). At the dawn of the computer era security wasn't such a big deal as society didn't depend on computers so much and the damage one could cause by exploiting them was limited, however once consumer technology became forced by capitalism and put into EVERYTHING -- companies, governments, streets, homes, clothes, even human bodies and things that can work better without such technology (see e.g. Internet of stinks) -- security became another bullshit issue of society as cracking now theoretically allows not only killing individuals but wiping whole countries off the map. Recently security has become a lot concerned with ensuring digital "privacy".

If you want security, the most basic thing to do is to disconnect from the Internet.

Security is in its essence an unnecessary bullshit. It shouldn't exist, the need for more security comes from the fact we live in a shitty dystopia. In a good society there would be no need for security and people could spend their time by solving real problems. We, LRS, advocate NOT for increasing security (which leads to things like police states, censorship, bloat etc.), but for decreasing the need for it, i.e. steering society towards a better direction.


see_through_clothes

See Through Clothes

TODO: tech for seeing through clothes


selflessness

Selflessness

Selflessness means acting with the intent of helping others without harming them, gaining edge over them or taking advantage of them in any way. It is the opposite of self interest. Selflessness is the basis of an ideal society and good technology (while sadly self interest is the basis of our current dystopian capitalist society).

Selflessness is about the intent behind behavior rather than about the result of the behavior; for example being a vegetarian (or even vegan) for ethical reasons (to spare animals of suffering) is selfless while being a vegetarian only because of one's health concerns is not selfless. Similarly if a selfless behavior unpredictably results in harming someone, it is still a selfless behavior as long as the intent behind it was pure. (Note that this does NOT at all advocate the "ends justify the means" philosophy which acts with an intent to hurt someone.)

In the real world absolutely pure selflessness may be very hard to find, partly because such behavior by definition seeks no recognition. Acts of sacrificing one's life for another may a lot of times be seen as selfless, but not always (saving one's child in such way may just serve perpetuating own genes, it can also be done to posthumously increase one's fame etc.). An example of high selflessness may perhaps be so called Langar, a big community kitchen run by Sikhs that prepare and serve free vegetarian food to anyone who comes without differentiating between religious beliefs, skin color, social status, gender etc. Sikhs sometimes also similarly offer a place to stay etc. The mentioned ethical vegetarianism and veganism is another example of selflessness, as well as LRS itself, of course.

Selflessness doesn't mean one seeks no reward, there is practically always at least one reward for a selflessly behaving individual: the good feeling that comes from the selfless action. Selfless acting may also include physical rewards, for example if a programmer dedicates years of his life to developing a free public domain software that will help all people, he himself will get the benefits of using that program. The key thing is that he doesn't use the program to harm others, e.g. by charging money for it or even by using a license that forces others to credit him and so increase his reputation. He sacrificed part of his life purely to increase good in the world for everyone without trying to gain an edge over others.

The latter is important to show that what's many times called selflessness nowadays is only pseudoselflessness, fake selflessness. This includes e.g. all the celebrities who publicly financially support charities; this seems like a nice gesture but it's of course just a PR stunt, the money spent on charities is money invested into promoting oneself, increasing fame, sometimes even tax hacking etc. This also goes for professional firefighters, doctors, FOSS programmers that use licenses with conditions such as attribution etc. This is not saying the behavior of such people is always pure evil, just that it's not really selfless.

Selfless programs and art should be put into the public domain with waivers such as CC0. Using licenses (free or not) that give the programmer some advantage over others (even e.g. attribution) is not selfless.

See Also


semiconductor

Semiconductor

{ For a physicist there's probably quite a lot of simplification, this is written from the limited point of view of a programmer. ~drummyfish }

Semiconductors are materials whose electrical conductivity varies greatly with conditions such as temperature, illumination or their purity, unlike insulators which generally don't conduct electricity very well (have a high resistivity) and conductors which do. Semiconductors, especially silicon (Si), are the key component of digital electronic computers and integrated circuits. Other semiconductors include germanium, selenium or compound ones (composed of multiple elements).

Semiconductors are important for computers because they help implement the binary logic circuits, they can behave like a switch that is either on (1) or off (0). Besides that they can serve e.g. for making measurements (a component whose resistivity depends on its illumination can be used to measure amount of light by measuring the resistivity). Especially important electronic components based on semiconductors are the diode (lets current flow only one way) and transistor (a purely electrical "switch" that can be made extremely tiny).

Normally semiconductors don't conduct so well at room temperature (they conduct better at higher temperatures, unlike metals), but their conductivity can be increased by so called doping -- introducing small impurities of other elements. By doing this we can get two types of semiconductors:

If we connect P and N type semiconductors, we get a so called PN junction which only conducts current one way and is used in diodes and transistors. After putting P and N materials together, at the boundary some electrons from the N type material fill the holes in the P type material which creates a small depletion region of certain width. This region is an electric field that's negative on the P side and positive on the N side (because negative electrons have moved from N to P). If we connect the PN junction to a voltage source, with the P side to the positive and the N side to the negative terminal, we create an opposite electric field which eliminates the depletion region and allows the flow of current. Connecting the sides the other way around will result in increasing the width of the depletion region and blocking the current flow.


settled

Settled

When a software is so good it makes you stop searching further for any other software in the same category, it makes the debate settled -- you've found the ultimate tool, you're now free, you no longer have to think about the topic or invest any more precious time into trying different alternatives. You've found the ultimate, nearly perfect tool for the job, the one that makes the others obsolete. Settling the competition of different tools is one of the goals of good technology as it frees users as well as programmers.

For example Vim often settles the search for a text and programming editor for many programmers.

Nevertheless some soyboys just like hopping and switching their tools just because, they simply like wasting their time with things like distrohopping etc. There's no help to these people.

{ software that made the debate settled for me: vim, dwm, st, badwolf ~drummyfish }


shader

Shader

Shader is a program running on the graphics processing unit (GPU), typically in many parallel instances so as to utilize the GPU's highly parallel nature. As such shaders are usually programs of simple to medium complexity.

The word shader is also used more loosely to stand for any specific effect, material or look in 3D graphics (e.g. games), as shaders are usually the means of achieving such effects.

Shaders are normally written in a special shading language such as GLSL in the OpenGL API, HLSL (proprietary) in Direct3D API or the Metal shading language (proprietary) in Metal API. These languages are often similar to C with some additions (e.g. vector and matrix data types) and simplifications (e.g. no function recursion). High level frameworks like Blender many times offer visual programming (point-n-click) of shaders with graph/node editors.

Initially (back in the 90s and early 2000s) shaders were used only for graphics, i.e. to transform 3D vertices, draw triangles and compute pixel colors. Later on as GPUs became more general purpose, flexibility was added to shaders that allowed solving more problems with the GPU and eventually general compute shaders appeared (OpenGL added them in version 4.3 in 2012).

To put shaders in context, the flow of data is this: the CPU uploads some data (3D models, textures, ...) to the GPU and then issues a draw command -- this makes the GPU start its pipeline consisting of different stages, e.g. the vertices of 3D models are transformed to screen space (the vertex stage), then triangles are generated and rasterized (the rasterization stage) and the data is output (on screen, to a buffer etc.). Some of these stages are programmable and so they have their own type of a shader. The details of the pipeline differ from API to API, but in general, depending on the type of data the shader processes (the stage), we talk about vertex shaders (process individual vertices), fragment or pixel shaders (compute colors of individual rasterized pixels), geometry and tessellation shaders (generate and refine geometry) and compute shaders (perform general computation).


shit

Shit

Shit is something that's awfully bad.

Unicode for pile of shit is U+1F4A9.

List of Things That Are Shit

See Also


shogi

Shogi

Shogi, also called Japanese chess, is an old Asian board game, very similar to chess, and is greatly popular in Japan, even a bit more than go, the second biggest Japanese board game. Shogi is yet more complex (and bloated) than chess, has a bigger board, more pieces and more complex rules that besides others allow pieces to come back to play; for a chess player shogi is not that hard to get into as the basic rules are still very similar, and it may offer a new challenge and experience. Also similarly to chess, go, backgammon and similar board games, LRS sees shogi as one of the best games ever as it is legally not owned by anyone (it is public domain), is relatively simple, cheap and doesn't even require a computer to be played. The culture of shogi is also different from that of chess, there are many rituals connected to how the game is conducted, there are multiple champion titles, it is not common to offer draws etc.

{ Lol apparently (seen in a YT video) when in the opening one exchanges bishops, it is considered rude to promote the bishop that takes, as it makes no difference because he will be immediately taken anyway. So ALWAYS DO THIS to piss off your opponent and increase your chance of winning :D ~drummyfish }

Quick sum up for chess players: Games are longer. When you get back to chess from shogi your ELO will bump 100 points as it feels so much easier. Pawns are very different (simpler) from chess, they don't take sideways so forget all you know about pawn structure (prepare for bashing your head thinking a pawn guards something, then opponent takes it and you realize you can't retake :D just write gg and start a new game). The drop move will fuck up your brain initially, you have to start considering that opponent can just smash his general literally in front of your king and mate you right there { still fucking happens to me all the time lol :D ~drummyfish }. Exchanges and sacrifices also aren't that simple as any piece you sacrifice YOU GIVE TO THE OPPONENT, so you better not fuck up the final attack on the king or else the opponent just collects a bunch of your pieces and starts his own attack right in your base by dropping those pieces on your king right from the sky. You have to kill swiftly and precisely, it can turn over in an instant. There is no castling (but king safety is still important so you castle manually). Stalemate is a loss (not a draw) but it basically never happens, Japanese hate draws, draws are rare in shogi.

The game's disadvantage and a barrier for entry, especially for westerners, is that the traditional design of the shogi pieces sucks big time, for they are just same-colored pieces of wood with Chinese characters written on them, unintelligible to anyone who can't read them and greatly visually unclear even to those who can -- all pieces just look the same at first sight and the pieces of both players are distinguished only by their rotation, not color (color is only used in amateur sets to distinguish normal and promoted pieces). But of course you may use different, visually better pieces, which is also an option in many shogi programs -- a popular choice nowadays are so called international pieces that show the Chinese character along with a simple, easily distinguishable piece symbol. There are also sets for children/beginners that have on them visually indicated how the piece moves.

Rules

As with every game, rules may slightly differ here and there, but generally they are in principle similar to those of chess, with some differences and with different pieces. The goal of the game is to deliver a checkmate to the opponent's king, i.e. make him unable to escape capture (same as in chess). The details are as follows.

Shogi is played on a 9x9 rectangular board: the squares are not square in shape but slightly rectangular and they all have the same color. There are two players: sente (plays first) and gote (plays second); sente is also sometimes called black and gote white, like in chess (though unlike in chess black starts first here), but the pieces actually all have the same color (as they can be exchanged).

The pieces are weird pentagonal arrow-like shapes that have on them written a Chinese character identifying the piece; on the other side there is a symbol representing the promoted version of the piece (i.e. if you promote, you turn the piece over). The arrow of the piece is turned against the enemy and this is how it is distinguished which player a piece belongs to.

The table showing all the types of pieces follows. The movement rules are the same as in chess, i.e. pieces cannot jump over other pieces, with the exception of the knight. (F, R, B, L mean forward, right, back, left.)

piece             letter ~value  move rules                  comment
pawn              P      1       1F                          also takes forward (not complicated like in chess)
lance             L      4       F (any distance)            can't go backwards or sideways, just forward!
knight            N      5       2F, then 1L or 1R           similar to knight in chess, only one that jumps over pieces
silver general    S      7       1F1L, 1F, 1F1R, 1B1L, 1B1R  like king but can't go directly back, left or right
gold general      G      8       1F1L, 1F, 1F1R, 1L, 1R, 1B  similar to silver but has 6 squares (s. only has 5), can't promote
bishop            B      11      diagonal (any distance)     same as bishop in chess
rook              R      13      horiz./vert. (any distance) same as rook in chess
promoted pawn     +P     10      like gold general           more valuable than gold because when captured, enemy only gets pawn
promoted lance    +L     9       like gold general
promoted knight   +N     9       like gold general
promoted silver   +S     9       like gold general
p. bishop         +B     15      like both king and bishop   can now move to other set of diagonals!
p. rook (dragon)  +R     17      like both king and rook
king              K      inf     any neighboring 8 squares   same as king in chess, can't promote

At the beginning the board is set up like this:

  9 8 7 6 5 4 3 2 1
  _________________
 |L N S G K G S N L| a   gote      promotion
 |. R . . . . . B .| b  (white)      zone
 |P P P P P P P P P| c ----------------------
 |. . . . . . . . .| d
 |. . . . . . . . .| e
 |. . . . . . . . .| f
 |p p p p p p p p p| g ----------------------   
 |. b . . . . . r .| h   sente     promotion
 |l n s g k g s n l| i  (black)      zone
  """""""""""""""""

So called furigoma is used to decide who starts (has the black pieces): one player throws 5 pawn pieces, if the number of unpromoted pawns ending up facing up is higher than the number of promoted ones, the player who tossed starts.

Then the players take turns in making moves: one can either move one of his pieces on the board (possibly capturing an enemy piece standing on the target square and/or promoting the moved piece if the move ends in the promotion zone), or drop one of his previously captured pieces onto an empty square (with some restrictions, e.g. a pawn mustn't be dropped on a file already containing the player's own unpromoted pawn and mustn't be dropped to give an immediate checkmate).

If a piece is endangering the enemy king (so that it could capture it the next turn), a check happens. The player in check has to immediately avert it, i.e. make a move after which his king is no longer endangered by any enemy piece. If he cannot do that, he is checkmated and loses.

TODO

Playing Tips

TODO

See Also


shortcut_thinking

Shortcut Thinking

Shortcut thinking means making conclusions by established associations (such as "theft = bad") rather than making the extra effort of inferring the conclusion from known, possibly updated facts. This isn't bad in itself, in fact it is a great and necessary optimization of our thinking process and it's really why we have long term memory -- imagine we'd have to deduce all the facts from scratch each time we think about anything. However shortcut thinking can be a weakness in many situations and leaves people prone to manipulation by propaganda. As such this phenomenon is extremely abused by politicians, i.e. they for example try to shift the meaning of a certain negative word to include something they want to get rid of.

Some commonly held associations appearing in shortcut thinking of common people nowadays are for example "theft = piracy = bad", "laziness = bad", "pedophiles = child rapists = bad", "competition = good", "more jobs = good", "more complex technology = better", etc. Most of those are of course either extremely simplified or just plain wrong, however some associations may of course be correct, such as "murder = bad".

Let's focus on the specific example of the association "theft = bad". Indeed there is some sense in it -- if we turn shortcut thinking off, we may analyze why this association exists. For most of our history the word theft has meant taking a physical personal possession of someone else against their will. Indeed, in a society of people most of whom weren't rich, this was bad in most cases as it hurt the robbed person, he lost something he probably needed. However the society evolved, the meaning of property itself has changed from "personal property" to "private property", i.e. suddenly there were people who could own a whole forest or a factory even if they had never seen it, and there were people who had much more than they needed. If a poor starving person steals food from a rich man who doesn't even notice it, suddenly the situation is different and many will say this is no longer bad. Nevertheless the word theft stayed in use and now covered even such cases that were ethical, because of the shifted meaning of the word "property" and due to changes in conditions of people.

Recently the word property was shifted to the extreme with the invention of intellectual property, i.e. the concept of being able to own information such as ideas or stories in books. Intellectual property is fundamentally different from physical property as it can't be stolen in the same way, it can only be copied, duplicated, and this copying doesn't rid the "owner" of the original information. And so nowadays the word "theft", or one of its modern forms, "piracy", includes also mere copying of information or even just reusing an idea (patent) for completely good purposes; for example writing computer programs in certain (patented) ways is considered a theft. Of course, some may argue that such a download or reuse prevents the "owner's" profit from selling copies of that information or licenses to that idea, however it must be noted that, again, society is completely different nowadays and this so called "theft" actually doesn't hurt anyone but some gigantic billion dollar corporation that doesn't even notice it; no actual person gets hurt, only a legal entity, and this so called "theft" actually gives rise to good, helpful things. In fact, hurting a corporation, by definition a fascist entity hostile to people, may further be seen as a good thing, so stealing from corporations is also good by this view. Furthermore the illusion of profit theft here is arbitrarily made up: the "theft" exists only because we've purposefully created a system which allows selling copies of information and restricting ideas and therefore enables this "theft", i.e. this is no longer a natural thing, this is something miles away from the original meaning of the word "theft". With all this in mind we may, in today's context of the new meaning of old words, reconsider theft to no longer be generally bad.

When confronted with a new view, political theory etc., we should try to turn shortcut thinking off. Doing this can be called being open minded, i.e. opening one's mind to reinterpretation of very basic concepts by a new view. Also we should probably update our associations from time to time just to keep them up with the new state of the world.

The politics and views of LRS require extreme open mindedness to be accepted by someone indoctrinated by the standard capitalist fascist propaganda of today.


sigbovik

SIGBOVIK

SIGBOVIK (special interest group on Harry Q. Bovik) is a computer science conference running since 2007 that focuses on researching and presenting fun ideas in fun ways, scientifically but in a lighthearted hacker spirit similar to e.g. esoteric programming languages research or the IOCCC. SIGBOVIK has its own proceedings just like other scientific conferences, the contributors are usually professional researchers and experts in computer science. The name seems to be a reference to the "serious" conferences such as SIGGRAPH, SIGMOD etc. (SIGBOVIK is organized by the Association for Computational Heresy while the "serious" SIGs are run by the Association for Computing Machinery, ACM).

A famous contributor to the conference is e.g. Tom7, a PhD who makes absolutely lovely youtube videos about his fun research (e.g. this one is excellent https://www.youtube.com/watch?v=DpXy041BIlA).

{ Skimming through the proceedings sadly most of the stuff seems rather silly, though there are a few good papers, usually those by Tom7. Maybe I'm just dumb. ~drummyfish }


sin

Sine

Sine, abbreviated sin, is a trigonometric function that simply said models a smooth oscillation, it is one of the most important and basic functions in geometry, mathematics and physics, and of course in programming. Along with cosine, tangent and cotangent it belongs to a group of functions that can be defined by ratios of sides of a right triangle depending on one of the angles in it (hence trigonometric -- "triangle measuring"). If some measurement looks like sine function, we say it is harmonic. This is very common in nature and technology, e.g. a weight on a spring goes up and down by this function, alternating current voltage has the sine shape (because it is generated by a circular motion) etc.

The function is most commonly defined using a right triangle as follows. Consider the following triangle:

          /|
         / |
        /  |
      c/   |
      /    |a
     /     |
    /     _|
   /A____|_|
       b

Sin(A), where A is the angle between sides b and c, is the ratio a / c. The function can be defined in many other ways, for example it is the curve we get when tracking a single coordinate of a point moving along a circle. It can also be defined as a solution to certain differential equations etc.

The graph of the sine function is following:

                  ^ sin(x)
                  |
                1_|_
                  |     .--'''--.
      -1/2 pi     | _.''         ''._        3/2 pi
.________|________.'________|________'|________|________.' --> x          
 '._     |     _.'|0        |         |'._     |     _.'|     
    ''--___--''  _|_      1/2 pi      pi  ''--___--''  2 pi
               -1 |

Why the fuck are there these pi values on the x line??? Nubs often can't comprehend this. These pi values are values in radians, units of measuring angles where 2 pi is the full angle (360 degrees). In fact sine is sometimes shown with degrees instead of radians (so imagine 90 degrees on the line where there is 1/2 pi etc.), but mathematicians prefer radians. But why are there angles in the first place??? Why doesn't it go e.g. from 0 to 1 like all other nice functions? Well, it's because of the relation to geometry, remember the fucking triangle above... also if you define sine with a circle it all repeats after 2 pi. Just draw some picture if you don't get it.

Some additional facts and properties regarding the sine function are for example: it is an odd function (sin(-x) = -sin(x)), it is periodic with period 2 pi, its range is <-1,1>, its derivative is cos(x) and cosine itself is just a shifted sine (cos(x) = sin(x + pi/2)).

Some values of the sine function are:

x (rad)   x (deg)  sin(x)
0         0        0
pi / 12   15       ~0.259
pi / 6    30       0.5
pi / 4    45       sqrt(2)/2 ~= 0.707
pi / 3    60       sqrt(3)/2 ~= 0.866
pi / 2    90       1
2 pi      360      0

Programming

In programming languages the sine function is generally available in some math library, for example in C the function sin is in math.h. Spare yourself bugs, always check if your sin function expects radians or degrees!

Want to make your own sine function for whatever reason (performance, curiosity, ...)? Then firstly consider what you expect from it. If you want a small, fast and perhaps integer only sin function (the one we'd prefer in LRS) that doesn't need extreme accuracy, consider using a look up table. You simply precompute the values of the sine function into a static table in memory and the function just retrieves them when called -- this is super fast. Note that you can save a lot of space by only storing sine values between 0 and 1/2 pi, the remaining parts of the function are just different transformations of this part. You can further save space and/or make the function work with floats by further interpolating (even just linearly) between the stored values, for example if sin(3.45) is called and you only have values stored for sin(3.4) and sin(3.5), you simply average them.
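
As a rough sketch (a made up example, not taken from any particular library), such a look up table sine may look like this: we store only the first quarter of the wave and mirror it to get the rest.

// look up table sine: stores only the first quarter wave (9 values),
// input: angle with 32 units per full circle, output: -255 to 255
int sinLUT(unsigned int x)
{
  static const int table[9] = {0, 50, 98, 142, 180, 212, 236, 250, 255};

  x %= 32; // wrap the angle to one period

  int sign = x >= 16 ? -1 : 1; // second half of the wave is negative

  x %= 16;

  if (x > 8)
    x = 16 - x; // mirror the quarter within the half wave

  return sign * table[x];
}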

A lot of times, e.g. in many calculators where speed isn't really critical, sine is computed using Taylor series -- a sum of infinitely many terms; taking only the first N of them gives an approximation of the function (the more terms we add, the more precise we get). For sine the series is

sin(x) = x - x^3 / 3! + x^5 / 5! - x^7 / 7! + ...

Adding just the first 3 terms (x - x^3 / 6 + x^5 / 120) already gives a very accurate approximation in range <-pi/2,pi/2> (error < 0.5 %). Here is a C function that uses this to compute an 8bit sine (the magic numbers are made so as to incorporate pi while using power of two divisors, also note the use of many operations that will make the function relatively slow):

// x = 255 means full angle, returns 0 to 255
unsigned char sin8(unsigned char x)
{
  int a = x;
  char flip = 0;

  if (a > 127)
  {
    a -= 128;
    flip = 1;
  }
  
  if (a > 63)
    a = 128 - a;

  int result = (411999 * a) - (a * a * a * 41);

  a /= 4;

  a = a * a * a * a * a;

  result = (a + result) / 131072;
  return flip ? (127 - result) : (127 + result);
}

If you just need a super fast and very rough sine-like value, there exists an ugly engineering approximation of sine that can be useful sometimes, it says that

sin(x) = x, for small x

Indeed, sine looks similar to a mere line near 0, but you can see it quickly diverges.

Very rough and fast approximations e.g. for primitive music synthesis can be done with the traditional very basic square or triangle functions. The following is a simple 8bit linear approximation that's more accurate than square or triangle (approximates sine with a linear function in each octant):

unsigned char sinA(unsigned char x)
{
  unsigned char quadrant = x / 64;

  x %= 64;

  if (quadrant % 2 == 1)
    x = 63 - x;

  x = x < 32 ? (2 * x + x) : (64 + x);

  return quadrant <= 1 ? (128 + x) : (127 - x);
}

Similar approximation can be made with a quadratic curve, the following is a modification of the above function that does this (notice that now we need at least 16 bits for the computation so the data type changed to int): { I quickly made this just now, maybe it can be improved. ~drummyfish }

int sinA(int x)
{
  unsigned char quadrant = x / 64;

  x %= 64;

  if (quadrant % 2 == 1)
    x = 63 - x;

  x -= 63;
  x = (x * x) / 32;

  return quadrant <= 1 ? (255 - x) : x;
}

Furthermore there exist other nice approximations, such as the extremely accurate Bhaskara I's approximation (angle in radians): sin(x) ~= (16 * x * (pi - x)) / (5 * pi^2 - 4 * x * (pi - x)). (This formula is actually more elegant for cosine, so it may be even better to consider using that.) Here is a C fixed point implementation:

#define UNIT 1024
#define PI ((int) (UNIT * 3.14159265))

/* Integer sine using Bhaskara's approx. Returns a number
in <-UNIT, UNIT> interval. Argument is in radians * UNIT. */

int sinInt(int x)  
{
  int sign = 1;
    
  if (x < 0) // odd function
  {
    x *= -1;
    sign = -1;
  }
    
  x %= 2 * PI;
  
  if (x > PI)
  {
    x -= PI;
    sign *= -1;
  }
    
  x *= PI - x;
  
  return sign * (16 * x) / ((5 * PI * PI - 4 * x) / UNIT);
}
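
For example sinInt(PI / 2) returns 1024, i.e. exactly 1 in this fixed point format.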

sjw

Social Justice Warrior

Social justice warrior (SJW) is an especially active, toxic and aggressive kind of pseudoleftist (a kind of fascist) that tries to fight (especially on the Internet) anyone opposing or even just slightly criticizing the mainstream pseudoleftist gospel such as feminism and LGBT propaganda. SJWs divide people rather than unite them, they operate on the basis of hate, revenge and mass hysteria and, as we know, hate spawns more hate; they fuel a war mentality in society. They support hard censorship (forced political correctness) and bullying of their opposition, so called cancelling, and also such retardism as sanism and whatnot. Wokeism is a yet more extreme form of SJWery.

SJWs say the term is pejorative. We say it's not pejorative enough xD


slowly_boiling_the_frog

Slowly Boiling The Frog

Slowly boiling the frog is a phrase said to communicate the idea that people will tolerate a great change for the worse if that change is very gradual, even if they would absolutely not tolerate this change being made quickly. It refers to an experiment in which a frog doesn't jump out of boiling water if the water temperature is raised very gradually (even though according to "modern science" this experiment isn't real).

For example the amount and aggressiveness of brainwashing ads and technological abuse that young people tolerate nowadays would have been absolutely unacceptable a few decades ago, but now it's the reality of life that few even question (some complete retards like that linus tech faggot even defend it).

The technique of slowly boiling the frog is used by corporations, governments, fascists and idiots to slowly take away people's freedom in small steps: each step takes away a bit of freedom while promising some reward, normally in form of additional comfort -- normal people are too braindead to see the obvious trick and are enthusiastic about the change. If you tell them that giving up net neutrality or P2P encryption will eventually lead to almost complete loss of freedom, they label you a tinfoil or "conspiracy theorist", they tell you that "it's not a big deal". So it goes on with further and further changes and the normie is still happy because he can only see one step ahead or behind. The bad thing is that it's not only the normie who will suffer -- in fact he may even be happy as a slave robot of the system -- but you will suffer as well. Normies decide the future of the environment we all have to live in.

Slowly boiling the frog works very well when spanning several generations because a new generation won't remember that things used to be better. Parents can tell them, but the young never listen to older generations or take them seriously. A zoomer won't remember that computers used to be better, he thinks that bloated phones filled with ads and DRM that don't work without Internet connection and that spy on you constantly are the only way of technology.


small3dlib

Small3dlib

Small3dlib (S3L) is a very portable LRS/suckless single header 3D software renderer library written by drummyfish in the C programming language. It is very efficient and runs on many resource-limited computers such as embedded open consoles. It is similar to TinyGL, but yet more simple. Small3dlib is public domain free software under CC0.

The repository is available at https://codeberg.org/drummyfish/small3dlib and https://gitlab.com/drummyfish/small3dlib.

Small3dlib can be used for rendering 3D graphics on almost any device as it is written in pure C99 without any software and hardware dependencies; it doesn't use the standard library, floating point or GPU. It is also very flexible, not forcing any preprogrammed shaders -- instead it only computes which pixels should be rasterized and lets the programmer of the main application decide himself what should be done with these pixels (this is typically applying some shading and writing them to screen).

Some of the rendering features include:

                                                            ##x
                                                         ####xx
                                                      ######xxxx
                ..xx                               ########xxxxx
             .....xxx                           ##########xxxxxx
          .......xxxxx                       #############xxxxxxx
          .......xxxxxxx                    #############xxxxxxxx
          .......xxxxxxxx                  #############xxxxxxxxxx
         .......xxxxxxxxxx                #############xxxxxxxxxxx
         .......xxxxxxxxxxx              #############xxxxxxxxxxxx
        ........xxxxxxxxxxxxx           ##############xxxxxxxxxxxxx
        .......xxxxxxxxxxxxxxx         ##############xxxxxxxxxxxxxx
        .......xxxxxxxxxxxxxxxx       ##############xxxxxxxxxxxxxxxx
       ........xxxxxxxxxxxxxxxxx     ##############xxxxxxxxxxxxxxxxx
       .......xxxxxxxxxxxxxxxxxx    ##############xxxxxxxxxxxxxxxxxx
      ........xxxxxxxxxxxxxxxxxx   ................xxxxxxxxxxxxxxxxxx
      ........xxxxxxxxxxxxxxxxxx    ................xxxxxxxxxxxxxxxxx
      .......xxxxxxxxxxxxxxxxxx      ...............xxxxxxxxxxxxxxxxx
     ........xxxxxxxxxxxxxxxxxx      ................xxxxxxxxxxxxxxxx
     ........xxxxxxxxxxxxxxxxxx       ...............xxxxxxxxxxxxxxx
    ........xxxxxxxxxxxxxxxxxxx        ...............xxxxxxxxxxxxxx
    ........xxxxxxxxxxxxxxxxxx         ................xxxxxxxxxxxx
    ........xxxxxxxxxxxxxxxxxx          ...............xxxxxxxxxxxx
   ........xxxxxxxxxxxxxxxxxxx           ...............xxxxxxxxxx
   ........xxxxxxxxxxxxxxxxxxx           ...............xxxxxxxxxx
  .........xxxxxxxxxxxxxxxxxx             ...............xxxxxxxx
         #xxxxxxxxxxxxxxxxxxx              ..............xxxxxxxx
               #xxxxxxxxxxxxx              ...............xxxxxx
                      xxxxxxx                 .............xxxxx
                                                 ..........xxxx
                                                    ........xxx
                                                       .....xx
                                                          ...x

Simple ASCII rendering made with small3dlib.


smallchesslib

Smallchesslib

TODO

The repository is available at https://codeberg.org/drummyfish/smallchesslib.


smart

Smart

Smart, smells like fart.

The adjective "smart", as in e.g. smartphone, is in the context of modern capitalist technology used as a euphemism for malicious features that include spyware, bloat, obscurity, DRM, ads, programmed planned obsolescence, unnecessary dependencies (such as required Internet connection), anti-repair design and others. "Smart" technology is far inferior to the traditional "dumb" technology and usually just downright harmful to its users and society as a whole, but normal (i.e. retarded) people think it's good because it has a cool name, so they buy and support such technology. They are slowly boiled to accept "smart" technology as the standard.


smol_internet

Smol Internet

Smol Internet, smol web, small web, smol net, dork web, poor man's web, web revival, web 1.0 and similar terms refer to Internet technology (such as gopher, gemini, plain HTML etc.) and communities that are smaller, simpler, less controlled and less toxic than the "big mainstream Internet" (especially the Web) which due to capitalism became littered with shit like ads, unbearable bloat, censorship, spyware, corporate propaganda, masses of retarded people, bullshit cheap visuals like animations etc. Consider this analogy: the mainstream, i.e. the world wide web, Discord, Facebook etc., is like a big shiny city, but as the city grows and becomes too stressful, overcrowded, hasty, overcontrolled with police and ads on every corner, people start to move to the countryside where life is simpler and happier -- smol Internet is the countryside.

What EXACTLY constitutes a Smol Internet? Of course we don't really have exact definitions besides what people write on blogs, it also depends on the exact term we use, e.g. smol web may refer specifically to lightweight self-hosted websites while smol net will also include different protocols than HTTP(s) (i.e. things outside the Web). But we are usually talking about simpler (KISS, suckless, ...), alternative, decentralized, self hosted technology (protocols, servers, ...), and communities that strive to escape commercially spoiled spaces. These communities don't aim to grow to big sizes or compete with the mainstream Web, they do not seek to replace the Web or achieve the same things (popularity, profitability, ...) but rather bring back the quality the Web (and similar services such as Usenet) used to have in the early days such as relative freedom, unrestricted sharing, free speech, simplicity, decentralization, creative personal sites, comfiness, fun and so on. It is for the people, not for companies and corporations. Smol Internet usually refers to gopher and gemini, the alternative protocols to HTTP, the basis of the Web. Smol Web on the other hand stands for simple, plain HTML web 1.0 static personal/community sites on the Web itself which are however hosted independently, often on one's own server (self hosted) -- such sites can be searched e.g. with the wiby search engine. It may also include small communities such as pubnixes like SDF and tildeverse. Other KISS communication technology such as email and IRC may also fall under Smol Internet.


social_inertia

Social Inertia

Social inertia appears when a social group continues to behave in an established way mainly because it has behaved that way for a long time, even if such behavior is no longer justified.


software

Software

The article you're looking for is here.


sorting

Sorting

Sorting means rearranging a sequence, such as a list of numbers, so that the elements are put in a specific order (e.g. ascending or descending). In computer science sorting is quite a wide topic, there are dozens of sorting algorithms, each with pros and cons and different attributes are being studied, e.g. the algorithm's time complexity, its stability etc. Sorting algorithms are a favorite subject of programming classes as they provide a good exercise for programming and analysis of algorithms and can be nicely put on tests :)

Some famous sorting algorithms include bubble sort (a simple KISS algorithm), quick and merge sort (some of the fastest algorithms) and stupid sort (just tries different permutations until it hits the jackpot).

In practice however we oftentimes end up using some of the simplest sorting algorithms (such as bubble sort) anyway, unless we're programming a database or otherwise dealing with enormous amounts of data. If we need to sort just a few hundred items and/or the sorting doesn't occur very often, a simple algorithm does the job well, sometimes even faster, as a very complex algorithm may have a big initial overhead. So always consider the KISS approach first.
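
For illustration, bubble sort, probably the simplest reasonable sorting algorithm, may be implemented in C e.g. like this:

void bubbleSort(int *array, int n)
{
  for (int i = 0; i < n - 1; ++i) // each pass bubbles the largest remaining element to the end
    for (int j = 0; j < n - 1 - i; ++j)
      if (array[j] > array[j + 1])
      { // neighbors in wrong order: swap them
        int tmp = array[j];
        array[j] = array[j + 1];
        array[j + 1] = tmp;
      }
}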

Attributes of sorting algorithms we're generally interested in are mainly time complexity (best, worst and average case), space complexity, stability (whether the algorithm keeps the original order of equal elements) and whether the algorithm works in place.

In practice not only the algorithm but also its implementation matters. For example if we have a sequence of very large data structures to sort, we may want to avoid physically rearranging these structures in memory, this could be slow. In such case we may want to use indirect sorting: we create an additional list whose elements are indices to the main sequence, and we only sort this list of indices.
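
The following is a small sketch of indirect sorting in C (the structure and its values are of course just made up for the demonstration):

#include <stdio.h>
#include <stdlib.h>

typedef struct
{
  int key;
  char payload[256]; // pretend this is big and expensive to move around
} BigStruct;

BigStruct data[4] = {{30},{10},{40},{20}};

int compareIndices(const void *a, const void *b)
{
  return data[*(const int *) a].key - data[*(const int *) b].key;
}

int main(void)
{
  int indices[4] = {0, 1, 2, 3}; // we sort these, not the structs

  qsort(indices,4,sizeof(int),compareIndices);

  for (int i = 0; i < 4; ++i)
    printf("%d ",data[indices[i]].key); // prints 10 20 30 40

  return 0;
}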

List Of Sorting Algorithms

TODO


soydev

Soydev

Soydevs are incompetent wanna-be programmers that usually have these characteristics:

Here is a quick rough comparison of soydevs and actual good programmers (nowadays mostly an extinct species):

characteristic good programmer soydev
math skills deep knowledge of math "I don't need it", "there's library for that", memorized math interview questions
computer knowledge all-level, big-picture knowledge of principles knowledge of trivia ("This checkbox in this framework has to be unchecked.", ...)
specialization generalist hyperspecialized, knows one language/framework
prog. languages C, assembly, FORTRAN, Forth, comun, lisp, ... Python, JavaScript, Rust, Java, C#, C++2045, ...
mostly does thinking about algorithms and data structures typing glue code for different libraries, updates/maintains systems, talks to people
political opinions politically incorrect hippie anarcho pacifist liberal capitalist feminist pro black lesbian LGBT fascist anti Nazi
hardware 640x480 1990s laptop, no mouse 2023 touchscreen 1080K macbook with stickers all over, wireless $1000 AI gaming mouse
memorized knowledge 10000 digits of pi 10000 genders plus offensive words he mustn't say
text editor vim, ed, ... Microsoft AI blockchain VSCode with 10000 plugins running in 10000 virtual sandboxes
looks fat, unwashed, unkept beard, dirty clothes pink hair, fake glasses, $1000 T-shirt "sudo make sandwich HAHA BAZINGA", 10000 tattoos
gender male depends on mood
race white prefers not to specify
hobbies reading encyclopedias, chess, rocket science distrohopping, browserhopping, githopping, editorhopping, tiktok, partying

soyence

Soyence

Not to be confused with science.

Soyence is business, propaganda and politics trying to pass as science, nowadays promoted typically by pseudoleftists, pseudoskeptics, capitalists and corporations. It is what in the 21st century has taken on the role that's historically been played by the church: that of establishing and maintaining orthodoxy and so controlling the masses -- this time so called "science" is used as a tool for achieving this instead of God, but the results are the same. Soyence is not about listening to what science says, it is about listening to what "scientists" say, and of course not questioning them; soyence is what the typical reddit atheist or tiktok feminist believes science is or what Neil De Grass Tyson tells you science is. Soyence calls itself the one and only science^TM and gatekeeps the term science by calling unpopular science (such as that regarding human race or questioning big pharma vaccines) "pseudoscience" and "conspiracy theories". Soyence itself is pseudoscience but it has an official status, approval of state, strong connection to politics, it is mainstream, popular, controlled by those in power, censored ("moderated") and intentionally misleading. Soyence can be encountered in much of academia, on Wikipedia and in other popular/mainstream media such as TV "documentaries" and YouTube.

Compared to good old fun pseudosciences such as astrology and flat Earth, soyence is extra sneaky by purposefully trying to blend in with real science, i.e. within a certain truly scientific field, such as biology, there is a soyentific cancer mixed in by activists, corporations and state, that may be hard to separate for common folk and many times even for pros. This is extremely harmful as in the eyes of retarded people (basically everyone) the neighboring legit science gives credibility to propaganda bullshit. There is a tendency to think we somehow magically live in a time that's fundamentally different from other times in history in which it is now a pretty clear and uncontroversial fact that the name of science was abused hard by propaganda, almost everyone easily accepts that historically politically constructed lies were presented as confirmed by science, but somehow people refuse to believe it could be the case nowadays. In times of Nazism there was no doubt about race being a completely scientific term and that Jews were scientifically confirmed to be the inferior race -- nowadays in times when anti Nazis have won and politics is based on denying existence of race somehow scientists start to magically find evidence that no such thing as race has ever existed -- how convenient! And just in case you wanted to check if it's actually true, you'll be labeled a racist and you won't find job ever again.

Soyence uses all the cheap tricks of politics (also not dissimilar to those of greenwashing, openwashing etc.) to win stupid people: it builds on the cult of bullying religion and creating a war mentality, overuse of twisted "rationality" (pseudoskepticism), creating science bloat and bullshit "scientific" fields to obscure lies, punishment of the correct use of rationality, building cults of personality ("science educators", the gatekeepers of "science") and appealing to the egoism and naivety of wannabe smartasses, while at the same time not even holding up to principles of science such as genuine objectivity. A soyence kid will for example keep preaching about how everything should be proven by reproducible experiments while at the same time accepting de facto irreproducible results, e.g. those obtained by research worth billions of dollars performed at CERN which can NOT be reproduced anywhere else than at CERN with thousands of top scientists putting in years of work. Such results are not reproducible in practice, they are accepted on the basis of pure faith in those presenting them, just as religious people accept the words of preachers. The kid will argue that in theory someone else can build another CERN and reproduce the results, but that won't happen in practice, it's a purely theoretical and unrealistic scenario, so his version of "science" is really based on a reproducibility that only works in a dreamed up world; this kind of reproducibility doesn't at all fulfill its original purpose of allowing others to check, confirm or refute the results of experiments. This starts to play a bigger role when for example vaccines get promoted by the government as "proven safe by science" (read "claimed safe by a corporation which makes money off of people being sick"): the soyence kid will gladly accept the vaccine and fight for its acceptance just thanks to this label, not based on any truly scientific facts but out of pure faith in self proclaimed science authorities -- here the soyentist is relying purely on faith, a concept he would like to think he hates with all his soul.

The "citation needed" craziness that indicates lack of any brain and pure reliance on the word of authority is seen e.g. on Wikipedia. Wikipedia doesn't accept original research, observation or EVEN LOGIC ITSELF as a basis for presenting something -- everything, even trivial claims have to have a "citation" from a source WITH mainstream political views (unpopular and controversial sources are banned); Wikipedia is therefore one big propaganda ground for those with power over the mainstream media.

Soyence relies on low IQ, shallow education and popular "science education" (e.g. neil de grass), while making its followers believe they are smart. It produces propaganda material such as "documentaries" with Morgan Freeman (i.e. people who are good at persuasion rather than being competent), series like The Big Bang Theory and YouTube videos with titles such as "Debunking Flat Earth with FACTS AND LOGIC", so there's a huge mass of NPCs thinking they are Einsteins who blindly support this cult. Soyence attacks science from within by attacking its core principles, i.e. it tries to ridicule and punish thinking outside the box and asking specific questions -- in this it is not dissimilar to a mass religion.

Examples of soyence:


speech_synthesis

Speech Synthesis

TODO

Example

This is a simple C program (using floating point for simplicity of demonstration) that creates basic vowel sounds using formant synthesis (run e.g. as gcc program.c -lm && ./a.out | aplay; 8000 Hz 8 bit audio is assumed):

#include <stdio.h>
#include <math.h>

double vowelParams[] = { // vocal tract shapes, can be found in literature
  // formant1  formant2  width1  width2  amplitude1 amplitude2
     850,      1650,     500,    500,    1,         0.2, // a
     390,      2300,     500,    450,    1,         0.9, // e
     240,      2500,     300,    500,    1,         0.5, // i 
     250,      600,      500,    400,    1,         0.9, // o
     300,      400,      400,    400,    1,         1.0  // u
  };

double tone(double t, double f) // tone of given frequency
{
  return sin(f * t * 2 * M_PI);
}

/* simple linear ("triangle") function for modelling spectral shape
   of one formant with given frequency location, width and amplitude */
double formant(double freq, double f, double w, double a)
{
  double r = ((freq - f + w / 2) * 2 * a) / w;

  if (freq > f)
    r = -1 * (r - a) + a;

  return r > 1 ? 1 : (r < 0 ? 0 : r);
}

/* gives one sample of speech, takes two formants as input, fundamental
   frequency and possible offset of both formants (can model "bigger/smaller
   head") */
double speech(double t, double fundamental, double offset,
  double f1, double f2,
  double w1, double w2,
  double a1, double a2)
{
  int harmonic = 1; // number of harmonic frequency

  double r = 0;

  /* now generate harmonics (multiples of fundamental frequency) as the source,
     and multiply them by the envelope given by formants (no need to deal with
     multiplication of spectra; as we're constructing the result from basic
     frequencies, we can simply multiply each one directly): */
  while (1)
  {
    double f = harmonic * fundamental;
    double formant1 = formant(f,f1 + offset,w1,a1);
    double formant2 = formant(f,f2 + offset,w2,a2);

    // envelope = max(formant1,formant2)
    r += (formant1 > formant2 ? formant1 : formant2) * 0.1 * tone(t,f);

    if (f > 10000) // stop generating harmonics above 10000 Hz
      break;

    harmonic++;
  }

  return r > 1.0 ? 1.0 : (r < -1.0 ? -1.0 : r); // clamp between -1 and 1
}

int main(void)
{
  for (int i = 0; i < 50000; ++i)
  {
    double t = ((double) i) / 8000.0;
    double *vowel = vowelParams + ((i / 4000) % 5) * 6; // change vowels

    putchar(128 + 127 *
      speech(t,150,-100,vowel[0],vowel[1],vowel[2],vowel[3],vowel[4],vowel[5]));
  }

  return 0;
}

splinternet

Splinternet

TODO


sqrt

Square Root

Square root (sometimes shortened to sqrt) of number a is such a number b that b^2 = a, for example 3 is a square root of 9 because 3^2 = 9. Finding square root is one of the most basic and important operations in math and programming, e.g. for computing distances, solving quadratic equations etc. Square root is a special case of finding Nth root of a number for N = 2. Square root of a number doesn't have to be a whole number; in fact if the square root of an integer isn't itself an integer, it is always an irrational number (i.e. it can't be expressed as a fraction of two integers; for example square root of two is approximately 1.414...); and it doesn't even have to be a real number (e.g. square root of -1 is i). Strictly speaking there may exist multiple square roots of a number, for example both 5 and -5 are square roots of 25 -- the positive square root is called the principal square root; principal square root of x is the same number we get when we raise x to 1/2, and this is what we are usually interested in -- from now on by square root we will implicitly mean principal square root. Programmers write square root of x as sqrt(x) (which should give the same result as raising to 1/2, i.e. pow(x,0.5)), mathematicians write it as:

  _    1/2
\/x = x

TODO

Programming

TODO

Within desired precision square root can be relatively quickly computed iteratively by binary search. Of course if we need extreme speed, we may use a look up table with precomputed values.

The following is a simple sketch of an integer square root in C using binary search (not optimized, just showing the principle; it finds the largest b such that b * b <= x):
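
int32_t sqrtBinSearch(int32_t x)
{
  int32_t low = 0, high = 46341; // 46341 squared already overflows int32

  while (low + 1 < high) // narrow the interval down to a single value
  {
    int32_t middle = (low + high) / 2;

    if (middle * middle <= x)
      low = middle;
    else
      high = middle;
  }

  return low;
}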

The following is a non-iterative approximation of integer square root in C that has acceptable accuracy to about 1 million (maximum error from 1000 to 1000000 is about 7%): { Painstakingly made by me. ~drummyfish }

int32_t sqrtApprox(int32_t x)
{
  return
    (x < 1024) ?
    (-2400 / (x + 120) + x / 64 + 20) :
      ((x < 93580) ?
       (-1000000 / (x + 8000) + x / 512 + 142) :
       (-75000000 / (x + 160000) + x / 2048 + 565));
}
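
For example sqrtApprox(1000000) returns 989 while the exact result is 1000, i.e. the error here is about 1%.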

ssao

SSAO

Screen space ambient occlusion (SSAO) is a screen space technique used in 3D computer graphics for approximating ambient occlusion (basically "dim shadows in corners", which itself is an approximation of true global illumination) in a way that's easy and not so expensive to implement to run in real time. The effect however looks ugly many times and is often criticized, see e.g. an excellent article at https://nothings.org/gamedev/ssao/.

{ 2023 report: SSAO still sucks. ~drummyfish }

Exact ambient occlusions can be computed with algorithms such as RTAO (which uses raytracing), but this requires complete information about the geometry and is too slow without special hardware. Therefore some game devs cheat and use a cheap approximation: SSAO is implemented as a post-processing shader and only uses the information available on the screen, specifically in the depth buffer -- this gives only partial information about the actual scene geometry, i.e. the algorithm doesn't know what the back facing, screen-perpendicular or off-screen geometry looks like and has to make guesses which sometimes result in quite visible inaccuracies.

This method is notoriously ugly in certain conditions and many modern games suffer from this, even the supposedly "photorealistic" engines like Unreal -- if someone is standing in front of a wall there is a shadow outline around him that looks so unbelievably ugly you literally want to puke. But normie eyes can't see this lol, they think that's how reality looks and they are okay with this shit, they allow this to happen. Normies literally destroy computer graphics by not being able to see correctly.

What to do then? The most suckless way is to simply do no ambient occlusion -- seriously test how it looks and if it's okay just save yourself the effort, performance and complexity. Back in the 90s we didn't have this shit and games unironically looked 100 times better. You can also just bake the ambient occlusion in textures themselves, either directly in the color texture or use light maps. Note that this makes the ambient occlusions static and with light maps you'll need more memory for textures. Finally, if you absolutely have to use SSAO, at least use it very lightly (there are parameters you can lower to make it less prominent).


steganography

Steganography

Steganography means hiding secret information in publicly accessible data; for example it is possible to hide text messages in a digital photograph by slightly modifying the colors of the image pixels -- that photo then looks just like an innocent picture while in fact bearing an extra information for those who can read it. Steganography differs from encryption by trying to avoid even suspicion of secret communication.
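
As a minimal made up sketch of the classic trick: hiding one byte of secret in the least significant bits of eight color values (a change of +-1 in a color channel is practically invisible) may look like this in C:

// embed one byte of secret into the lowest bits of 8 color values
void embedByte(unsigned char *pixels, unsigned char secret)
{
  for (int i = 0; i < 8; ++i)
    pixels[i] = (pixels[i] & 0xfe) | ((secret >> i) & 0x01);
}

unsigned char extractByte(const unsigned char *pixels)
{
  unsigned char result = 0;

  for (int i = 0; i < 8; ++i) // collect the lowest bits back into a byte
    result |= (pixels[i] & 0x01) << i;

  return result;
}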

There are many uses of steganography, for example in secret communication, bypassing censorship or secretly tracking a piece of digital media with an invisible watermark (game companies have used steganography to identify which game clients were used to leak pre-release footage of their games). Cicada 3301 has famously used steganography in its puzzles.

Steganography may need to take into account the possibility of the data being slightly modified, for example pictures exchanged on the Internet lose their quality due to repeating compressions, cropping and format conversions. Robust methods may be used to preserve the embedded information even in these cases.

Some notable methods and practices of steganography include for example modifying the least significant bits of image pixels or audio samples (changes invisible and inaudible to humans), hiding data in file metadata, or encoding information in the whitespace and formatting of plain text.


stereotype

Stereotype

Stereotypes are general statistical observations about groups of people (such as different races) which have been discovered naturally and became part of common knowledge (without rigorous scientific effort). Stereotypes are good because they tell us what we may expect from different kinds of people. Of course no one, maybe with the exception of blonde women, is so stupid as to think stereotypes apply 100% of the times -- let us repeat they are STATISTICAL observations, they talk about probabilities.

Stereotypes are also good for showing us the diversity of human races and cultures. Pseudoleftists want to suppress awareness of stereotypes by calling them "offensive" or "discriminating", aiming for creating a sterile society without any differences, without any beauty of diversity. Do not support that, spread the knowledge of stereotypes.

Some stereotypes are:

{ WIP ~drummyfish }


steve_jobs

Steve Jobs

"I'm not glad he'd dead, but I'm glad he's gone." -- Richard Stallman

Steve Jobs was the prototypical evil CEO and co-founder of one of the worst corporations in the world: Apple. He was a psychopathic entrepreneur with a cult of personality that makes Americans cum. He was mainly known for his ability to manipulate people and he worsened technology by making it more consumerist, expensive and incompatible with already existing technology. All americans masturbate daily to Steve Jobs so he can also be considered the most famous US porn star.

{ LOL how come in the American movies the villain is always some rich boss of a huge corporation clearly resembling Steve Jobs, doing literally the same things, it's almost as if the average American actually somehow KNOWS and feels deep inside these people are pure evil, but suddenly outside of a Hollywood movie their brain switches to "aaaaah, that guy is amazing" and they just eat all his bullshit. I just can't comprehend this. ~drummyfish }

Jobs was born on February 24, 1955 and was later adopted, which may have contributed to his development of psychopathy. He was already very stupid as a little child, he never really learned programming and was only interested in achieving what he wanted by crying and pressuring other people to do things for him. This translated very well into his adult life, when he quit school to pursue money. He manipulated and abused his schoolmate Steve Wozniak, a hacker, to make computers for him. They started Apple in 1976 and started producing one of the first personal computers: Apple I and Apple II, with which he won the capitalist lottery and unfortunately succeeded on the market. Apple became a big ass company, however Jobs was such a shit CEO that Apple fired him lol. He went to do some other shit like NeXT. Then a bunch of things happened (TODO) and then, to the relief of the whole world, he died on October 5, 2011 from cancer. { LRS never wishes for anyone's death, here we only state the simple fact that the world is a better place without Jobs in it. ~drummyfish }


suckless

Suckless

Suckless, software that sucks less, is a type of free software, as well as an organization (http://suckless.org/), that tries to adhere to a high technological minimalism, freedom and hackability, and opposes so called bloat and unnecessary complexity which has been creeping into most "modern" software and by which technology has started to become less useful and more burdening. It is related to Unix philosophy and KISS but brings some new ideas onto the table. LRS builds on top of suckless ideas.

The community is a relatively small niche but has seen a growth in popularity sometime in the 2010s, thanks to tech youtubers such as Luke Smith, Distro Tube and Mental Outlaw. It has also gained traction on 4chan's technology board. While it consists of a lot of expert programmers and hackers mostly interested in systems like GNU/Linux, BSDs and Plan 9, a lot of less skilled "Linux" users and even complete non-programmers have started to use suckless software to various degrees -- dwm has for example seen a great success among "Unix porn" lovers and chronic ricers. While some members are hardcore minimalists and apply their principles to everything, some just cherry pick programs they find nice and integrate them in their otherwise bloated systems.

Suckless is pretty cool, it has inspired LRS, but watch out, as with most of the few promising things nowadays it is half cool and half shitty -- for example most suckless followers seem to be rightists and capitalists who are motivated by harmful goals such as their own increased productivity, not by altruism. LRS fixes this, we only take the good ideas of suckless.

{ From what it seems to me, the "official" suckless community is largely quiet and closed, leading conversations mostly on mailing lists and focusing almost exclusively on the development of their software without politics, activism and off topics, probably because they consider it bullshit that would only be distracting. There is also suckless subreddit which is similarly mostly focused on the software alone. They let their work speak. Some accuse the community of being Nazis, however I believe this is firstly irrelevant and secondly mostly false accusations of haters, even if we find a few Nazis among them, just as in any community. Most pro-suckless people I've met were actually true socialists (while Nazis are not socialist despite their name). Unlike tranny software, suckless software itself doesn't promote any politics, it is a set of purely functional tools, so the question of the developers' private opinions is unimportant here. Suckless ideas are good regardless of whose brains they came from. ~drummyfish }

Attributes

Notable attributes of suckless software include source code minimalism (programs typically have at most a few thousand lines of code), configuration done by editing the source code (typically a config.h file), very few dependencies, low resource usage and high portability.

History

Suckless in its current form has existed since 2006 when the domain suckless.org was registered by a German guy Anselm R. Garbe, who is the founder of the community. It has evolved from a community centered around specific software projects, most notably wmii. Garbe has given an interview about suckless in FLOSS Weekly episode 355.

Some time before 2010 suckless developed stali, a statically linked glibc-less "Linux distro" that was based on the idea that dynamic linking is harmful and that static linking is mostly advantageous. It also came with suckless software by default. This project was made independent and split from suckless in 2018 by Garbe.

In 2012 a core veteran member of suckless, a Spanish guy nicknamed Uriel, killed himself and became a meme.

Projects

Notable projects developed by the suckless group include dwm (a tiling window manager), st (a terminal emulator), dmenu (a dynamic menu/launcher), surf (a minimalist web browser) and sbase (a minimal implementation of basic Unix tools).

However there are many more (IRC clients, file formats, presentation software, ...), check out their website.

See Also


sudoku

Sudoku

Sudoku is a puzzle based on filling a grid with numbers that is hugely popular even among normies such as grandmas and grandpas who find this stuff in magazines for elderly people. The goal is to fill in all squares of a 9x9 grid, prefilled with a few clue digits, with digits 1 to 9 so that no digit repeats in any column, row and 3x3 subgrid. It is like a crossword puzzle for people who lack general knowledge, but it's also pretty suckless, a pure logic-based puzzle whose generation and solving can be relatively easily automatized (unlike generating crosswords which requires some big databases). The puzzle is a pretty fun singleplayer game, posing opportunities for nice mathematical research and analysis as well as a comfy programming exercise. Sudokus are a bit similar to magic squares. There also exist many similar kinds of puzzles that work on the principle of filling a grid so as to satisfy certain rules given initial clues, many of these are implemented e.g. in Simon Tatham's Portable Puzzle Collection.

Curiously sudoku has its origins in agricultural designs in which people wanted to lay out fields of different plants in more or less uniform distributions (or something like that, there are some papers about this from 1950s). The puzzle itself became popular in Japan in about 1980s and experienced a boom of popularity in the western world some time after 2000 (similar Asian puzzle boom was historically seen e.g. with tangram).

The following is an example of a sudoku puzzle with only the initial clues given:

 _________________
|  3 1|  5 7|  6  |
|     |9   8|  4  |
|4_7_8|6_ _2|1_ _5|
|7   5|  6  |4    |
|    6|  8 1|7 2  |
| _ _ |7_ _3|6_5_ |
|5 6  |  9  |    2|
|     |1   5|9   6|
| _ _3|8_2_6| _ _4|

The solution to the above is:

 _________________
|9 3 1|4 5 7|2 6 8|
|6 5 2|9 1 8|3 4 7|
|4_7_8|6_3_2|1_9_5|
|7 1 5|2 6 9|4 8 3|
|3 4 6|5 8 1|7 2 9|
|2_8_9|7_4_3|6_5_1|
|5 6 7|3 9 4|8 1 2|
|8 2 4|1 7 5|9 3 6|
|1_9_3|8_2_6|5_7_4|

We can see neither digit in the solution repeats in any column, row and any of the 9 marked 3x3 subgrids or, in other words, the digits 1 to 9 appear in each column, row and subgrid exactly once. These are basically the whole rules.

We generally want a sudoku puzzle to have initial clues such that there is exactly one possible (unique) solution. For this sudoku has to have at least 17 clues (this was proven by a computer). Why do we want this? Probably because in the puzzle world it is simply nice to have a unique solution so that human solvers can check whether they got it right at the back page of the magazine. This constraint is also mathematically more interesting.

How many possible sudokus are there? Well, this depends on how we view the problem: let's call one sudoku one grid completely filled according to the rules of sudoku. Now if we consider all possible such grids, there are 6670903752021072936960 of them. However some of these grids are "basically the same" because we can e.g. swap all 3s and 5s in any grid and we get basically the same thing as digits are nothing more than symbols here. We can also e.g. flip the grid horizontally and it's basically the same. If we take such things into account, there remain "only" 5472730538 essentially different sudokus.

Sudoku puzzles are sometimes assigned a difficulty rating that is based e.g. on the techniques required for its solving.

Of course there exist variants of sudoku, e.g. with different grid sizes, extended to 3D, different constraints on placing the numbers etc.

Solving Sudoku

There are two topics to address: solving sudoku by people and solving sudoku by computers.

Humans almost exclusively use logical reasoning techniques to solve sudoku, which include:

For computers the traditional 9x9 sudoku is nowadays pretty easy to solve, however solving an NxN sudoku is an NP complete problem, i.e. there most likely doesn't exist a "fast" algorithm for solving a generalized NxN sudoku, even though the common 9x9 variant can still be solved pretty quickly with today's computers by using some kind of "smart" brute force, for example backtracking (or another state tree search) which recursively tries all possibilities and at any violation of the rules steps back to change the previously placed number. Besides this a computer can of course use all the reasoning techniques that humans use, such as keeping a set of possible values for each square and reducing those sets until only one possibility stays. The approaches of reasoning and brute force may also be combined: first apply the former and when stuck fall back to the latter.
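
To make this more concrete, below is a minimal backtracking solver sketched in C (just an illustration, not code from any specific program; the grid is supposed to be a flat array of 81 chars holding digit values 1 to 9, with 0 meaning an empty square):

// check if digit d (1 to 9) can be placed at row r, column c
int canPlace(const char *grid, int r, int c, char d)
{
  for (int i = 0; i < 9; ++i) // check the row, column and 3x3 subgrid
    if (grid[r * 9 + i] == d || grid[i * 9 + c] == d ||
      grid[(r / 3 * 3 + i / 3) * 9 + c / 3 * 3 + i % 3] == d)
      return 0;

  return 1;
}

// fill all empty squares from index i on, return 1 on success
int solve(char *grid, int i)
{
  while (i < 81 && grid[i] != 0) // find the next empty square
    i++;

  if (i == 81)
    return 1; // no empty square left, we're done

  for (char d = 1; d <= 9; ++d)
    if (canPlace(grid, i / 9, i % 9, d))
    {
      grid[i] = d;

      if (solve(grid, i + 1))
        return 1;

      grid[i] = 0; // dead end, backtrack
    }

  return 0; // every digit violates the rules, step back
}

Calling solve(grid, 0) fills the grid in place and returns 1 if a solution exists.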

Generating Sudoku

{ I haven't personally tested these methods yet, I'm just writing what I've read on some web pages and ideas that come to my mind. ~drummyfish }

Generating sudoku puzzles is non-trivial. There are potentially many different algorithms to do it, here we just outline some common simple approaches.

Firstly we need to have implemented basic code for checking the validity of a grid and also some automatic solver, e.g. based on backtracking.

For generating a sudoku we usually start with a completely filled grid and keep removing numbers to leave only a few that become the initial clues. For this we first have to know how to generate the solved grids. Dumb brute force (i.e. generating completely random grids and testing their validity) won't work here as the probability of finding a valid grid this way is astronomically low (seems to be around 10^(-56)). What may work is to randomly fill a few squares so that they don't break the rules and then apply our solver to fill in the rest of the squares. A yet simpler way may be to have a database of a few hand-made grids, then pick one of them and apply some transformations that keep the grid valid: these include swapping two rows or columns within the same band of three (swapping arbitrary rows/columns would break the 3x3 subgrids), swapping whole bands, transposing, flipping the grid, rotating it 90 degrees or swapping any two digits (e.g. swap all 7s with all 9s).
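
A few of the mentioned validity preserving transformations may look e.g. like this (again just a sketch, using the same flat 81 char grid as in the solver above):

// swap all occurrences of digits a and b (digits are just symbols)
void swapDigits(char *grid, char a, char b)
{
  for (int i = 0; i < 81; ++i)
    if (grid[i] == a)
      grid[i] = b;
    else if (grid[i] == b)
      grid[i] = a;
}

// swap rows r1 and r2; only legal within one band, i.e. r1 / 3 == r2 / 3
void swapRows(char *grid, int r1, int r2)
{
  for (int c = 0; c < 9; ++c)
  {
    char tmp = grid[r1 * 9 + c];
    grid[r1 * 9 + c] = grid[r2 * 9 + c];
    grid[r2 * 9 + c] = tmp;
  }
}

// transpose the grid, i.e. turn rows into columns
void transpose(char *grid)
{
  for (int r = 0; r < 9; ++r)
    for (int c = r + 1; c < 9; ++c)
    {
      char tmp = grid[r * 9 + c];
      grid[r * 9 + c] = grid[c * 9 + r];
      grid[c * 9 + r] = tmp;
    }
}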

Once we have a completely filled grid, generating a non-unique (more than one solution) sudoku puzzle is trivial -- just remove a few numbers. But as stated, we usually don't want non-unique sudokus.

For a unique solution sudoku we have to check that there still exists exactly one solution after each removal of a number from the grid, for which we can again use our solver. Of course we should optimize this process by quitting the check as soon as a second solution is found; we don't need to know the exact count of the solutions, only whether it differs from one.
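
Such a check can be a small modification of the backtracking solver shown above, for example (canPlace is the same helper as before):

// count the puzzle's solutions, stopping as soon as we know there are >= 2
int countSolutions(char *grid, int i)
{
  while (i < 81 && grid[i] != 0)
    i++;

  if (i == 81)
    return 1; // grid completely filled: one solution found

  int count = 0;

  for (char d = 1; d <= 9 && count < 2; ++d) // quit early at 2
    if (canPlace(grid, i / 9, i % 9, d))
    {
      grid[i] = d;
      count += countSolutions(grid, i + 1);
      grid[i] = 0; // restore the square
    }

  return count;
}

The puzzle then has a unique solution exactly if countSolutions(grid, 0) returns 1.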

The matter of generating sudokus is further complicated by taking into account the difficulty rating of the puzzle.


suicide

Suicide

{ I hate disclaimers but I'm not advising you to commit fucking suicide, OK? I mean it's an option and sometimes it's the best option, but I want you to live if it's at least a little possible -- remember, LRS loves all life and all life is precious. We will all die, no need to rush it. Also if you're feeling like shit you can send me a mail, we can talk. ~drummyfish }

Suicide is when someone voluntarily kills himself. Suicide offers an immediate escape from capitalism and is therefore a kind of last-resort hope; it is one of the last remaining freedoms in this world, even though capitalists can't profit from dead people and so are working hard on preventing people from killing themselves (rather than trying to make them NOT WANT TO kill themselves of course).


sw

Software

Software (SW) consists of the programs that run on a computer, i.e. it is the computer's non-physical part (as opposed to hardware); for example an operating system, the internet browser etc. Software is created by programming.

Usually we can pretty clearly say what is software and what is hardware, but there are cases where it's debatable. Normally software is that part of the computer which can relatively easily be changed (i.e. reinstalled by typing a few commands or clicking a few buttons) while hardware is hard-wired, difficult to modify, and not expected or designed to be modified. Nevertheless e.g. some firmware is a kind of software in the form of instructions which is however often installed in a special kind of memory that's difficult to reprogram and not expected to be reprogrammed often -- some software may even be "burned in" into a circuit so that it could only be changed by physically rewiring the circuit (e.g. the ME spyware in Intel CPUs has a built-in minix operating system). And this is where it may sometimes be difficult to decide where the line is drawn. This issue is encountered e.g. by the FSF which certifies some hardware that works with free software as "Respects Your Freedom" (RYF), and they have a very specific definition of what to them counts as software.


sw_rendering

Software Rendering

Software (SW) rendering refers to rendering computer graphics without the help of a graphics card (GPU), i.e. computing the images with the CPU only. This mostly means rendering 3D graphics but can also refer to other kinds of graphics such as drawing fonts or video. Before GPUs were invented, all rendering was of course done in software -- games such as Quake or Thief were designed with SW rendering and only added optional GPU acceleration later. SW rendering for traditional 3D graphics is also called software rasterization, as rasterization is the basis of current real-time 3D graphics.

SW rendering has advantages and disadvantages, though from our point of view its advantages prevail (at least given that only capitalist GPUs exist nowadays). Let's first get the disadvantages out of the way: SW rendering is much slower than GPU graphics -- GPUs are designed to perform graphics-specific operations very quickly and, more importantly, they can process many pixels (and other elements) in parallel, while a CPU has to compute pixels sequentially one by one, and that on top of all the other computations it is performing. This results in a much lower FPS in SW rendering. For this reason SW rendering is also normally of lower quality (lower resolution, nearest neighbour texture filtering, ...) to keep the FPS workable. Nevertheless thanks to the ginormous speeds of today's CPUs simple fullscreen SW rendering can be pretty fast on PCs and achieve even above 60 FPS; on slower CPUs (typically embedded) SW rendering is normally usable at around 30 FPS if resolutions are kept small.

On the other hand SW rendering is more portable (as it can be written purely in a portable language such as C), less bloated and eliminates the dependency on GPU so it will be supported almost anywhere as every computer has a CPU, while not all computers (such as embedded devices) have a GPU (or, if they have it, it may not be sufficient, supported or have a required driver). SW rendering may also be implemented in a simpler way and it may be easier to deal with as there is e.g. no need to write shaders in a special language, manage transfer of data between CPU and GPU or deal with parallel programming. SW rendering is the KISS approach.

SW rendering may also utilize a much wider variety of rendering techniques than only 3D rasterization traditionally used with GPUs and their APIs, thanks to not being limited by hard-wired pipelines, i.e. it is more flexible. This may include splatting, raytracing or BSP rendering (and many other "pseudo 3D" techniques).

A lot of software and rendering frameworks offer both options: accelerated rendering using GPU and SW rendering as a fallback (in case the first option is not possible). Sometimes there exists a rendering API that has both an accelerated and software implementation (e.g. TinyGL for OpenGL).

For simpler and even somewhat more complex graphics purely software rendering is mostly the best choice. LRS suggests you prefer this kind of rendering for its simplicity and portability, at least as one possible option. On devices with lower resolution not many pixels need to be computed so SW rendering can actually be pretty fast despite low specs, and on "big" computers there is nowadays usually an extremely fast CPU available that can handle comfortable FPS at higher resolutions. There is a LRS software renderer you can use: small3dlib.

SW renderers are also written for the purpose of verifying rendering hardware, i.e. as a reference implementation.

Note that SW rendering doesn't mean our program is never touching GPU at all, in fact most personal computers nowadays require some kind of GPU to even display anything. SW rendering only means that computation of the image to be displayed doesn't use any hardware specialized for this purpose.

Some SW renderers make use of specialized CPU instructions such as MMX which can make SW rendering faster thanks to handling multiple data in a single step. This is kind of a middle way: it is not using a GPU per se but only a mild form of hardware acceleration. The speed won't reach that of a GPU but will outperform a "pure" SW renderer. However the disadvantage of a hardware dependency is still present: the CPU has to support the given instruction set. Good renderers only use these instructions optionally and fall back to a general implementation in case they are not supported.
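
As a small illustration, this is how a renderer might blend two rows of pixels (e.g. for a 50% transparency effect) 16 bytes at a time with the x86 SSE2 intrinsics (the successors of MMX), falling back to plain C where they're unavailable. Take it only as a sketch of the general idea:

#ifdef __SSE2__
  #include <emmintrin.h>
#endif

// blend two rows of 8bit pixel components 50/50 into dst
void blendRows(unsigned char *dst, const unsigned char *a,
  const unsigned char *b, int count)
{
  int i = 0;

#ifdef __SSE2__
  for (; i + 16 <= count; i += 16) // handle 16 bytes in a single step
    _mm_storeu_si128((__m128i *) (dst + i),
      _mm_avg_epu8(_mm_loadu_si128((const __m128i *) (a + i)),
                   _mm_loadu_si128((const __m128i *) (b + i))));
#endif

  for (; i < count; ++i) // general fallback, also handles leftover bytes
    dst[i] = (a[i] + b[i] + 1) / 2; // same rounding as _mm_avg_epu8
}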

Programming A Software Rasterizer

{ In case small3dlib is somehow not enough for you :) ~drummyfish }

Difficulty of this task depends on the features you want -- a super simple flat shaded renderer (no textures, no smooth shading) is relatively easy to make, especially if you don't need a movable camera, can afford to use floating point etc. See the details of 3D rendering, especially how the GPU pipelines work, and try to imitate them in software.

The core of these renderers is the triangle rasterization algorithm which, if you want, can be very simple -- even a naive one will give workable results -- or pretty complex and advanced, using various optimizations and things such as the top-left rule to guarantee no holes and overlaps of triangles. Remember this function will likely be the performance bottleneck of your renderer so you want to put effort into optimizing it to achieve a good FPS.

Once you have triangle rasterization, you can draw 3D models which consist of vertices (points in 3D space) and triangles between these vertices (it's very simple to load simple 3D models e.g. from the obj format) -- you simply project (using perspective) the 3D position of each vertex to screen coordinates and draw triangles between these pixels with the rasterization algorithm. Here you also need to solve visibility, i.e. the possible overlap of triangles on the screen, correctly drawing those nearer the view in front of those that are further away -- a very simple solution is a z buffer, but to save memory you can also e.g. sort the triangles by distance and draw them back-to-front (painter's algorithm). You may add a scene data structure that can hold multiple models to be rendered.

If you additionally want a movable camera and models that can be transformed (moved, rotated, scaled, ...), you will need to look into some linear algebra and transform matrices that allow efficiently computing positions of vertices of a transformed model against a transformed camera -- you do this the same way as basically all other 3D engines (look up e.g. some OpenGL tutorials, see model/view/projection matrices etc.).

If you also want texturing, matters get again a bit more complicated: you need to compute barycentric coordinates (special coordinates within a triangle) as you're rasterizing the triangle, and possibly apply perspective correction (otherwise you'll be seeing distortions). You then map the barycentric coordinates of each rasterized pixel to UV (texturing) coordinates which you use to retrieve specific pixels from a texture.

On top of all this you may start adding all the advanced features of typical engines such as acceleration structures that for example discard models that are completely out of view, LOD, instancing, MIP maps and so on.
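
To give a taste of the above, the perspective projection step can in its simplest form look something like this (a mere sketch, supposing the camera sits in the origin and looks along the positive z axis):

#define SCREEN_W 640
#define SCREEN_H 480

// project a 3D point in camera space (z > 0) to screen pixel coordinates
void project(float x, float y, float z, int *sx, int *sy)
{
  float f = 300; // focal length, determines the field of view

  *sx = SCREEN_W / 2 + (int) (f * x / z); // the perspective divide by z
  *sy = SCREEN_H / 2 - (int) (f * y / z); // minus as screen y grows downwards
}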

Possible tricks, cheats and optimizations you may utilize include:

Specific Renderers

These are some notable software renderers:

See Also


systemd

Systemd

Systemd, also shitstemd, is a horribly disastrous bloated, anti-Unix, "FOSS" "software suite" used for initialization of an operating system and handling services like logging in or managing network connections. It is a so called PID 1 process, or an init system. Systemd has been highly criticised by the proponents of suckless and LRS and even normies for its enormous amount of bloat, ugliness, anti-Unix-philosophy design, feature creep, security vulnerabilities and other stuff. Unfortunately it is being adopted by many GNU/Linux distributions including Arch Linux and Debian. Some distros such as Devuan just said no to this shit and forked to a non-systemd version.

Systemd was born when Harry Pot... ummm Lennart Poettering had an unprotected gay sex with Kay Sievers.

For more detailed bashing of systemd see e.g. https://nosystemd.org/. The site sums up systemd with a fitting quote: "If this is the solution, I want my problem back". Another sum up by suckless: http://suckless.org/sucks/systemd/. There is also e.g. https://sysdfree.wordpress.com/.


tangram

Tangram

{ I made a simple tangram game in SAF, look it up if you want to play some tangram. ~drummyfish }

Tangram is a simple yet greatly amusing old puzzle game in which the player tries to compose a given shape (of which only the silhouette is seen) out of given basic geometric shapes such as triangles and squares. It is a rearrangement puzzle. Many thousands of shapes can be created from just a few geometric pieces, some looking like animals, people and man made objects. This kind of puzzle has been known for a long time -- the oldest recorded tangram is Archimedes' box (a square divided into 14 pieces), over 2000 years old. In general any such puzzle is called tangram, i.e. it is seen as a family of puzzle games, however tangram may also stand for the modern tangram, one with 7 polygons which comes from 18th century China and which then became very popular also in the west and even caused a so called "tangram craze" around the year 1818. Unless mentioned otherwise, we will talk about this modern version from now on.

 _________________
|\_     big     _/|
|  \_   tri   _/  |
|tri_\_     _/    |
| _/   \_ _/  big |
|<_ sqr _X_   tri |
|  \_ _/tri\_     |
|mid \_______\_   |
| tri  \_ para \_ |
|________\_______\|

Divide square like this to get the 7 tangram pieces. Note that the parallelogram is allowed to be flipped when creating shapes as it has no mirror symmetry (while all other shapes do).

LRS considers tangram to be one of the best games as it is extremely simple to make and learn, it has practically no dependencies (computers, electricity, ... one probably doesn't even have to have the sense of sight), yet it offers countless hours of fun and allows deep insight, there is art in coming up with new shapes, math in counting possibilities, good exercise in trying to program the game etc.

Tangram usually comes as a box with the 7 pieces and a number of cards with shapes for the player to solve. Each card has the solution on its back side. Some shapes are easy to solve, some are very difficult.

          _/|              
         /  |                           _/|\_
        |\_ |                          /  |  \_
        |  \|                         |   |    \_
        | _/                      _   |   |______\
        |/ \_  ________         _/ \_ | _/ \_
       /     \/       /       _/     \|/     \
       \_   _/_______/      _/         \_   _/
         \_/|              /_____________\_/   
        _/  |                  |         _/
      _/    |                  |       _/
    _/      |                  |     _/
  _/        |                  |   _/
 /__________|                  | _/
 \_         |                  |/|
   \_       |                 /  |
     \_     |                 \_ | 
     _/\_   |                  |\|
   _/    \_ |                  |  \_
  /________\|                  |____\            

Two tangram shapes: bunny and stork (from 1917 book Amusements in Mathematics).

{ I found tangram to be a nice practice for letting go of ideas -- sometimes you've got an almost complete solution that looks just beautiful, it looks like THE only one that just has to be it, but you can't quite fit the last pieces. I learned that many times I just have to let go of it, destroy it and start over, usually there is a different, even more beautiful solution. This experience may carry over to practical life, e.g. programming. ~drummyfish }

Can tangram shapes be copyrighted? As always nothing is 100% clear in law, but it seems many tangram shapes are simple enough not to pass the threshold of originality for copyright. Furthermore tangram is old and many shapes have been published centuries ago, making them public domain, i.e. if you find some old, public domain book (e.g. the book The Fashionable Chinese Puzzle, Amusements in Mathematics or Ch'i ch'iao hsin p'u: ch'i chiao t'u chieh) with the shape you want to use, you're most definitely safe to use it. HOWEVER watch out, a collection of shapes, their ordering and/or shapes including combinations of colors etc. may be considered non-trivial enough to spawn copyright (just as collections of colors may be copyrightable despite individual colors not being copyrightable), so do NOT copy whole shape collections.

Tangram paradoxes are an interesting discovery of this game -- a paradox is a shape that looks like another shape with added or subtracted piece(s), despite both being composed of the same pieces. Of course geometrically this isn't possible, the missing/extra area is always compensated somewhere, but to a human eye this may be hard to spot (see also infinite chocolate). New players get confused when they encounter a paradox for the first time, they think they solved the problem but are missing a piece, or have an extra one, while in fact they just made a wrong shape. TODO: example

Tips for solving:

TODO: some PD shapes, math, stats, ...


tas

Tool Assisted Speedrun

Tool assisted speedrun (TAS, also more generally tool assisted superplay) is a category of game speedruns in which help of any tools is allowed, even those that would otherwise be considered cheating, e.g. scripts, savestates, aimbots or time manipulation, however NOT those that alter the game itself. This makes it possible to create flawless, perfect or near-perfect runs which can serve as a theoretical upper limit for what is achievable by humans -- and of course TAS runs are pretty fun to watch. The normal, non-TAS runs are called RTA (real time attack). For example the current (2022) RTA world record of Super Mario Bros is 4.58.881 while the TAS record is 4.41.27.

{ Watching a TAS is kind of like watching the God play the game. I personally like to watch Trackmania TASes, some are really unbelievable. Elastomania and Doom TASes are also pretty fucked up. Also note that SAF games and Anarch have TAS support. ~drummyfish }

There is a website with videos of game TASes: https://tasvideos.org/.

TAS does NOT allow hacking the game in other ways than what's possible to achieve by simply playing the game, i.e. it is not possible to hex edit the game's code before running it or manipulate its RAM content at run time with external tools. However note that some games are buggy and allow things such as altering their RAM content or code by merely playing the game (e.g. Pokemon Yellow allows so called arbitrary code execution), which generally IS allowed. The goal of TAS is merely to find, as best as we can, the series of game inputs that will lead to completing the game as fast as possible. For this the game pretty much needs to be deterministic, i.e. the same sequence of inputs must always reproduce the same run when replayed later.

TAS runs coexist alongside RTA (non-TAS) runs as separate categories that are beneficial to each other: RTA runners come up with speedrunning techniques that TAS programmers can perfectly execute and vice versa, TAS runners many times discover new techniques and ideas for RTA runners (for example the insane discovery of the groundbreaking noseboost when TAS was introduced to Trackmania). In fact RTA and TAS runners are many times the very same people. Of course if you submit a TAS run in an RTA category, you'll be seen as a cheater.

Creating a TAS is not an easy task, it requires great knowledge of the game (many times including its code) and its speedrunning, as well as a lot of patience and often collaboration with other TASers. TASes are made offline (not in real time), i.e. hours of work are required to program minutes or even seconds of the actual run. Many paths need to be planned and checked. Compared to RTAs, the focus shifts from mechanical skill towards skillful mathematical analysis and planning. Besides this some technological prerequisites are necessary: the actual tools to assist with the creation of the TAS. For many new proprietary games it is extremely difficult to develop the necessary tools as their source code isn't available, their assembly is obscured and littered with "anti-cheating" malware. Many "modern" (even FOSS) games are additionally badly programmed and e.g. lack deterministic physics, which makes precise TASing almost impossible (as the traditional precise crafting of inputs requires deterministic behavior). The situation is better with old games that are played in emulators such as DOS games (Doom etc.) or games for consoles like GameBoy -- emulators can give us complete control over the environment, they allow saving and loading the whole emulator state at any instant, we may slow the time down arbitrarily, rewind and script the inputs however we wish (an advanced technique includes e.g. bruteforcing: exhaustively checking all possible combinations of inputs over the following few frames to see which one produces the best time save). For games that don't have TAS tools people at least try to do the next best thing with segmented speedruns.

There also exists a term tool assisted superplay which is the same principle as TAS but basically with the intention of just flexing, without the goal of finishing the game fast (e.g. playing a Doom level against hundreds of enemies without taking a single hit).

Some idiots are against TASes for various reasons, mostly out of fear that TASers will use the tools to CHEAAAAAAAT in RTAs or that TASes will make the human runners obsolete etc. That's all bullshit of course, it's like being against computers out of fear they would make human calculators obsolete. Furthermore TASes always coexist perfectly peacefully with RTA runs as can e.g. be seen in the case of Trackmania -- in 2021 TAS tools started to appear for Trackmania and many people feared it would kill the game's competition, however after the release of the tools no such disaster happened, TAS became hugely popular and now everyone loves it, human competition happily continues, plus the development of the tools actually helped uncover many cheaters among the top players (especially Riolu who was forced to leave the scene, this caused a nice drama in the community).

We could even go as far as to say that morally TAS is the superior way of speedrunning as it puts humans in the role of thinkers rather than treating them as wannabe machines who waste enormous amounts of time on grinding real time runs with arbitrary obstacles (such as requiring a run to not be spliced etc.), which a real machine can simply do instantly and perfectly. There is really no point in someone spending 10000 hours of life on getting lucky and nailing a series of frame perfect keypresses when a computer can do this in 1 second.


tattoo

Tattoo

Tattoo is a body disfigurement formed by injecting ink under the skin to permanently mark it. Tattoo, similarly to piercing, suits, dyed hair etc., is a sign of egoism, narcissism, herd mentality, identity crisis, overconfidence of the incompetent and a cheap attempt at desperately trying to get attention or make oneself look interesting. We highly advise to distance oneself from anyone having a voluntarily made tattoo.


tech

Tech

Tech is a short for technology.


technology

Technology

Technology (from Greek tekhnologia, "systematic treatment of art", also just "tech") encompasses tools and knowledge of making such tools invented and achieved mainly with the help of science and by long systematic effort. This includes everything from stone tools to space rockets and artificial intelligence. On the Internet, as well as on this Wiki, this term is often used with the focus on computer technology, i.e. hardware and software, as this is the kind of technology that is being discussed and developed the most in our days. Technology, like fire, should serve us, but can also be dangerous and often gets misused and abused.

The foremost purpose of technology is to make people not have to work. Proponents of dystopian societies, such as capitalists, are afraid of technology "taking people's work" -- such people are for sure greatly idiotic and often end up abusing technology in the completely opposite manner: for enslaving and oppressing people. Proponents of good technology correctly try to make technology do work for humans so that people can actually live happy lives and do what they want. With this in mind we have to remember that one of the most important concepts in technology is minimalism, as that is a necessary prerequisite for technological freedom.

Knowledge of older technology gets lost extremely quickly in society -- this is a crucial realization that goes against the naive idea, usually held by the young, that we somehow retain knowledge of all technology invented from the dawn of man until today. In history our society has always only held knowledge of technology it was CURRENTLY ACTIVELY USING; knowledge of decades outdated technology only stays in the hands of extremely few individuals and perhaps in some obscure books which ARE UNREADABLE to most, sometimes to all; yet older technology often gets forgotten completely. For example the renaissance had to largely reinvent many arts and sciences of making buildings and statues of antiquity because the middle ages simply forgot them. A more recent example can be found at NASA and their efforts to recreate THEIR OWN old rocket engines: you would think that since they literally have detailed documentation of those engines, they'd be able to simply make them again, but that's not the case because the small undocumented (yet crucial) know-how of the people who built the engines decades ago was lost with those individuals who died or retired in the meanwhile; NASA had to start a ginormous project to reinvent its own relatively recent technology. The same is happening in the field of programming: modern soydevs just CANNOT create as efficient software as hackers back then did because, due to the normalization of wasting computing resources, they threw away the knowledge of optimization techniques and wisdom in favor of bullshit such as "soft skills" and memorizing one billion genders and personal pronouns. One might naively think that e.g. since our agriculture is highly efficient and advanced thanks to all the immense complexity of our current machines, simple farming without machines would be a child's play for us, however the opposite is true: we no longer know how to farm without machines. If a collapse comes, we are simply fucked.


ted_kaczynski

Ted Kaczynski

"The Industrial Revolution and its consequences have been a disaster for the human race." --Ted Kaczynski

Ted Kaczynski (22.5.1942 - 10.6.2023, RIP), known as Unabomber, was an imprisoned American mathematician who lived a simple life in the nature, warned of the dangers of advanced technology and killed several people by mailing them bombs in order to bring attention to his manifesto that famously starts with the words "The Industrial Revolution and its consequences have been a disaster for the human race". Besides being one of the most famous mass murderers he is very well known in the tech community.

Ted was born in Chicago, US. As a kid he was very shy. He was also extremely smart (IQ measured at 167), skipped a few grades, graduated from Harvard at 20 years old and got a PhD at 25 at the University of Michigan. Then he became a professor at the University of California, Berkeley, until his resignation in 1969.

Fun fact: at one point he considered a gender change surgery.

In 1971 he moved to a remote cabin in the woods in Montana where he lived in a primitive way with no electricity or running water. He grew more and more disenchanted with the society, especially with its technology and how it's enslaving and destroying humanity. The last straw may have been the moment when a road was built nearby his cabin, in the middle of the nature he loved.

He started sending hand-made bombs to various universities and airports (hence the nickname Unabomber, from university and airline bomber). He managed to kill 3 people and injured 23 others. He was arrested on April 3, 1996 in his cabin and was sentenced to life in prison. He died in prison in 2023 after having been diagnosed with cancer; reports said he committed suicide.

Manifesto

The manifesto is named Industrial Society and Its Future. In it he refers to his movement as a Freedom Club (FC). Let's start by summarizing it:

{ The following is a sum up according to how I personally understood it, there's most likely subjective bias but I did my best. ~drummyfish }

First he bashes "leftists", analyses their psychology and says they are kind of degenerate sheeple, characterized by low self esteem, inventing bullshit artificial issues (such as the issue of political correctness), sometimes using violence. He also criticizes conservatives for supporting technological and economical growth which in his view inevitably brings a shift in societal values and said degeneracy. The usual societal issues are presented, such as bad mental health, people being slaves to the system, feeling powerless, having no security, no autonomy etc. The cause of unhappiness and other human issues is identified as people not being able to fulfill what he sees as a necessity for a fulfilling life, the so called power process, the process of considerable struggle towards a real goal that can be achieved, such as obtaining food by hunting -- he argues nowadays it's "too easy" to satisfy these basic needs and people invent artificial "surrogate" activities (such as sports, activism and even science) to try to fulfill the power process, however he sees these artificial activities as harmful, not real goals. It is mentioned we only have freedom in unimportant aspects of life, the system controls and regulates everything, brainwashes people etc. He defines real freedom as the opportunity to go through the power process naturally and being in control of one's circumstances. He talks a lot about the modification of humans themselves, either by advanced psychological means (propaganda), drugs or genetic modification, which he sees as a future danger. A number of principles by which society works are outlined and it is concluded that the industrial society can't be reformed, that a revolution is needed (not necessarily a violent one). Ted argues the system needs to be destroyed, that we have to get back to nature, and for this revolution he outlines a plan and certain recommendations (creation of an ideology for intellectuals and common folk, the necessity of the revolution being world-wide etc.). He ends with again bashing "leftism" and warning that "leftists" must never be collaborated with.

Now let us leave a few comments on the manifesto. Firstly we have to say the text is easy to read, well thought through and Ted makes some great points, with many of which we completely agree; this includes the overall notion of technology having had mostly negative effects on recent society, the pessimistic view of our future and the criticism of "harmful modern bullshit" such as political correctness. He analyzes and identifies some problems of society very well (e.g. the propaganda that's so advanced that even its creators aren't usually consciously aware they're creating propaganda; his analysis of the inner workings of the system is spot on). Nevertheless we also disagree on many points. Firstly we use different terminology; people whom Ted calls leftists and whom he accuses of degeneracy and harmfulness we call pseudoleftists, and we believe in a truly leftist society (i.e. nonviolent, altruistic, non-censoring, loving, without fascist tendencies). We disagree with Ted's fundamental assumption that people can't change, i.e. that people are primitive animals that need to live primitive lives (go through the power process by pursuing real goals such as obtaining food by hunting) in order to be happy (we are not against primitivism but we support it for other reasons). We believe society can become adult, just like an individual, if it is raised properly (i.e. with effort), that the primitive side of a human can be overshadowed by the intellectual side and that activities he calls surrogate (and considers undesirable) can be fulfilling. We think that in a sane, adult society advanced technology can be helpful and compatible with happy, fulfilling lives of people, even if the current situation is anything but. And of course, we are completely nonviolent and disagree with murdering people for any reason, including bringing attention to a manifesto.


teletext

Teletext

Teletext is a now pretty much obsolete technology that allowed broadcasting extremely simple read-only text/graphical pages along with the TV signal so that people could browse them on their TVs. It was used mostly in the 70s, 80s and 90s; with the world wide web teletext pretty much died.

{ Just checked on my TV and it still works in 2022 here. For me teletext was something I could pretend was "the internet" when I was little and when we didn't have internet at home yet, it was very cool. Back then it took a while to load any page but I could read some basic news or even browse graphical logos for cell phones. Nowadays TVs have buffers and have all the pages loaded at any time so the browsing is instantaneous. ~drummyfish }

The principal difference against the Internet was that teletext was broadcast, i.e. it was one-way communication. Users couldn't send back any data or even request any page, they could only wait and catch the pages that were broadcast by TV stations (this had advantages though, e.g. it couldn't be DDOSed and it couldn't spy on its users as they didn't send any information back). Each station would have its own teletext with fewer than 1000 pages -- the user would type the three digit number of the page he wanted to load ("catch") and the TV would wait until that page was broadcast (which might have taken around 30 seconds at most), then display it. The data of the pages were embedded into unused parts of the TV signal.

The pages allowed fixed-width text and some very blocky graphics, both could be colored with very few basic colors. It looked like something you render in a very primitive terminal.

See Also


temple_os

Temple OS

Temple OS is a funny operating system made by a schizo guy Terry Davis who has become a meme and achieved legendary status for this creation in the Internet tech circles as it's extremely impressive that a single man creates such a complex OS and also the OS features and the whole context of its creation are quite funny. It has a website at https://templeos.org.

According to Terry, God commanded him to write TempleOS and guided him in the development: for example it was demanded that the resolution be 640x480. It is written in HolyC, Terry's own programming language. The OS comes with GUI, 2D and 3D library, games and even a program for communicating with God.

Notable Temple OS features and programs are:

In his video blogs Terry talked about how technology became spoiled and that TempleOS is supposed to be simple and fun. For this and other reasons the OS is limited in many ways, for example:

Temple OS source code has over 100000 LOC. It is publicly available and said to be in the public domain, however there is no actual license/waiver in the repository besides some lines such as "100% public domain" which are legally questionable and likely ineffective (see licensing).


tensor_product

Tensor Product

TODO

a (x) b = [a0 * b0, a0 * b1, a0 * b2, ... a1 * b0, a1 * b1, ... an * b0, an * b1, ...]
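
For finite vectors the above is trivial to compute, e.g. like this (a small C sketch):

// tensor product of vectors a (length m) and b (length n),
// writes the m * n products into result
void tensorProduct(const float *a, int m, const float *b, int n,
  float *result)
{
  for (int i = 0; i < m; ++i)
    for (int j = 0; j < n; ++j)
      result[i * n + j] = a[i] * b[j];
}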


terry_davis

Terry Davis

"An idiot admires complexity, a genius admires simplicity." --Terry Davis

Terry A. Davis, aka the divine intellect, born 1969 in Wisconsin, was a genius+schizophrenic programmer that singlehandedly created TempleOS in his own programming language called HolyC, and greatly entertained and enlightened an audience of followers until his tragic untimely death. For his programming skills and quality videos he became a legend and a meme in the tech circles, especially on 4chan which additionally valued his autistic and politically incorrect behavior.

He was convinced he could talk to God and that God commanded him to make an operating system with certain parameters such as 640x480 resolution, also known as the God resolution. According to himself he was gifted a divine intellect and was, in his own words, the "best programmer that ever lived". Terry was making YouTube talking/programming videos in which God was an often discussed topic, alongside valuable programming advice and a bit of good old racism. He was also convinced that the government was after him and often delved into the conspiracies against him, famously proclaiming that "CIA niggers glow in the dark" ("glowing in the dark" subsequently caught on as a phrase used for anything suspicious). He was in a mental hospital several times and later became homeless, but continued to post videos from his van. An entertaining fact is that he fell in love with a famous female physics YouTuber, Dianna Cowern, whom he stalked online. In 2018 he was killed by a train (officially a suicide but word has it CIA was involved) but he left behind tons of videos full of endless entertainment, and sometimes even genuine wisdom.

Terry, just like us, greatly valued simplicity and fun in programming; he was a low-level programmer who saw that technology went to shit and wanted to create something in the oldschool style, and he expressed his will to dedicate his creation to the public domain. This is of course extremely based and appreciated by us (though the actual public domain dedication wasn't executed according to our recommendations).


throwaway_script

Throw Away Script

Throw away script is a script for a one-time job which you "throw away" after its use. Such scripts may be ugly and badly written and don't have to follow LRS principles as their sole purpose is to quickly achieve something without any ambition to be good, future-proof, readable, reusable etc.

For example if you have a database in some old format and want to convert it to a new format, you write a throw away script to do the conversion, or when you're mocking up an idea for a game, you write a quick throw away prototype in JavaScript just to test how the gameplay would feel.

For throw away scripts it is acceptable and often preferable to use languages otherwise considered bad, such as Python or JavaScript.


tom_scott

Tom Scott

TODO

LOL this idiot tries to take some kind of ownership of ideas in a video called 14 science fiction stories in under 6 minutes (iQGl-ffVtaM), he just comes up with story ideas and says people have to have his permission for commercial use, even though it's probably not legally possible to own ideas like this. Fucking fascist. Do not support.


toxic

Toxicity

A social environment is said to be toxic if it induces a high psychological discomfort, toxic individuals are members of such environment that help establish it. Examples of toxic environments include a capitalist society and SJW social networks.


tranny_software

Tranny Software

Tranny software is a harmful software developed within and infected by the culture of the toxic LGBTFJJJGIIQWWQW SJW pseudoleftists, greatly characterized e.g. by codes of conduct, bad engineering and excluding straight white males from the development in the name of "inclusivity". Such software is retarded. It is practically always a high level of bloat with features such as censorship, bugs and spyware.

To be clear, tranny software does NOT stand for software written by transsexuals, it stands for a specific kind of software infected by fascism (in its features, development practices, culture, goals etc.) which revolves around things such as sexual identity. Of course with good technology it doesn't matter by whom it is made. For god's sake do NOT bully individuals for being transsexual! Refuse bad software.

Some characteristics of tranny software are:

Examples of tranny software are:

Example of software that doesn't seem to be tranny software:


transistor

Transistor

Transistor is a small semiconductor element of electronic circuits that can be used as an amplifier or a switch, and which is a basic building block of digital electronic computers, integrated circuits and many other electronic devices. Transistors replaced vacuum tubes and relays which were used in primitive computers of the 20th century; transistors can be made much smaller, cheaper, more reliable and, unlike relays, operated purely electronically and therefore much faster. Transistor has become the most manufactured device in history.

A transistor generally has three terminals. Its key principle is that of behaving like an electronically operated amplifier or switch: we can make a transistor open or closed (i.e. conducting or not conducting electricity) by applying different voltage or current (and we can also set it somewhere between open and closed). The voltage/current by which we control the transistor can be lower than that which we control, so we can see this as an amplifier: we can control a high current with a low current, i.e. the high current follows the low current but has a higher amplitude. We can also see this as a switch: by applying voltage/current we can make a wire connect (low resistivity) or disconnect (high resistivity) similarly to a physical switch. This switch behavior is important for computers because we can exploit it to implement binary (on/off) logic circuits.

A basic division of transistors is following:

Commonly used graphical symbols for transistor are (usually in a circle):

        E            E           D           D
        |            |           |           |
B___|.-'     B___|.-'     G   |--'    G   |--'
    |'>.         |'<.     -->-|--.    --<-|--.
        |            |           |           |
        C            C           S           S

BJT (NPN)    BJT (PNP)    FET (N)     FET (P)        

The first FET transistors were JFETs (junction-gate FETs), but by today they have been mostly replaced by MOSFETs (metal-oxide-semiconductor FETs), transistors using a metal oxide layer to separate the gate terminal, which gives them some nice properties over JFETs. These transistors are used to implement logic gates e.g. with the CMOS fabrication process, which uses complementary pairs of P and N channel FETs so that e.g. one of the pair is always off, which decreases power consumption.


triangle

Triangle

Triangle is a three sided polygon, one of the most basic geometric shapes. It is a convex shape. Furthermore it is a 2-simplex, i.e. the simplest "shape composed of sides" in 2 dimensions. Triangles are very important, they for example help us to compute distances or define functions like sine and cosine (see trigonometry).

{ In my favorite book Flatland triangles represent the lowest class of men, with isosceles triangles being the lowest as they are most similar to women, who are just straight lines. ~drummyfish }

Triangle consists of three vertices (usually labeled A, B and C), three sides (usually labeled a, b and c according to the side's opposing vertex) and three angles (usually labeled alpha, beta and gamma according to the closest vertex):

       B                           B
       .                           .
      / \                 c =     /|
   c /   \ a          hypotenuse / | a
    /     \                     /  |
   /_______\                   /___|
 A     b     C               A   b  C
    triangle               right triangle

Right triangle, a triangle with one angle equal to 90 degrees (pi / 2 radians), is an especially important type of triangle. Right triangles are used to define trigonometric functions, such as sine, cosine and tangent, as ratios of side lengths as a function of the triangle angle. For example in a right triangle (as drawn above) it holds that sin(alpha) = a / c.

Similar triangles are triangles that "have the same shape" (but may be of different sizes, positions and rotations). Two triangles are similar if the lengths of corresponding sides have the same ratios, or, equally, if they have the same inside angles. E.g. a triangle with side lengths 1, 2 and 3 is similar to a triangle with side lengths 2, 4 and 6. This fact is very useful in some geometric computations as it can help us determine unknown side lengths.

Equilateral triangle is a triangle whose sides have the same length, which means all its angles are also equal (60 degrees, pi / 3 radians). Equilateral triangles are of course all similar to each other. An isosceles triangle is a triangle with two sides of the same length. We can also distinguish acute and obtuse triangles (obtuse having one angle greater than 90 degrees).

In a triangle there exist two important types of helper line segments: median and altitude. Median goes from a triangle's vertex to the opposite side's center. Altitude goes from a vertex to the opposite side in a perpendicular direction to that side. Each triangle has three medians and three altitudes.

Some basic facts, features and equations regarding triangles are following (beware: many of them only hold in Euclidean geometry):

In non Euclidean ("crazy") geometries triangles behave weird, for example we can draw a triangle with three right angles on a surface of a sphere (i.e. its angles add to more than 180 degrees). This fact can be exploited by inhabitants of a space (e.g. our Universe) to find out if they in fact live in a non Euclidean space (and possibly determine the space's exact curvature).

Constructing triangles: if we are to construct (draw) a triangle with only partial knowledge of its parameters, we may exploit the above mentioned attributes to determine the things we don't explicitly know. For example if we're told to construct a triangle knowing only the lengths of its sides but not its angles, we can determine the angle between two of the sides using the law of cosines, at which point we can already draw all three vertices and just connect them. In other words just use your brain.
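
For example getting the angle gamma (the one between sides a and b, opposing side c) from the three side lengths may in C look like this (a sketch):

#include <math.h>

// angle (in radians) between sides a and b of a valid triangle a, b, c
double angleGamma(double a, double b, double c)
{
  return acos((a * a + b * b - c * c) / (2 * a * b)); // law of cosines
}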

Triangles also play a big role e.g. in realtime 3D rendering where they are used as a basic building block of 3D models, i.e. we approximate more complex shapes with triangles because triangles are simple (thanks to being a simplex) and so have nice properties such as always lying in a plane so we cannot ever see both its front and back side at once. They are relatively easy to draw (rasterize) so once we can draw triangles, we can also draw complex shapes composed of triangles. In general triangles, just as other simple shapes, can be used to approximate measures and attributes -- such as area or center of mass -- of more complex shapes, even outside computer graphics. For example to determine an area of some complex shape we approximate it by triangles, then sum up the areas of the triangles.

Barycentric coordinates provide a coordinate system that can address specific points inside any triangle -- these are used e.g. in computer graphics for texturing. The coordinates are three numbers that always add up to 1, e.g. [0.25, 0.25, 0.5]. The coordinates can be thought of as the ratios of the areas of the three subtriangles the point creates to the area of the whole triangle. Points inside the triangle have all three numbers positive. E.g. the coordinates of the vertices A, B and C are [1, 0, 0], [0, 1, 0] and [0, 0, 1], and the coordinates of the triangle center are [1/3, 1/3, 1/3].
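
Computing the barycentric coordinates of a 2D point boils down to ratios of signed areas (cross products, the same ones used for winding below), for example like this (a sketch; a non-degenerate triangle is supposed):

// doubled signed area of the triangle given by three 2D points
float area2(float ax, float ay, float bx, float by, float cx, float cy)
{
  return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
}

// barycentric coordinates (bA, bB, bC) of point P against triangle ABC
void barycentric(float ax, float ay, float bx, float by, float cx, float cy,
  float px, float py, float *bA, float *bB, float *bC)
{
  float area = area2(ax, ay, bx, by, cx, cy); // the whole triangle

  *bA = area2(px, py, bx, by, cx, cy) / area; // subtriangle PBC
  *bB = area2(ax, ay, px, py, cx, cy) / area; // subtriangle APC
  *bC = 1 - *bA - *bB;                        // the three sum to 1
}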

Winding of a triangle says whether its ordered vertices go clockwise or counterclockwise, i.e. whether, going in the direction A -> B -> C, we'd be circling the triangle center clockwise or counterclockwise. This is important e.g. for backface culling in computer graphics (determining which side of a triangle in 3D we are looking at). The winding of a triangle can be determined from the sign of the z component of the cross product of the triangle's sides. For the lazy: compute w = (y1 - y0) * (x2 - x1) - (x1 - x0) * (y2 - y1); if w > 0 the points go clockwise, if w < 0 the points go counterclockwise, otherwise (w = 0) the points lie on a line.

Sierpinski triangle is a fractal related to triangles.

Testing if a point lies inside a 2D triangle: one way to do this is the following. For each triangle side test whether the winding of the tested point and the side is the same as the winding of the whole triangle -- if this doesn't hold for any side, the point is outside the triangle, otherwise it is inside. In other words for each side we are testing whether the tested point and the remaining triangle point are on the same side (in the same half plane). Here is a C code:

int pointIsInTriangle(int px, int py, int tp[6])
{
  // winding of the whole triangle:
  int w = (tp[3] - tp[1]) * (tp[4] - tp[2]) - (tp[2] - tp[0]) * (tp[5] - tp[3]);
  int sign = w > 0 ? 1 : (w < 0 ? -1 : 0);

  for (int i = 0; i < 3; ++i) // test winding of point with each side
  {
    int i1 = 2 * i;
    int i2 = i1 != 4 ? i1 + 2 : 0;

    int w2 = (tp[i1 + 1] - py) * (tp[i2] - tp[i1]) - (tp[i1] - px) * (tp[i2 + 1] - tp[i1 + 1]);
    int sign2 = w2 > 0 ? 1 : (w2 < 0 ? -1 : 0);
      
    if (sign * sign2 == -1) // includes edges
    //if (sign != sign2) // excludes edges
      return 0;
  }
    
  return 1;
}
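
A quick test of the above function may look like this (the triangle here has corners (0,0), (10,0) and (0,10)):

#include <stdio.h>

int main(void)
{
  int tri[6] = {0, 0, 10, 0, 0, 10}; // x0 y0 x1 y1 x2 y2

  printf("%d\n", pointIsInTriangle(2, 2, tri));   // prints 1 (inside)
  printf("%d\n", pointIsInTriangle(10, 10, tri)); // prints 0 (outside)

  return 0;
}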

trollplay

Trollplay

TODO


trom

TROM

{ WIP, still being researched. Was trying to reach the Tio guy but he didn't respond. ~drummyfish }

TROM (The Reality Of Me) is a project running since 2011 that's educating mainly about the harmfulness of trade-based society and tries to show a better way: that of establishing so called trade-free (in the sense of WITHOUT trade) society. It is similar to e.g. Venus Project (with which TROM collaborated for a while), the Zeitgeist Movement and our own LRS -- it shares the common core of opposing money and trade as a basis of society (which they correctly identify as the core issue), even though it differs in some details that we (LRS) don't see as wholly insignificant. The project website is at https://www.tromsite.com/.

TROM was started and is run by a Romanian guy called Tio (born June 12 1988, according to blog), he has a personal blog at https://www.tiotrom.com and now lives mostly in Spain. The project seems more or less run just by him.

The project is funded through Patreon and has created a very impressive volume of good quality educational materials including books, documentaries, memes, even its own GNU/Linux distro (Tromjaro). The materials TROM creates sometimes educate about general topics (e.g. language), not just those directly related to trade-free society -- Tio says that he simply likes to learn all about the world and then share his knowledge.

The idea behind TROM is very good and similar to LRS, but just as the Venus project it is a bit sus, we align with many core ideas and applaud many things they do, but can't say we 100% support everything about TROM.

Summary

{ WATCH OUT, this is a work in progress sum up based only on my at the moment a rather brief research of the project, I don't yet guarantee this sum up is absolutely correct, that will require me to put in much more time. Read at own risk. ~drummyfish }

good things about TROM:

bad things about TROM:


trump

Donald Trump

"me goin to hav covfefe" --president of the United States

TODO

He is the smartest capitalist of all time, which puts his intellect somewhere between a dolphin and a rat. When written in binary, his IQ goes into triple digits!


trusting_trust

Trusting Trust

In computer security trusting trust refers to the observation (and a type of attack exploiting it) that one cannot trust the technology he didn't create 100% from the ground up; for example even a completely free compiler such as gcc with verifiably non-malicious code, which has been compiled by itself and which is running on 100% free and non-malicious hardware may still contain malicious features if a non-trusted technology was ever involved in the creation of such compiler in the past, because such malicious technology may have inserted a self-replicating malicious code that's hiding and propagating only in the executable binaries. It seemed like this kind of attack was extremely hard to detect and counter, but a method for doing exactly that was presented in 2009 in a PhD thesis called Fully Countering Trusting Trust through Diverse Double-Compiling. The problem was introduced in Ken Thompson's 1984 paper called Reflections on Trusting Trust.

Example: imagine free software has just been invented and there are no free C compilers yet, only a proprietary (potentially malicious) C compiler propC. We decide to write the first ever free C compiler called freeC, in C. freeC code won't contain any malicious features, of course. Once we've written freeC, we have to compile it with something and the only available compiler is the proprietary one, propC. So we have to compile freeC with propC -- doing this, even if freeC source code is completely non-malicious, propC may sneakily insert malicious code (e.g. a backdoor or telemetry) into the freeC binary it generates, and it may also insert into it a self-replicating malicious code that will keep replicating into anything this malicious freeC binary compiles. Then even if we compile freeC with the (infected) freeC binary, the malicious self-replicating feature will stay, no matter how many times we recompile freeC with itself. Keep in mind this principle may be used even on very low levels such as that of assemblers, and it may be extremely difficult to detect.

For a little retarded people: we can perhaps imagine this with robots creating other robots. Let's say we create plans for a completely nice, non-malicious, well behaved servant robot that can replicate itself (create new nicely behaving robots). However someone has to make the first robot -- if we let some potentially evil robot make the first "nice" robot according to our plans, the malicious robot can add a little malicious feature to this otherwise "nice" robot, e.g. that he will spy on its owner, and he can also make this "nice" robot pass the feature on to other robots he makes. So unless we make our first nice robot by hand, it's very hard to know whether our nice robots don't in fact possess malicious features.


turing_machine

Turing Machine

Turing machine is a mathematical model of a computer which works in a quite simple way but has nevertheless the full computational power that's possible to be achieved. Turing machine is one of the most important tools of theoretical computer science as it presents a basic model of computation (i.e. a mathematical system capable of performing general mathematical calculations) for studying computers and algorithms -- in fact it stood at the beginning of theoretical computer science when Alan Turing invented it in 1936 and used it to mathematically prove essential things about computers; for example that their computational power is inevitably limited (see computability) -- he showed that even though Turing machine has the full computational power we can hope for, there exist problems it is incapable of solving (and so will be any other computer equivalent to Turing machine, even human brain). Since then many other so called Turing complete systems (systems with the exact same computational power as a Turing machine) have been invented and discovered, such as lambda calculus or Petri nets, however Turing machine still remains not just relevant, but probably of greatest importance, not only historically, but also because it is similar to physical computers in the way it works.

The advantage of a Turing machine is that it's firstly very simple (it's basically a finite state automaton operating on a memory tape), so it can be mathematically grasped very easily, and secondly it is, unlike many other systems of computation, actually similar to real computers in principle, mainly by its sequential instruction execution and possession of an explicit memory tape it operates on (equivalent to RAM in traditional computers). However note that a pure Turing machine cannot exist in reality because there can never exist a computer with the infinite amount of memory which Turing machine possesses; computers that can physically exist are really equivalent to finite state automata, i.e. the "weakest" kind of systems of computation. However we can see our physical computers as approximations of a Turing machine that in most relevant cases behave the same, so we do tend to theoretically view computers as "Turing machines with limited memory".

Is there anything computationally more powerful than a Turing machine? Well, yes, but it's just kind of "mathematical fantasy". See e.g. oracle machine which adds a special "oracle" device to a Turing machine to make it magically solve undecidable problems.

How It Works

Turing machine has a few different versions (such as one with multiple memory tapes, memory tape unlimited in both directions, one with non-determinism, ones with differently defined halting conditions etc.), which are however equivalent in computing power, so here we will describe just one of the most common variants.

A Turing machine is composed of:

  - a memory tape: an infinite sequence of cells (in this variant starting at cell 0 and unlimited to the right), each holding one symbol of a finite tape alphabet (with a special blank symbol for "empty" cells),
  - a head: points to one cell of the tape at a time, can read and write the symbol there and shift left or right by one cell,
  - a control unit: a finite state automaton whose rules say, given the current state and the symbol under the head, which state to switch to, which symbol to write and which way to shift the head.

The machine halts either when it reaches the end state, when it tries to leave the tape (go left from memory cell 0) or when it encounters a situation for which it has no defined rule.

The computation works like this: the input data we want to process (for example a string we want to reverse) are stored in the memory before we run the machine. Then we run the machine and wait until it halts, then we take what's present in the memory as the machine's output (i.e. for example the reversed string). That is, a Turing machine doesn't have traditional I/O (such as a "printf" function), it works entirely on data in memory!

Let's see a simple example: we will program a Turing machine that takes a binary number on its input and adds 1 to it (for simplicity we suppose a fixed number of bits, so an overflow may happen). Let us therefore suppose a tape alphabet with the symbols 0 and 1 (plus the blank symbol). The control unit will have the following rules:

stateFrom inputSymbol stateTo  outputSymbol headShift
goRight   non-blank   goRight  inputSymbol  right
goRight   blank       add1     blank        left
add1      0           add0     1            left
add1      1           add1     0            left
add0      0           add0     0            left
add0      1           add0     1            left
end       (halting state, no rules needed)

Our start state will be goRight and end will be the end state, though we won't need the end state as our machine will always halt by leaving the tape. The states are made so as to first make the machine step cell by cell to the right until it finds the blank symbol, then step one cell left and switch to the adding mode. Adding works just as we are used to, with potentially carrying 1s over to higher orders etc.

Now let us try inputting the binary number 0101 (5 in decimal) to the machine: this means we will write the number to the tape and run the machine, like so:

goRight
  _V_ ___ ___ ___ ___ ___ ___ ___
 | 0 | 1 | 0 | 1 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
    goRight
  ___ _V_ ___ ___ ___ ___ ___ ___
 | 0 | 1 | 0 | 1 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
        goRight
  ___ ___ _V_ ___ ___ ___ ___ ___
 | 0 | 1 | 0 | 1 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
            goRight
  ___ ___ ___ _V_ ___ ___ ___ ___
 | 0 | 1 | 0 | 1 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
                goRight
  ___ ___ ___ ___ _V_ ___ ___ ___
 | 0 | 1 | 0 | 1 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
              add1
  ___ ___ ___ _V_ ___ ___ ___ ___
 | 0 | 1 | 0 | 1 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
          add1
  ___ ___ _V_ ___ ___ ___ ___ ___
 | 0 | 1 | 0 | 0 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
      add0
  ___ _V_ ___ ___ ___ ___ ___ ___
 | 0 | 1 | 1 | 0 |   |   |   |    ...
 '---'---'---'---'---'---'---'---
  add0
  _V_ ___ ___ ___ ___ ___ ___ ___
 | 0 | 1 | 1 | 0 |   |   |   |    ...
 '---'---'---'---'---'---'---'---

 END

Indeed, we see the number we got at the output is 0110 (6 in decimal, i.e. 5 + 1). Even though this way of programming is very tedious, it actually allows us to program anything that can possibly be programmed, even whole operating systems, neural networks, games such as Doom and so on. Here is C code that simulates the above shown Turing machine with the same input:

#include <stdio.h>

#define CELLS 2048       // ideal Turing machine would have an infinite tape...
#define BLANK 0xff       // blank tape symbol
#define STATE_END   0xff
#define SHIFT_NONE  0
#define SHIFT_LEFT  1
#define SHIFT_RIGHT 2

unsigned int state;            // 0 = start state, 0xff = end state
unsigned int headPosition;
unsigned char tape[CELLS];     // memory tape

unsigned char input[] =        // what to put on the tape at start
  { 0, 1, 0, 1 };

unsigned char rules[] =
{
// state symbol newstate newsymbol shift
   0,    0,     0,       0,        SHIFT_RIGHT, // moving right
   0,    1,     0,       1,        SHIFT_RIGHT, // moving right
   0,    BLANK, 1,       BLANK,    SHIFT_LEFT,  // moved right
   1,    0,     2,       1,        SHIFT_LEFT,  // add 1
   1,    1,     1,       0,        SHIFT_LEFT,  // add 1
   2,    0,     2,       0,        SHIFT_LEFT,  // add 0
   2,    1,     2,       1,        SHIFT_LEFT   // add 0
};

void init(void)
{
  state = 0;
  headPosition = 0;

  for (unsigned int i = 0; i < CELLS; ++i)
    tape[i] = i < sizeof(input) ? input[i] : BLANK;
}

void print(void)
{
  printf("state %d, tape: ",state);

  for (unsigned int i = 0; i < 32; ++i)
    printf("%c%c",tape[i] != BLANK ? '0' + tape[i] : '.',i == headPosition ?
      '<' : ' ');

  putchar('\n');
}

// Returns 1 if running, 0 if halted.
unsigned char step(void)
{
  const unsigned char *rule = rules;

  for (unsigned int i = 0; i < sizeof(rules) / 5; ++i)
  {
    if (rule[0] == state && rule[1] == tape[headPosition]) // rule matches?
    {
      state = rule[2];
      tape[headPosition] = rule[3];

      if (rule[4] == SHIFT_LEFT)
      {
        if (headPosition == 0)
          return 0;  // trying to shift below cell 0
        else
          headPosition--;
      }
      else if (rule[4] == SHIFT_RIGHT)
      {
        if (headPosition >= CELLS - 1)
          return 0;  // ran out of the finite tape
        else
          headPosition++;
      }

      return state != STATE_END;
    }

    rule += 5;
  }

  return 0;
}

int main(void)
{
  init();

  print();

  while (step())
    print();

  puts("halted");

  return 0;
}

And here is the program's output:

state 0, tape: 0<1 0 1 . . . . . . . . . . . . . . . . .
state 0, tape: 0 1<0 1 . . . . . . . . . . . . . . . . .
state 0, tape: 0 1 0<1 . . . . . . . . . . . . . . . . .
state 0, tape: 0 1 0 1<. . . . . . . . . . . . . . . . .
state 0, tape: 0 1 0 1 .<. . . . . . . . . . . . . . . .
state 1, tape: 0 1 0 1<. . . . . . . . . . . . . . . . .
state 1, tape: 0 1 0<0 . . . . . . . . . . . . . . . . .
state 2, tape: 0 1<1 0 . . . . . . . . . . . . . . . . .
state 2, tape: 0<1 1 0 . . . . . . . . . . . . . . . . .
halted

Universal Turing machine is an extremely important type of Turing machine: one that is able to simulate another Turing machine -- we can see it as a Turing machine interpreter of Turing machines. The Turing machine that's to be simulated is encoded into a string (which can then be seen as a programming language -- the format of the string can vary, but it somehow has to encode the rules of the control unit) and this string, along with an input to the simulated machine, is passed to the universal machine which executes it. Notice that the rules array in the C program above is basically such an encoding, so that program is in essence a universal Turing machine (approximated with finite memory). This is important because now we can see Turing machines themselves as programs and we may use Turing machines to analyze other Turing machines, to become self hosted etc. It opens up a huge world of possibilities.

Non-deterministic Turing machine is a modification of Turing machine which removes the limitation of determinism, i.e. which allows for having multiple different "conflicting" rules defined for the same combination of state and input. During execution such machine can conveniently choose which of these rules to follow, or, imagined differently, we may see the machine as executing all possible computations in parallel and then retroactively leaving in place only the most convenient path (e.g. that which was fastest or the one which finished without getting stuck in an infinite loop). Surprisingly a non-deterministic Turing machine is computationally equivalent to a deterministic Turing machine, though of course a non-deterministic machine may be faster (see especially P vs NP).

Turing machines can be used to define computable formal languages. Let's say we want to define language L (which may be anything such as a programming language) -- we may do it by programming a Turing machine that takes on its input a string (a word) and outputs "yes" if that string belongs to the language, or "no" if it doesn't. This is again useful for the theory of decidability/computability.

See Also


twos_complement

Two's Complement

Two's complement is an elegant way of encoding signed (i.e. potentially negative) integer (and possibly also fixed point) numbers in binary. It is one of the most basic concepts to know as a programmer; for its simplicity and nice properties two's complement is the way that's used to represent binary integers almost everywhere nowadays. Other ways of encoding signed numbers, mainly sign-magnitude and one's complement, are basically always inferior.

Why is two's complement so great? Its most notable advantages are:

  - There is only a single representation of zero (no useless "negative zero"), so all bit combinations are put to good use.
  - Addition, subtraction and (the low bits of) multiplication work exactly the same as with unsigned numbers, so the same hardware circuits can handle both.
  - Incrementing and decrementing naturally wraps around between the smallest and greatest representable number.
  - The highest bit directly signals the sign.

TODO: disadvantages?

An N bit number in two's complement can represent numbers from -2^(N-1) to 2^(N-1) - 1 (including both). For example with 8 bits we can represent numbers from -128 to 127.

How does it work? EZ: the highest (leftmost) bit represents the sign: 0 is positive (or zero), 1 is negative. To negate a number negate all its bits and add 1 to the result (with possible overflow). (There is one exception: negating the smallest possible negative number will give the same number as its positive value cannot be represented.)

In other words given N bits, the positive values representable by two's complement with this bit width are the same as in the normal unsigned representation and any representable negative value -x corresponds to the value 2^N - x (e.g. with N = 4 bits the value -2 is stored as 16 - 2 = 14, i.e. 1110).

Example: let's say we have a 4 bit number 0010 (2). It is positive because the leftmost bit is 0 and we know it represents 2 because positive numbers are the same as the straightforward representation. To get number -2, i.e. multiply our number by -1, we negate the number, which gives 1101, and add 1, which gives 1110. We see by the highest 1 bit that this number is negative, as we expected. As an exercise you may try to negate this number back and see we obtain the original number. Let's just now try adding 2; we expect that adding 2 to -2 will give 0. Sure enough, 1110 + 0010 = 0000. Etcetc. :)
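
We may quickly check this in C -- here is a small demonstration (assuming the common 8 bit stdint.h types):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
  uint8_t x = 2;           // 00000010
  uint8_t minusX = ~x + 1; // negate: flip all bits, add 1 -> 11111110

  printf("%d\n",(int8_t) minusX);        // prints -2
  printf("%d\n",(int8_t) (minusX + x));  // -2 + 2: prints 0

  return 0;
}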

The following is a comparison of the different representations, notice the shining superiority of two's complement:

value unsigned two's com. sign-mag. one's com.
000   0        0          0         0
001   1        1          1         1
010   2        2          2         2
011   3        3          3         3
100   4        -4         -0        -3
101   5        -3         -1        -2
110   6        -2         -2        -1
111   7        -1         -3        -0

ubi

Universal Basic Income

Universal basic income (UBI) is the idea that all people should get regular pay from the state without any conditions, i.e. even if they don't work, even if they are rich, criminals etc. It is a great idea and we fully support it as the first step towards the ideal society in which people don't have to work. As automation takes away more and more jobs, it is being discussed more and experiments with UBI are being conducted, even though capitalist idiots rather try to invent more and more bullshit jobs to keep people enslaved.

UBI that itself covers all basic needs is called full, otherwise it is called partial. UBI subscribes to the idea that the goal of progress is to eliminate the need for work, which is correct -- we should leave all work to machines eventually, that's why we started civilization. This doesn't mean we can't work, just that we aren't obliged.

The first reaction of a noob hearing about UBI is "but everyone will just stop working!" Well no, for a number of reasons (which have been confirmed by real life experiments). For example most people don't want to just survive, they want to buy nice things and have something extra, so most people will want to get some additional income. Secondly people do want to work -- work in the sense of doing something meaningful. If they don't have to be wage slaves, most will decide to dedicate their free time to doing something useful. Thirdly people are already used to working, most will keep doing it just out of inertia, e.g. because they have friends at work or simply because they actually happen to like going there.

Another question of the noob is "but who will pay for it?!" Well, we all, and especially the rich. In the current situation even if we made the rich give away 90% of their wealth, they wouldn't even notice.

Of course, UBI works with money and money is bad, however the core idea is simply about sharing resources, i.e. true communism which surpasses the concept of money. That is once money is eliminated, the idea of UBI will stay as the right of everyone to get things such as food and health care unconditionally. LRS supports UBI as the immediate next step towards making money less important and eventually eliminating it altogether.

Advantages of UBI:

Disadvantages of UBI:


ui

User Interface

User interface, or UI, is an interface between human and computer. Such interface interacts with some of the human senses, i.e. it can be visual (text, images), auditory (sound), touch (haptic) etc.

Remember the following inequality:

non-interactive CLI > interactive CLI > TUI > GUI

Some faggots make a living just by designing interfaces without even knowing programming lmao. They call it "user experience" or UX. We call it bullshit.


unary

Unary

Unary generally refers to having "one of a thing". In different contexts it may specifically signify e.g.:

  - the unary numeral system: a base-1 system in which a number N is written simply as N repetitions of a single symbol (e.g. 5 as 11111),
  - a unary function or operator: one taking a single argument/operand (e.g. logical NOT or the minus sign in -x).


unicode

Unicode

".̶̧̦̼̱͙̥̲̝̥̭͍̪̈́̍͌̋͑͒̒̅͂͒̀̊̓̕͠.̴̛̰̯͚͚̳͍̞̯̯͊̒͂̌̃̎̒̐̅͗́̂͠͝.̸̜̀͊̀̒̾̐̔͑̚̕͠a̲̬̪͙̖̬̖ͭͫͦ̀̄̆̍ͦͨͦ͗̅͋ͦͤͯͫ̔̚l̫̹̺̭̳͙̠̦͍̫̝͓͙̟̺͗̊̅ͬ̉͒̏͆͗͒̋ͤ̆̆ͥg̥̳̗͕̫ͪ͛̓̂ͫͮ̔͌̃̈͒̔̏ͭ͋͋ ⃝꙰⃝꙰⃝꙰⃝꙰⃝꙰⃝꙰⃝꙰⃝꙰⃝꙰⃝á́́́́́́́́́́́́́́́́́́́́́́́́́́́́́.̶̢̙̺̖͔̱͎̳̭̖̗̲̻̪̻͑̌͒̊̃̈̾̿̓̅̐́̀̋̔̏.̴̺͖͎͚̠̱̤͂̈́͜.̵̡̡͖̪̱̼͕̘̣̠̮̫͓̯͖̜̚͝͝͝.̷̧̨̥̦̥̱͉̼̗̰̪͍̱͎̑̾Z̳͍̩̻̙̪̲̞̭̦̙ͯ͛̂ͥͣͪͅͅͅl̷̢̛̩̰̹͔̣͗̅̇̍̏͑͐̇̋̑͜ͅǫ̶̢̫̟͙̖̩̽̀͆̽͌͘l̶̩̞̖̹͈͒͊̔̑̆�̸͎̺̳̄͂̊̒�̶̸̵̶̴̸̸̴̶̸̷̶̴̴̡̢̡̢̡̢̧̧̡̧̡̨̡̨̢̧̧̡̢̛̛̛̛̼̻̣̗͔͉̩̪̞͎̖̙͍͚͍̼̰͖̺̤̗̘͕̳̻̖̳̻̗̯̭̙̳̲͕̮͇͕̼͉̞̣̟̖̘̟͕̗̼̙̻͇̝̪̦͚̤̦̣̗̤̪̟̠͖͓̟̬̲͙͇͉̘͙͙͚̜̜̮͈̞͓̰̫͍̙͙͙̱͓͖̠͇̪̭̮̤̺̗̙̘̫̤̥̳͇͔̣̩͕͍̦͈̬̯̗̘͔̻̗̘͔̪̹̬̲͇͕̻͎̣̩̻̖͉̱̝̼̞̪̠̮̤͓̥͊̔̈́̀̋̄̄̇́̋̎͛̓́̔̇̂̒̅͊̎̉͗̓̀͑̋͒͑̍̏̅̋͆̑̈̾͗̽͑̏̉̀͌͋̉̒̋̑̊̂̈́̈́͑̀͂́̈́̆̄̃͆͆̈́̊̿̌̋̍̈̒͂̀̈́͌̽͌̈́͋̈́̃̅͂͆́̍͑̓̎͋̅͂̽̈́̈́͗̆̑̔̎̈́́̆͂̉̀̒͌̿̽͊̍̃̕̚͘̚̕̚͘̕̕̕͘͜͜͝͠͝͝͠͝͝͝͠͝͠͠͠ͅͅͅͅ.̸̷̷̷̴̸̶̵̴̶̵̸̴̴̷̸̷̵̷̵̴̴̷̧̨̢̨̡̨̧̡̨̧̧̨̡̧̢̧̢̧̨̛̛̛̛̛̛̛̤͈̯̤͙̻̫̼̱̦̮̙̤̝̖̗͉̘̫̟̗̹͉͇͖̘͙̻̫̫̫̰̝̭̤͈͓͔̱̭͙͔͔̼̖̬̰̳̗͖͖̯̮͔̝̞̬̳͇͈̥̘͙͇̺̪̞̞̙͈̮͔̞̭͎̩͎̦̞̝͎̗͚͈̖̣͖̹̜̞̤̺̱̱̰͔̼̭̮̰̖͔͔͈̥͎̜̭̪̺̲͔̲̻̰̳̲̖̤̳̙̥̼̩͈̥̗̟͙̥̗̳͍̥̝̫͚̘̱̱̹̺̣̝̳̣͇̹̫̝̫̟̯̺͇̞̳͖̫͔̲̗͔̟̩̦̳͎̳͖̎̓͂̀̀́̌͗̐̅̈́̓̿̓̌́̓́͋͊͛̄͊̂̒͌̀͗̔̀̑̔͒̐̀͌̋̍͗͛̂̆̈́͛͋͆̐̌̓̄͊̑̑̅̑̿̏̈́̀̊̆̈̔̃̽̀̎̐́̎̾͐̀̌̒̑́̇̑̊͑́̓̓̔̆͐́̅̓̔̃̅̂̐͗́̎͌́̊͌͒͒̓́̀͒̍̽̂́̀̉̀̑̉̑̓́͗̓́̍̏̉͆̑͂̔̅̀͊̈́̀͑͛́̿͆͑̀͐̃̋̐̋̈́̉͊̿̌̾͗͛̉́̓̓̏̈́͂̋͌͆̓̑͗͗̍̇̕̚̚͘̕͘̚̚̕͘͘͜͜͜͜͜͜͜͜͠͠͠͝͝͠͝͠͝͠͝ͅͅͅͅͅͅ.̸̷̸̴̸̸̶̶̵̵̸̵̴̡̡̡̡̧̢̢̧̧̧̧̡̢̡̛̛̛̛̬͇̜̘̗̗̲̟̗̤̤̜̹͎̣̹̺͉̯̼̭̟̮̖͕̻̰̬̼̮̮̬̪̥̤̘̣̺̥̪̠̥̳̰͇̫͔̜̫͚͖͔̩̙̪͖̥͍̗͍͉͙̣͔̠̭̞̩̱̠̻̹͎͔̯̻̘͖̦̘͕͉͈͈̞̖̬͔͈̗͓͖͚̤̬̤̘̠̱͆̍̍͆͗͋̇͗̓͐̉͋̈́̀̍̈̇̀̀̎͋̾̇̎͐̌̌̿̽̾̃̑͆̎̾̾̈́̆̐̂̅́̓̔̇̔̑̔͑̓̍͊͌͋̔̐̑͌̓̒̎̍̃͐̀͊̿̓͋̌͐̋̂̽̿̒̋̎́͒̋͘͘͘̕̕͘͝͠͝͝ͅͅa̲̬̪͙̖̬̖ͭͫͦ̀̄̆̍ͦͨͦ͗̅͋ͦͤͯͫ̔̚l̫̹̺̭̳͙̠̦͍̫̝͓͙̟̺͗̊̅ͬ̉͒̏͆͗͒̋ͤ̆̆ͥ𒈙.̴̢̟̩̗͊.̴̹͎̦̘͇͎̩̮̻̾͛̐ͅ𰻞.̷̧̫͙̤̗͇̔̂̀̄͗̍̈͋̈́̕.̷̨̛͈̤͈̲̥̱̹̲͖͗͛͆̓͊̅̈̕͠.̷̻̺͔͍̭͋̾̐̔͑̔̌̂͛͆̽͘͜͠͝͠.̷̧̨͉̝̳̲̫̙̻͎̬͚̒̀̄͒.̶̨͙̩̦̪͋̄͆͌̈́́͐̈̈́̕ͅ.̸̡̠̙̪͔͍̬̘̖̗̙̞̬͇̐͋͊͐̋̚ͅ.̷̢̮̮̖̹̟̖̩̗͙̝̺́̑̈̉͘͘͠ͅ.̴̨̡̧̤̳͖̰̼̺̮͉͖̲̫̳̜̹̄.̵̢̤̦̞͙̝̬͍̞̤͇̽̾̈́̔̋̋̓̌̋̐̓̅͜͝.̷͙͊.̵̠̜̞̭̘͉͓̞̤͍̝̈́̋̃́̈́͐̃̉͆̚͜.̴͉͈͓͈͉͎̺͍͕̥̦̙͙͕̈́̏̿́̏̔.̶͕̟̤͔͑̉̽̓̇̐́̃̿͜.̶̧̨̨̱̪̞̞̯̹̤̘̭̠͓̀̓̐̓́͑͂̉.̴̛̙̮͚̊͗̏̈́͗̅͆̑̂̌̐̃̊̂̓.̴̙͎̔͑̿͗̃̒́̏̏͑͘̕á́́́́́́́́́́́́́́́́́́́́́́́́́́́́́" --creator of 🎮𝕌𝕟ι𝕔𝗼d̢̪̲̬̳̩̟̍ĕ̸͓̼͙͈͐🚀

TODO

In Unicode every character is unique like a unicorn.


universe

Universe

Universe is everything that exists, it is the spacetime and everything in it, all matter and energy as well as all the laws of nature by which it behaves. The size of the whole universe is not known, it is possibly infinite, however the size of the observable universe -- the part of it with which we can ever interact due to the limited speed of light combined with the constant expansion of space -- is about 93 billion light years in diameter, and it contains an estimated 100 billion galaxies, each containing hundreds of billions of stars. Current science says the universe is about 13.7 billion years old and that it started with the Big Bang, a point in time from which everything began to rapidly expand from a single point in space.

Computers can be used to simulate certain parts of the universe, in fact all programs mimic the universe in some kind of simplified way, be it scientific simulations of planet collisions, government databases or games -- they all more or less accurately model the reality. We can also see computers as a way of creating smaller universes, which leads many to think our universe may itself be a simulation running on some computer in a "bigger" universe (note that this probably isn't testable and the debate isn't scientific, but we can lead philosophical discussions about it).


unix

Unix

"Those who don't know Unix are doomed to reinvent it, poorly." --obligatory quote by Henry Spencer

Unix is an old operating system developed since the late 1960s as a research project of Bell Labs, which has become one of the most influential pieces of software in history and whose principles (e.g. the Unix philosophy) live on in many so called Unix-like operating systems such as Linux and BSD (at least to some degree). The original system itself is no longer in use, the name UNIX is nowadays a trademark and a certification. However, as someone once said, Unix is not so much an operating system as a way of thinking.

Unix has reached the highest level a software can reach: it has transcended its implementation and become a de facto standard. This means it has become a set of interface conventions, cultural and philosophical ideas rather than being a single system, it lives on as a concept that has many implementations. This is extremely important as we don't depend on any single Unix implementation but have a great variety of implementations to choose from, between which we can switch without greater issues. This is very important for freedom -- it prevents monopolization -- and it's one of the important reasons to use unix-like systems.

History

In the 1960s, Bell Labs along with other groups were developing Multics, a kind of operating system -- however the project failed and was abandoned for its complexity and the high cost of its development. In 1969 two Multics developers, Ken Thompson and Dennis Ritchie, then started to create their own system, this time with a different philosophy, that of simplicity (see Unix philosophy). They weren't alone in developing the system, a number of other hackers helped program such things as a file system, a shell and simple utility programs. At VCF East 2019 Thompson said that they developed Unix as a working system in three weeks. At this point Unix was written in assembly.

In the early 1970s the system got funding as well as its name Unix (a pun on Multics). By now Thompson and Ritchie were developing a new language for Unix which would eventually become the C language. In version 4 (1973) Unix was rewritten in C.

Unix then started being sold commercially. This led to its fragmentation into different versions such as BSD or Solaris. In 1983 a version called System V was released which would become one of the most successful. The fragmentation and lack of a unified standard gave rise to the so called Unix Wars in the late 1980s, which eventually resulted in a few Unix standards such as POSIX and the Single Unix Specification.

For zoomers and other noobs: Unix wasn't like Windows, it was more like DOS, things were done in text interface -- if you use the command line in "Linux" nowadays, you'll get an idea of what it was like, except it was all even more primitive. Things we take for granted such as a mouse, copy-pastes, interactive text editors, having multiple user accounts or running multiple programs at once were either non-existent or advanced features in the early days. Anything these guys did you have to see as done with stone tools.


unix_philosophy

Unix Philosophy

Unix philosophy is one of the most important and essential approaches to programming which advocates great minimalism and is best known by the saying that a program should only do one thing and do it well. Unix philosophy is a collective wisdom, a set of recommendations evolved during the development of one of the earliest operating systems called Unix, hence the name. Unix philosophy advocates simplicity, clarity, modularity, reusability and composition of larger programs out of smaller programs rather than designing huge monolithic programs as a whole. Unix philosophy, at least partially, lives on in many projects and Unix-like operating systems such as Linux (though Linux is distancing itself from Unix more and more), has been wholly adopted by groups such as suckless and LRS (us), and is even being expanded in such projects as plan9.

In 1978 Douglas McIlroy wrote a short overview of the Unix system (UNIX Time-Sharing System) in which he gives the main points of the system's style; this can be seen as a summary of the Unix philosophy (the following is paraphrased):

  1. Each program should do one thing and do it well. Overcomplicating existing programs isn't good; for new functionality create a new program.
  2. Output of a program should be easy to interpret by another program. In Unix programs are chained by so called pipes in which one program sends its output as an input to another, so a programmer should bear this in mind. Interactive programs should be avoided if possible.
  3. Program so that you can test early, don't be afraid to throw away code and rewrite it from scratch.
  4. Write and use tools, even if they're short-lived, they're better than manual work. Unix-like systems are known for their high scriptability.

This has later been condensed into: do one thing well, write programs to work together, make programs communicate via text streams, a universal interface.

Example: maybe the most common practical example that can be given is piping small command line utility programs; in Unix there exist a number of small programs that do only one thing but do it well, for example the cat program that only displays the content of a file, the grep program that searches for patterns in text etc. In a command line we may use so called pipes to chain some of these simple programs into more complex processing pipelines. Let's say we want to, for example, automatically list all first and second level headings on a given webpage and write them out alphabetically sorted. We can do it with a command such as this one:

curl "https://www.tastyfish.cz/lrs/main.html" | grep "<h[12]>.*</h[12]>" | sed "s/[^>]*> *\([^<]*\) *<.*/\1/g" | sort

Which may output for example:

Are You A Noob?
Some Interesting Topics
Welcome To The Less Retarded Wiki
What Is Less Retarded Software

In the command the pipes (|) chain multiple programs together so that the output of one becomes the input of the next. The first command, curl, downloads the HTML content of the webpage and passes it to the second command, grep, which filters the text and only prints lines with headings; this is passed to sed which removes the HTML code, and the result is passed to sort which sorts the lines alphabetically -- as this is the last command, the result is then printed. This is a fast, powerful and very flexible way of processing data for anyone who knows the Unix tools. Notice the relative simplicity of each command and how each works with text -- the universal communication interface.

Compare this to the opposite Windows philosophy in which combining programs into collaborating units is not intended or is even purposefully prevented, and is therefore very difficult, slow and impractical to do -- such programs are designed for manually performing some predefined actions, e.g. painting pictures with a mouse, but aren't made to collaborate or be automatized, they can rarely be used in unintended, inventive ways needed for hacking.

Watch out! Do not misunderstand Unix philosophy. There are many extremely dangerous cases of misunderstanding Unix philosophy by modern wannabe programmers. One example is the hilarious myth about "React following Unix philosophy" (LMAO this), supposedly the "devs" think that having a billion dependencies or focusing on doing one huge thing (GUI) somehow implies Unix philosophy -- nothing based on JavaScript can ever follow Unix philosophy! Unix philosophy can NOT be built on top of non-unix philosophy technology, and focusing on a very broad goal does not mean doing one thing.

{ One possible practical interpretation of Unix philosophy I came up with is this: there's an upper but also lower limit on complexity. "Do one thing" means the program shouldn't be too complex, we can simplify this to e.g. "Your program shouldn't surpass 10 KLOC". "Do it well" means the program shouldn't be too trivial because then it is hardly doing it well, we could e.g. say "Your program shouldn't be shorter than 10 LOC". E.g. we shouldn't literally make a separate program for printing each ASCII symbol, such programs would be too simple and not doing a thing well. We rather make a cat program, that's neither too complex nor too trivial, which can really print any ASCII symbol. By this point of view Unix philosophy is really about the balance between triviality and huge complexity, but hints that the right balance tends to be much closer to triviality than we humans are tempted to intuitively choose. Without guidance we tend to make programs too complex and so the philosophy exists to remind us to force ourselves to rather minimize our programs to strike the correct balance. ~drummyfish }

See Also


unretard

Unretard

Unretarding means aligning oneself with the less retarded software and society after acquiring necessary education and mindset. This wiki should help achieve this goal.

See Also


update_culture

Update Culture

Update culture is a malicious mindset emerging in a capitalist society which in technology manifests by developers of a (typically bloated) program creating frequent modifications called "updates" and forcing users to keep consuming them, e.g. by deprecating or neglecting old versions, dropping backwards compatibility (e.g. Python) or by downright forcing updates in code. This often manifests by a familiar pop-up message:

"Your software is too old, please update to the latest version."

In software this process is often automatized and known as autoupdates, but update culture encompasses more than this, it's the whole mentality of having to constantly keep up, update one's software, hardware and other products; it is part of fear culture, bullshit and consumerism. Normies get all neurotic when they haven't received their weekly updates that give them new content or a fake sense of "security". The truth is updates break more things than they fix and make software progressively shittier. STOP FUCKING UPDATING EVERYTHING EVERY 3 SECONDS YOU IDIOTS. Good software is written once and works for hundreds of years without maintenance.

A typical example falling under update culture are web browsers or proprietary operating systems that strive for bloat monopoly.

Update culture is however not limited to computers or technology, hell no. It is the mood of the whole society and applies to things such as fashion, business, gossip, watching TV news every day, browsing social media or constantly updating laws, it is the acceptance and approval of living in a constant stress of having to extort extreme amounts of energy just to keep up with artificially made up bullshit, to race against oneself and others in a never ending artificially sustained race with no winners, just with extremely exhausted participants. Current system of law requires constant everyday maintenance that's extremely costly, law needs to be constantly remade and rewritten to reflect any emerging trend in society because it is so unbelievably complex and tries to encompass every single aspect of our society. Of course we eventually oppose any kind of formal law, however the kind of update culture law is yet orders of magnitude worse -- if we see law as a tool to serve society, this kind of law is an utterly shitty tool similar to a hammer that has to be repaired every second just to keep functioning.

Software updates are usually justified by "muh security" and "muh modern features". Users who want to avoid these updates or simply can't install them, e.g. due to using old incompatible hardware or missing dependency packages, are ridiculed as poorfags and idiots and their suffering is ignored. In fact, update culture is cancer because:

  - it keeps software in a perpetually unfinished and broken state (updates break more things than they fix),
  - it forces people to keep consuming, e.g. to buy new hardware just to run the ever growing "updated" software,
  - it makes users dependent on the developer and on constant maintenance,
  - the promised "security" is largely fear culture giving a fake sense of safety.


usa

USA

United States of America (USA, US or just "murika") is a dystopian country of fat, stupid fascist people enslaved by capitalism, people endlessly obsessed with things such as money, wars, shooting their presidents and shooting up their schools. USA consists of 50 states located in North America, a continent that ancestors of Americans invaded and have stolen from Indians, the natives whom Americans mass murdered.

USA is very similar to North Korea: in both countries the people are successfully led to believe their country is the best and both have strong propaganda based on cults of personality, which to outsiders seems very ridiculous but is nevertheless very effective: for example North Korea officially proclaims their supreme leader Jong-il was born atop a sacred mountain and that a new star came to exist in the sky on the day of his birth, while Americans on the other hand believe their fascist president George Washington was divine and PHYSICALLY UNABLE TO TELL A LIE, which was actually taught at their schools. North Korea is ruled by a single political party, US is ruled by two practically identical militant capitalist imperialist parties (democrats and republicans), i.e. de facto one party as well. Both countries are obsessed with weapons and their military, both are highly and openly fascist (nationalist). Both countries are full of extreme propaganda, censorship and hero culture, people worship dictators such as Kim Jong-un or Steve Jobs. US is even worse than North Korea because it exports its toxic culture all over the whole world and constantly invades other countries, it is destroying all other cultures and leads the whole world to doom and destruction of all life, while North Korea basically only destroys itself.

In US mainstream politics there exists no true left, only right and pseudoleft. It is only in extreme underground, out of the sight of most people, where occasionally something good comes into existence as an exception to a general rule that nothing good comes from the US. One of these exceptions is free software (established by Richard Stallman) which was however quickly smothered by the capitalist open source counter movement.

On 6th and 9th August 1945 USA casually killed about 200000 civilians, most of whom were innocent men, women and children, by dropping atomic bombs on the Japanese cities of Hiroshima and Nagasaki. The men who dropped the bombs and ordered the bombing were never put on trial; actually most Americans praise them as heroes and think it was a good thing to do.


used

Used

Used is someone using technology that abuses him, most notably proprietary software. For example those using Windows are not users but useds. This term was popularized by Richard Stallman.


usenet

Usenet

Usenet (User's Network) is an ancient digital discussion network -- a forum -- that existed long before the World Wide Web. At the time it was very popular, it was THE place to be, but nowadays it's been forgotten by the mainstream, sadly hardly anyone remembers it.

Back in the day there were no web browsers, there was no web. Many users were also not connected through Internet as it was expensive, they normally used other networks like UUCP working through phone lines. They could communicate by some forms of electronic mail or by directly connecting to servers and leaving messages for others there -- these servers were called BBSes and were another popular kind of "social network" at the time. Usenet was a bit different as it was decentralized -- it wasn't stored or managed on a single server, but on many independent servers that provided users with access to the network. This access was (and is) mostly paid (to lurk for free you can search for Usenet archives online). To access Usenet a newsreader program was needed, it was kind of a precursor to web browsers (nowadays newsreaders are sometimes built into e.g. email clients). Usenet was lots of time not moderated and anonymous, i.e. kind of free, you could find all kinds of illegal material there.

Usenet invented many things that survive until today such as the words spam and FAQ as well as some basic concepts of how discussion forums even work.

Usenet was originally ASCII only, but people started to post binary files encoded as ASCII and there were dedicated sections just for posting binaries, so you could go piiiiiiiiirating.

It worked like this: there were a number of Usenet servers that all collaborated on keeping a database of articles that users posted (very roughly this is similar to how blockchain works nowadays); the servers would more or less mirror each other's content. These servers were called providers as they also allowed access to Usenet, but this was usually for a fee. The system uses the NNTP (Network News Transfer Protocol) protocol. The articles users posted were also called posts or news, they were in plain text and were similar to email messages. Other users could reply to posts, creating a discussion thread. Every post was also categorized under a certain newsgroup; newsgroups formed a hierarchy (e.g. comp.lang.java). After the so called Big Renaming in 1987 the system eventually settled on 8 top level hierarchies (called the Big 8): comp.* (computers), news.* (news), sci.* (science), rec.* (recreation), soc.* (social), talk.* (talk), misc.* (other) and humanities.* (humanities). There was also another one called alt.* for "controversial" topics (see especially alt.tasteless). According to the Jargon File, by 1996 there were over 10000 different newsgroups.

Usenet was the pre-web web, kind of like an 80s reddit which contained huge amounts of historical information and countless discussions of true computer nerds which are however not easily accessible anymore as there aren't so many archives, they aren't well indexed and direct Usenet access is normally paid. It's a shame. It is possible to find e.g. initial reactions to the AIDS disease, people asking what the Internet was, people discussing future technologies, the German cannibal (Meiwes) looking for someone to eat (which he eventually did), Bezos looking for Amazon programmers, a heated debate between Linus Torvalds and Andrew Tanenbaum about the best OS architecture (the "Linux is obsolete" discussion) or Douglas Adams talking to his fans. There were memes and characters like BIFF, a kind of hilarious noob wannabe cool personality. Some users became kind of famous, e.g. Scott Abraham who was banned from Usenet by court after an extremely long flame war, Alexander Abian, a mathematician who argued for blowing up the Moon (which according to his theory would solve all Earth's issues), Archimedes Plutonium who suggested the Universe was really a big plutonium atom (he later literally changed his name to Plutonium lol) or John Titor (pretend time traveler). There are also some politically incorrect groups like alt.niggers lol.

{ I mean I don't remember it either, I'm not that old, I've just been digging on the Internet and in the archives, and I find it all fascinating. ~drummyfish }

Where To Freely Browse Usenet

Search for Usenet archives, I've found some sites dedicated to this, also Internet archive has some newsgroups archived. Google has Usenet archives on a website called "Google groups" (now sadly requires login). There is a nice archive at https://www.usenetarchives.com. Possibly guys from Archive Team can help (https://wiki.archiveteam.org/index.php/Usenet, https://archive.org/details/usenet, ...). See also http://www.eternal-september.org.

See Also


vector

Vector

Vector is a basic mathematical object that expresses direction and magnitude (such as velocity, force etc.) and is usually expressed as an "array of numbers". For example in two dimensional space an array [4,3] expresses a vector pointing 4 units to the "right" (along X axis) and 3 units "up" (along Y axis) and has the magnitude 5 (which is the vector's length). Vectors are one of the very basic concepts of advanced math and are used almost in any advanced area of math, physics, programming etc. -- basically all of physics and engineering operates with vectors, programmers will mostly encounter them in areas such as 3D graphics, physics engines (forces, velocities, acceleration, ...), machine learning (feature vectors, ...) or signal processing (e.g. Fourier transform just interprets a signal as a vector and transforms it to a different basis) etc. In this article we will implicitly focus on vectors from programmer's point of view (i.e. "arrays of numbers"), which to a mathematician will seem very simplified, but we'll briefly also foreshadow the mathematical view.

Just like in elemental mathematics we deal with "simple" numbers such as 10, -2/3 or pi -- we retrospectively call such "simple" numbers scalars -- advanced mathematics generalizes the concept of such a number into vectors ("arrays of numbers", e.g. [1,0,-3/5] or [0.5,0.5]) and yet further to matrices ("two dimensional arrays of numbers"), and it gathers the ways of dealing with such generalizations into linear algebra, i.e. we have ways to add and multiply vectors and matrices and solve equations with them, just like we did in elemental algebra (of course, linear algebra is a bit more complex as it mixes together scalars, vectors and matrices). In yet more advanced mathematics the concepts of vectors and matrices are further generalized to tensors which may also be seen as "N dimensional arrays of numbers" but which further add new rules and interpretations of such "arrays" -- a vector can therefore also be seen as a tensor (of rank 1) -- note that in this context there is e.g. a fundamental distinction between row and column vectors. Keep in mind that vectors, matrices and tensors aren't the only possible generalization of numbers, another one is e.g. that of complex numbers, quaternions, p-adic numbers etc. Anyway, in this article we won't be discussing tensors or any of the more advanced concepts further, they are pretty non-trivial and mostly beyond the scope of a mere programmer's needs :) We'll keep it at the linear algebra level.

Vector is not merely a coordinate, though the traditional representation suggests such an interpretation and programmers often use vector data types to store coordinates out of convenience (e.g. in 3D graphics engines vectors are used to specify coordinates of 3D objects); a vector should properly be seen as a direction and magnitude which has no position, i.e. the correct way to imagine a vector is something like an arrow -- for example if a vector represents velocity of an object, the direction (where the arrow points) says in which direction the object is moving and the magnitude (the arrow length) says how fast it is moving (its speed), but it doesn't say the position of the object (the arrow itself records no position, it just "hangs in thin air").

Watch out, mathematicians dislike defining vectors as arrays of numbers because vectors are essentially NOT arrays of numbers, such arrays are just one way to express them. Similarly we don't have to interpret any array of numbers as a vector, just as we don't have to interpret any string of letters as a word in human language. A vector is simply a direction and magnitude, an "arrow in space" of N dimensions; a natural way of expressing such an arrow is through multiples of basis vectors (so called components), BUT the specific numbers (components) depend on the choice of basis vectors, i.e. the SAME vector may be written as an array of different numbers (components) in a different basis, just as the same concept of a dog is expressed by different words in different languages. Even with the same basis vectors the numbers (components) depend on the method of measurement -- instead of expressing the vector as a linear combination of the N basis vectors we may express it as N dot products with the basis vectors -- the numbers (components) will be different, but the expressed vector will be the same. Mathematicians usually define vectors abstractly simply as members of a vector space, which is a set of elements (vectors) along with operations of addition and multiplication which satisfy certain given rules (axioms).

How Vectors Work

Here we'll explain the basics of vectors from programmer's point of view, i.e. the traditional "array of numbers" simplification (expressing a linear combination of basis vectors).

Given an N dimensional space, a vector to us will be an array of real numbers (in programming floats, fixed point or even just integers) of length N, i.e. the array will have N components (2 for 2D, 3 for 3D etc.).

For example suppose 2 vectors in a 2 dimensional space, u = [7,6] and v = [2,-3.5]. To visualize them we may simply plot them:

   6 |            _,
   5 |          __/| u
   4 |       __/
   3 |    __/
   2 | __/
___1_|/._._._._._._._
  -1 |\1 2 3 4 5 6 7
  -2 | \
  -3 | _\| 
  -4 |  "" v
  -5 |   

NOTE: while for normal (scalar) variables we use letters such as x, y and z, for vector variables we usually use letters u, v and w and also put a small arrow above them as:

->         ->
u = [7,6], v = [2,-3.5]

The vector's components are referred to by the vector's symbol and subscript or, in programming, with a dot or square brackets (like with array indexing), i.e. u.x = 7, u.y = 6, v.x = 2 and v.y = -3.5. In programming data types for vectors are usually called vecN or vN where N is the number of dimensions (i.e. vec2, vec3 etc.).

Also note that we'll be writing vectors "horizontally" just as shown, which means we're using so called row vectors; you may also see usage of so called column vectors as:

->   |7|   ->  |  2 |
u  = |6|,  v = |-3.5|

Now notice that we do NOT plot the vectors as points, but as arrows starting at the origin (point [0,0]) -- this is again because we don't normally interpret vectors as a position but rather as a direction with magnitude. The direction is apparent from the picture (u points kind of top-right and v bottom-right) and can be exactly computed with the arctangent (i.e. angle of u = atan(6/7) = 40.6 degrees, ...), while the magnitude (also length or norm) is given by the vector's Euclidean length and is denoted by the vector's symbol in || brackets (in programming the function for getting the length may be called something like len, size or abs, even though absolute value is not really a mathematically correct term here), i.e.:

  ->
||u|| = sqrt(7^2 + 6^2) ~= 9.22
 
  ->
||v|| = sqrt(2^2 + (-3.5)^2) ~= 4.03

In fact we may choose to represent the vectors in a format that just directly says the angle with the X axis (i.e. direction) and the magnitude, i.e. u could be written as {40.6 degrees, 9.22...} and v as {-60.3 degrees, 4.03...} (see also polar coordinates). This represents the same vectors, though we don't do this so often in programming.

A vector whose magnitude is exactly 1 is called a unit vector -- such vectors are useful in situations in which we only care about direction and not magnitude.

The vectors u and v may e.g. represent a velocity of cars in a 2D top-down racing game -- the vectors themselves will be used to update each car's position during one game frame and their magnitudes may be displayed as the current speed of each car to their drivers. However keep in mind the vector magnitude may also represent other things, e.g. in a 3D engine a vector may be used to represent camera's orientation and its magnitude may specify e.g. its field of view, or in a physics engine a vector may be used to represent a rotation (the direction specifies the axis of rotation, the magnitude specifies the angle of rotation).

But why not just use simple numbers? A velocity of a car could just as well be represented by two variables like carVelocityX and carVelocityY, why all this fuss with defining vectors n shit? Well, this simply creates an abstraction that fits many things we deal with and generalizes well, just like we e.g. define the concept of a sphere and a cylinder even though fundamentally these are just sets of points. If for example we suddenly want to make a 3D game out of our racing game, we simply start using the vec3 data type instead of the vec2 data type and most of our equations, such as the one for computing speed, will stay the same. This becomes more apparent once we start dealing with more complex math, e.g. that related to physics where we have many forces, velocities, momenta etc. It becomes even more apparent when we start to look into operations with vectors, which are really what makes vectors vectors.

Some of said operations with vectors include:

  - addition and subtraction: performed component-wise, e.g. u + v = [7 + 2, 6 + (-3.5)] = [9,2.5]; geometrically addition "chains" the arrows one after another, as the picture below shows,
  - multiplication by a scalar: multiplies each component, scaling the vector's magnitude while keeping its direction,
  - dot product: u . v = u.x * v.x + u.y * v.y, yields a scalar that says how much the two directions "agree" (and is used e.g. for computing angles),
  - normalization: dividing a vector by its own length so as to get a unit vector.

The following picture shows the addition u + v:

   7 | 
   6 |            _,
   5 |          __/|\u
   4 |       __/     \
   3 |    __/         \ u + v
   2 | __/     __..--'/'
___1_|/_...--''____/____
  -1 |\1 2 3 4__/6 7 8 9
  -2 | \   __/
  -3 | _\|/
  -4 |  "" v
  -5 |   

Code Example

TODO
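
For a start, the following is a minimal sketch of a simple 2D vector type in C implementing a few of the operations mentioned above (names such as vec2Add are of course just our choice here):

#include <stdio.h>
#include <math.h>

typedef struct
{
  float x;
  float y;
} vec2;

vec2 vec2Add(vec2 a, vec2 b)   // vector addition, component-wise
{
  vec2 result;

  result.x = a.x + b.x;
  result.y = a.y + b.y;

  return result;
}

float vec2Dot(vec2 a, vec2 b)  // dot product
{
  return a.x * b.x + a.y * b.y;
}

float vec2Len(vec2 v)          // magnitude (Euclidean length)
{
  return sqrt(v.x * v.x + v.y * v.y);
}

int main(void)
{
  vec2 u = {7, 6}, v = {2, -3.5};
  vec2 sum = vec2Add(u,v);

  printf("u + v = [%g,%g]\n",sum.x,sum.y);                  // [9,2.5]
  printf("||u|| = %g, ||v|| = %g\n",vec2Len(u),vec2Len(v)); // ~9.22, ~4.03

  return 0;
}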


venus_project

The Venus Project

The Venus Project is a big project established by Jacque Fresco, already running for decades, aiming for a voluntary and rational transition towards an ideal, highly technological and automated society without money, scarcity, need for human work, social competition, wars and violence, a society in which people would have abundance thanks to so called resource based economy, where they would be collaborating, loving, respecting the nature, caring for others and free to pursue their true potential. It is similar to the Zeitgeist Movement and TROM. In its views, goals and means the Venus Project is extremely close to LRS and we highly support it, however watch out: while the ideas behind the project are good, the project itself is a bit sus and may show internal corruption just as the FSF, TROM and other projects -- use your brain, follow ideas, not people.

There is a non-profit organization called One Community that tries to pursue goals set by the Venus Project and strive for what they call Highest Good. Its website is at https://www.onecommunityglobal.org.

Overview

{ The following is based mainly on my understanding of what I've read in Fresco's book The Best That Money Can't Buy. I recommend the book for an overall overview of the project. ~drummyfish }

The project is a child of Jacque Fresco, a generalist futurist who sadly died in 2017 and who had worked on it for many decades with his life partner Roxanne Meadows. It has a center located in Florida and is a nonprofit organization that performs research and education and builds prototype technology according to their ideas of a future we should strive for.

Although the project seems to be avoiding specific political labels (possibly as to avoid historical associations), it is de facto anarcho pacifist communist movement (i.e. politically the same as LRS). Very nicely it also seems, at least as of 2022, uninfected with the SJW cancer -- fight culture and fascism goes directly against their goals and Fresco explicitly stated that we have to stop constantly fighting for human rights and rather establish a society with human rights built-in.

Fresco highly criticizes today's society and just as us says it only tries to cure the symptoms (search the solutions within the current framework and mindset) rather than the root cause of its issues (the system itself). He mainly criticizes the presence of the monetary system and laws -- currently taking the form of capitalism -- which he correctly blames for most today's issues such as artificial scarcity, hunger, wars, fascism, lack of social security, poverty, wage slavery, destruction of natural environment, waste, energy crisis, planned obsolescence, deteriorating psychological health etc. He says with the presence of advanced technology we have this system is highly outdated (for example it forces artificial scarcity because only scarce resources can be sold, unlike for example air) and points out the fact that we have more than enough resources for everyone on Earth and could live in abundance and peace, practicing collaboration rather than competition. Therefore he argues that we have to eliminate money, barter and markets from the society and change the very basis of whole society, down to our mentality and outdated historical associations (and eventually even language which should be closer to the scientific language).

He argues to replace the monetary system with so called resource based economy (RBE) which should be a pillar of the future society. RBE is called an "economy" but doesn't use any money or barter, it starts by declaring that all Earth's resources are a common heritage of all people on Earth -- it basically means that "everything is available to everyone", i.e. no one can own resources of the Earth, they belong to us all and whoever needs something can take it. RBE would be supported by high automatization and computer monitoring to deliver resources where they are needed. Normies usually can't comprehend that this could work, they say "but then someone will just steal everything", but Fresco correctly argues that with correct and rational use of our technology we can, unlike in the past, already extract as many resources as to satisfy everyone with high abundance; basically we can make for example food as abundant as is air nowadays -- no one will be (and can be) stealing food when there's more free food than anyone can eat, just as stealing air isn't a concern nowadays.

This should therefore also eliminate the need for complex laws -- when no one is stealing, we don't need laws against stealing etc. Elimination of money and laws will remove the need for bullshit jobs such as lawyers, judges, politicians, marketing guys, bankers etc., freeing more people and getting rid of a lot of unnecessary work and burden of society.

Fresco supports this by the fact that human behavior is determined by the environment and upbringing -- nowadays we have criminality mostly because firstly people are poor, i.e. pushed into illegal activity, and secondly nurtured by the competitive propaganda that teaches them, right from little children, to fight and compete with others. In a caring society that provides all their needs and raises them in the spirit of collaboration and love towards others criminals will be almost non existent, there will simply be no gain from it.

The project further seeks to eliminate the need for human work: all work, including complex decision making, would be automated. Bullshit jobs would be removed and maintenance reduced to minimum. People would be free to pursue their true interests and could fully and freely devote themselves to it. Again normies usually say something like "BUT THEN NOBODY AIN'T GONNA WORK". Well, firstly that wouldn't be an issue since no human work would even be needed anymore, and secondly Fresco correctly answers by saying that competition and force isn't the only drive of human activity, people are motivated for work and creative activity by other phenomena such as curiosity, sense of accomplishment, boredom, moral values etc., and usually even perform better than when forced to it. Maslow's Hierarchy of Needs is a well known psychological model that says that once basic needs such as food and shelter are satisfied (which RBE will accomplish), people start voluntarily pursuing higher needs such as art, science and other creative work. People did work and create long before money and jobs existed. The idea of reducing or eliminating human work is already being considered nowadays in the form of universal basic income -- experiments have been confirming that it works.

Venus Project stresses the importance of science -- their approach is strictly scientific -- of technology (such as AI and sensors all over the Earth) and of rationality, and argues for application of technology to everything as that, in their opinion, will allow RBE and remove the need for human government. There should be no human governing the society, decisions will be made mostly by machines in a network of decentralized cities all over the Earth ("technophobes" are reminded that even nowadays we put our lives into the hands of machines, e.g. in planes or with pacemakers, and they do a better job than humans). Technology should be sustainable, respect the nature and be aligned with it, i.e. not fight against it but rather direct its forces towards good causes. Protection of the environment and integration of natural elements into cities is stressed. Only clean and safe energy would be used. Earth's carrying capacity should be respected, i.e. people would avoid overpopulation by voluntary birth control.

The project is for absolute freedom of information -- there would be no intellectual property (copyright, patents, ...), no trade secrets, state secrets and probably also no personal secrets, as in a non-competitive society there wouldn't be a danger of abusing personal information. They argue that despite computer sensors being present everywhere, there would simply be no need for surveillance of people as there would be no corporations, no criminality etc.

It also opposes nationalism, racism and other forms of privilege and inequality. However this shouldn't be forced in the SJW style, it should rather come naturally thanks to fixing the root cause of these issues (removing competition, governments, money etc.).

Education would play a huge role in the society -- again, it wouldn't be forced on children, they should want to go to school because education would be fun and give them freedom to pursue their interests. There would be no grades and school would teach high scientific and critical thinking, rational discussion, nonviolent resolution of conflicts, collaboration through group projects and collaborative games, love towards nature (e.g. by projects involving growing plants) etc. Generalism would be preferred to hyperspecialization (which we see nowadays).

Fresco also addresses the fear of some people of people becoming too uniform and losing individuality -- he stresses that individuality would be focused on, uniformity would only lie in common goals and caring for all humans and nature. Unlike in current society, each human would have the freedom to pursue his true interests and goals.

The project claims it has years of research and seems to have a great number of specific ideas for what the technology might look like, how we would harness energy, travel etc. There are many 3D visualizations. Fresco claims that in the new society everyone would have a higher living standard than the rich have nowadays.

The transition towards this society should be peaceful and evolutionary, NOT revolutionary. It has to be voluntary and rational. The initial stage -- building the first center with the project ideas in mind -- has already been completed in Florida. Raising funds and educating the public should continue, then more cities in the spirit of the project should start to appear, interconnect and prove the ideas in practice. Then slowly the new cities and ideas would start to replace our current system.

Comparison With LRS

We, LRS, highly support and agree with the Venus Project as an idea, in its analysis of current society, goals and means of achieving it. At least as of 2022, we can't know if any single project will become corrupt in the future (e.g. with SJWs). We may still disagree on some details, focus a bit more on different areas etc. Here are a few points about that.

Venus Project seems to only focus on humans, unlike LRS which is based on the love of all life, i.e. also animals, possibly even alien life etc. Venus Project mentions that in the future there would possibly be fish farms -- for us this seems unacceptable as we advocate vegetarianism, even the lives of fish are precious to us. In a highly advanced society artificial meat (which we accept) would probably be available and replace meat from any living animals so we would eventually align with Venus Project, but the human-centeredness of Venus Project is still there.

It may seem we also focus on simplicity of technology (e.g. sucklessness) while Venus Project seems to advocate bloat and overapplication of technology. This may not be such an issue because a truly good technology that Venus Project advocates should converge towards simplicity naturally thanks to minimizing maintenance, maximizing safety (minimizing dependencies), removal of bullshit features etc. In other words even hi-tech advocated by Venus Project can be done in a suckless way, for example the automation would work on top of Unix operating systems. Still the future from LRS point of view may look less hi-tech, we might prefer simple buttons to voice recognition and so on :-)

And a bit more criticism: the project doesn't seem to practice free culture and free software, even though of course it would implement them in their society -- it kind of makes sense as they seem to be trying to stand above current movements, they simply think we should focus beyond them. We might disagree and say that even looking into the far future we should still keep an eye on the now, education about free culture can greatly contribute to education about the advantages of information freedom etc. Furthermore they are selling some videos on their site, which we don't really like but the project justifies it as raising funds for their operation. To their credit they have many gratis videos and educational material, even the books can be found as "free download". Another criticism concerns the materials themselves, which are sometimes a bit unprofessional, which is a shame (e.g. the book has many typos and is not so readable). Also there seems to be a bit of personality cult around Jacque and Roxanne, their faces are all over the place and even though they seem like really great people and even though it may simply be due to the lack of other "strong personalities", this makes the movement look like a religious cult to some critics. Tio, the guy behind TROM who collaborated with Venus Project, also expressed slight criticism of the project's organization, that they were too concerned about control over their materials and that he even met a few toxic people there, though he says the experience was still mostly positive. We have to keep in mind that people, teams and projects are imperfect, they can become spoiled and fail, however this changes nothing about the ideas the project presents, which we support. As always, we have to separate ideas and people -- the situation here is perhaps similar to free software as an idea, which we fully support, vs free software foundation as a project and team of people, which has a few issues.

History

TODO

See Also


version_numbering

Version Numbering

TODO

LRS Version Numbering

At LRS we suggest the following version numbering (invented/employed by drummyfish): { OFC I don't know if anyone else is already doing this, I'm not claiming any history firsts, just that this is what I independently started doing in my projects. ~drummyfish }

This is a simple system that tries to not be dependent on having a version control system such as git, i.e. this system works without being able to make different development branches and can be comfortably used even if you're just developing a project in a local directory without any fancy tools. Of course you can use a VCS, this system will just make you depend less on it so that you can make easier transitions to a different VCS or for example drop a VCS altogether and just develop your projects in a directory with FTP downloads.

Version string is of format major.minor with optional suffix letter d, e.g. 0.35, 1.0 or 2.1d.

Major and minor numbers have the usual meaning. Major number signifies the great epochs of development and overall project state -- 0 means the project is in a state before the first stable, highly usable, optimized, tested and refactored release that would have all initially set goals and features implemented. 1 means the project has already reached such state. Any further increment signifies a similarly significant change in the project (API overhaul, complete rewrite, extreme number of new features etc.) that has already been incorporated with all the testing, optimization, refactoring etc. At the start of each major version number the minor version number is set to 0 and within the same major version number a higher minor number signifies smaller performed changes (bug fixes, smaller optimizations, added/removed individual features etc.). Minor number doesn't have to be incremented by 1, the value may try to intuitively reflect the significance of the implemented changes (e.g. versions 0.2 and 0.201 differ only very slightly whereas versions 0.2 and 0.5 differ significantly).

From the user's perspective a greater major.minor number signifies a newer version of the project and the user can be sure that versions with the same major.minor number are the same.

The main difference against traditional version numberings is the optional d suffix. This suffix added to version number X signifies an in-development (non-release) branch based on version X. If a version has the d suffix, it doesn't have to change major and minor numbers with implemented changes, i.e. there may be several different versions of the project whose version number is for example 0.63d. I.e. with the d suffix it no longer holds that versions with the same number are necessarily the same. This allows the developer to not bother about incrementing version number with every single change (commit). The developer simply takes the latest release version (the one without d suffix), adds the d suffix, then keeps modifying this version without caring about changing the version number with each change, and when the changes are ready for release, he removes the d suffix and increases the version number. The user may choose to only use the release (stable, tested, ...) version without the suffix or he can take the risk of using the latest development version (the one with the d suffix).

With this a project can be comfortably developed in a single git branch without managing separate stable and development branches. The head of the branch usually has the latest in-development version (with the d suffix) but you may just link to previous commits of release versions (without the d suffix) so that users can download the release versions. You can even not use any VCS and just develop your project in a directory and make hard backups of each release version.

Here is an example of version numbering a project with the LRS system:

version description release?
0.0 just started YES
0.0d added license, Makefile, ... no
0.0d added some code no
0.0d added more code no
0.01 it already does something! YES
0.01d added more features no
0.01d fixed some bugs no
0.01d refactored no
0.2 it is already a bit usable! YES
0.2d added more features no
... ... ...
0.9d fixed bugs no
0.9d refactored no
1.0 nice, stable, tested, usable! YES
1.0d added a small optimization no
... ... ...
2.0 complete rewrite for perform. YES
2.0d added a few features no
2.0d fixed a bug no
2.01 a few nice improvements YES
... ... ...
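
For illustration, here is a minimal C sketch of how such version strings could be parsed and compared -- note this is just an example made up for this article (no official tooling exists, the function name is invented). Since the minor number behaves like a decimal fraction (0.201 lies between 0.2 and 0.21), we can simply read the numeric part as a real number:

#include <stdio.h>
#include <stdlib.h>

/* Parses an LRS version string such as "0.35", "1.0" or "2.1d".
   Returns its numeric value (usable for ordering versions) and sets
   *dev to 1 if the in-development "d" suffix is present, else to 0. */
double parseVersion(const char *s, int *dev)
{
  char *end;
  double value = strtod(s, &end); /* read the major.minor part */

  *dev = (*end == 'd');

  return value;
}

int main(void)
{
  int devA, devB;

  double a = parseVersion("0.201d", &devA),
         b = parseVersion("0.21", &devB);

  printf("0.201d is a %s version\n", devA ? "development" : "release");
  printf("0.21 is %s than 0.201\n", b > a ? "newer" : "older");

  return 0;
}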

vim

Vim

{ This is WIP, I use Vim but am not such guru really so there may appear some errors, I know this topic is pretty religious so don't eat me. ~drummyfish }

Vim (Vi Improved) is a legendary free as in freedom, fairly (though not hardcore) minimalist and suckless terminal-only (no GUI) text editor for skilled programmers and hackers, and one of the best editors you can choose for text editing and programming. It is a successor of a much simpler editor vi that was made in 1976 and which has become a standard text editor installed on every Unix system. Vim added features like tabs, syntax highlight, scriptability, screen splitting, unicode support, sessions and plugins and as such has become not just a simple text editor but an editor that can comfortably be used for programming instead of any bloated IDE. Observing a skilled Vim user edit text is really like watching a magician or a literal movie hacker -- the editing is extremely fast, without any use of mouse, it transcends mere text editing and for some becomes something akin to a way of life.

Vim is generally known to be "difficult to learn" -- not because it is inherently difficult but rather because it is very different from other editors -- it has no GUI (even though it's still a screen-oriented interactive TUI), it is keyboard-only and is operated via text commands rather than with a mouse, it's also preferable to not even use arrow keys but rather hjkl keys. There is even a meme that says Vim is so difficult that just exiting it is a non-trivial task. People not acquainted with Vim aren't able to do it and if they accidentally open Vim they have to either Google how to close it or force kill the terminal xD Of course it's not so difficult to do, it's a little bit different than in other software -- you have to press escape, then type :q and press enter (although depending on the situation this may not work, e.g. if you have multiple documents open and want to exit without saving you have to type :qa! etc.). The (sad) fact is that most coding monkeys and "professional programmers" nowadays choose some ugly bloated IDE as their most important tool rather than investing two days into learning Vim, probably the best editor.

Why use Vim? Well, simply because it is (relatively) suckless, universal and extremely good for editing any text and for any kind of programming, for many it settles the search for an editor -- once you learn it you'll find it is flexible, powerful, comfortable, modifiable, lightweight... it has everything you need. Anyone who has ever invested the time to learn Vim will almost certainly tell you it was one of the best decisions he made and that guy probably only uses Vim for everything now. Many people even get used to it so much they download mods that e.g. add Vim commands and shortcuts to programs like web browsers. A great advantage is that vi is installed on every Unix as it is a standard utility, so if you know Vim, you can just comfortably use any Unix-like system just from the command line: when you ssh into a server you can simply edit files without setting up any remote GUI or whatever. Therefore Vim is automatically a must learn skill for any sysadmin. A huge number of people also use Vim for "productivity" -- even though we don't fancy the productivity cult and the bottleneck of programming speed usually isn't the speed of typing, it is true that Vim makes you edit text extremely fast (you don't move your hands between mouse and keyboard, you don't even need to touch the arrow keys, the commands and shortcuts make editing very efficient). Some nubs think you "need" a huge IDE to make big programs, that's just plain wrong, you can do anything in Vim that you can do in any other IDE, it's as good for editing tiny files as for managing a huge codebase.

Vim's biggest rival is Emacs, a similar editor which is however more complex and bloated (it is joked that Emacs is really an operating system) -- Vim is more suckless, yet not less powerful, and so it is naturally the choice of the suckless community and also ours. Vim and Emacs are a subject of a holy war for the best editor yet developed; the Emacs side calls itself the Church of Emacs, led by Richard Stallman (who created Emacs) while the Vi supporters are called members of the Cult of Vi (vi vi vi = 666).

It has to be noted that Vim as a program is still kind of bloated, large part of the suckless community acknowledges this (cat-v lists Vim as harmful, recommends Acme, Sam or ed instead). Nonetheless the important thing is that Vim is a good de facto standard -- Vim's interface and philosophy are what matter the most, there are alternatives you can comfortably switch to. The situation is similar to for example "Unix as a concept", i.e. its interface, philosophy and culture, which together create a certain standardization that allows for different implementations that can be switched without much trouble. In the suckless community Vim has a similar status to C, Linux or X11 -- it is not ideal, by the strict standards it is a little bit bloated, however it is one of the best existing solutions and makes up for its shortcomings by being a stable, well established de-facto standard.

How To

These are some Vim basics for getting started. There are two important editing modes in Vim:

  1. Normal (command) mode: the mode Vim starts in, in which keys don't insert text but rather perform commands (you return to this mode by pressing escape).
  2. Insert mode: the mode in which pressed keys insert text as in a normal editor (you enter it from normal mode e.g. by pressing i).

Some important commands in command mode are: i (enter insert mode at cursor), :w (save), :q (quit), :q! (quit without saving), x (delete character), dd (delete/cut line), yy (copy line), p (paste), u (undo), ctrl+r (redo) and /pattern followed by enter (search; n jumps to the next match).

Vim can be configured with a file named .vimrc in the home directory, containing a set of commands that will automatically be run on start. An example of a simple config file follows:

set number " set line numbering
set et     " expand tabs to spaces
set sw=2   " set indentation (shift) width to 2
set hlsearch        " highlight search results
set nowrap          " no line wrap
set colorcolumn=80  " highlight 80th column
set list            " show invisible characters
set listchars=tab:>.           " display tabs as >.
set backspace=indent,eol,start " make backspace behave as usual
syntax on           " turn on syntax highlighting

Alternatives

See also a nice big list at http://texteditors.org/cgi-bin/wiki.pl?ViFamily.

Of course there are alternatives to Vim that are based on different paradigms, such as Emacs, its biggest rival, or plan9 editors such as Acme. In this regard any text editor is a potential alternative. Nevertheless people looking for Vim alternatives are usually looking for other vi-like editors, for example neovim (a big modernized fork of Vim), nvi, elvis, vile, busybox vi or vis.


viznut

Viznut

Viznut (real name Ville-Matias Heikkilä) is a Finnish demoscene programmer, hacker and artist who advocates high technological minimalism. He is known for example for his countercomplex blog, co-discovering bytebeat and creating IBNIZ. In his own words, he believes much more can be done with much less. He also warns of collapse (http://viznut.fi/en/future.html). According to his Fediverse page he lives in Turku, Finland, was born around 1977 and has been programming since the age of seven.

His work is pretty based, in many ways aligned with LRS, he contributed a great deal to minimalist technology. Unfortunately in some ways he also seems pretty retarded: he uses facebook, twitter and github and also mentions "personal pronouns" on his twitter xD Pretty disappointing TBH. This would make Viznut a type A fail.

His personal site is at http://viznut.fi/en/ and his blog at http://countercomplex.blogspot.com/. He collects many files at http://viznut.fi/files/, including very interesting writings about demoscene, programming experiments etc.

In 2011 he released IBNIZ, a tiny SDL virtual machine and a language that's meant as a platform for creating demos. Also in 2011 he was involved in the discovery of bytebeat, a way of creating music with extremely simple C expressions -- later he published a paper about it. In 2012 he founded Skrolli, a magazine about sustainable/non-consumerist technology. In about 2019 he released PC-lamerit, an animated series about 90s computer hackers, made as an executable computer program -- it looks pretty cool. He also created UNSCII, a fixed-width font usable for ANSI art.


watchdog

Watchdog

Watchdog is a special timer that serves as a safety mechanism for detecting malfunction of computer programs at run time by requiring programs to periodically reset the timer.

Basically the watchdog keeps counting up and a correctly behaving program is supposed to periodically reset this count ("kick" or "feed the dog") -- if the reset doesn't happen for a longer period, the watchdog counts up to a high value and alerts that something's wrong ("the dog starts barking"), e.g. with an interrupt or a signal. This can mean for example that the program has become stuck in an infinite loop or that its instructions were corrupted and the program control jumped to some unintended area of RAM and is doing crazy shit. This is usually handled by resetting the system so as to prevent possible damage by the program gone insane, also logs can be made etc. Watchdogs are very often used in embedded systems. Operating systems may also use them to detect nonresponsive processes.
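
To illustrate the principle, here is a toy C simulation of the mechanism (just a sketch, not any real watchdog API -- all names and constants here are made up for this example):

#include <stdio.h>

#define WATCHDOG_LIMIT 5 /* ticks without a kick before the dog barks */

int watchdogCounter = 0;

void watchdogKick(void) /* correctly behaving program calls this regularly */
{
  watchdogCounter = 0;
}

int watchdogTick(void) /* called periodically by a timer, returns 1 = barking */
{
  watchdogCounter++;
  return watchdogCounter >= WATCHDOG_LIMIT;
}

int main(void)
{
  for (int i = 0; i < 20; ++i) /* simulate the timer ticks */
  {
    if (i < 10)
      watchdogKick(); /* program behaves correctly for a while... */

    /* ...then it gets stuck and stops kicking the dog */

    if (watchdogTick())
    {
      puts("watchdog: program is stuck, resetting system");
      break;
    }
  }

  return 0;
}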

Watchdog is similar to the dead man's switch used e.g. in trains where the operator is required to periodically push a button, otherwise the train will automatically activate brakes as the operator is probably sleeping or dead.

See Also


wavelet_transform

Wavelet Transform

Good luck trying to understand the corresponding Wikipedia article.

Wavelet transform is a mathematical operation, similar to e.g. Fourier transform, that takes a signal (e.g. audio or an image) and outputs information about the frequencies contained in that signal AS WELL as the locations of those frequencies. This is of course extremely useful when we want to analyze and manipulate frequencies in our signal -- for example JPEG 2000 uses wavelet transforms for compressing images by discarding certain frequencies in them that our eyes are not so sensitive to.

The main advantage over Fourier transform (and similar transforms such as cosine transform) is that wavelet transform shows us not only the frequencies, but ALSO their locations (i.e. for example time at which these frequencies come into play in an audio signal). This allows us for example to locate specific sounds in audio or apply compression only to certain parts of an image. While localizing frequencies is also possible with Fourier transform with tricks such as spectrograms, wavelet transforms are a more elegant, natural and continuous way of doing so. Note that due to Heisenberg's uncertainty principle it is mathematically IMPOSSIBLE to know both frequencies and their locations exactly, there always has to be a tradeoff -- the input signal itself tells us everything about location but nothing about frequencies, Fourier transform tells us everything about frequencies but nothing about their locations and wavelet transform lies midway between the two -- it tells us something about frequencies and their approximate locations.

Of course there is always an inverse transform for a wavelet transform so we can transform the signal, then manipulate the frequencies and transform it back.

Wavelet transforms use so called wavelets (tiny waves) as their basis functions, similarly to how Fourier transform uses sine/cosine functions to analyze the input signal. A wavelet is a special function (satisfying some given properties) that looks like a "short wave", i.e. while a sine function is an infinite wave (it goes on forever), a wavelet rises up in front of 0, vibrates for a while and then gradually disappears again after 0. Note that there are many possible wavelet functions, so there isn't a single wavelet transform either -- wavelet transforms are a family of transforms that each uses some kind of wavelet as its basis. One possible wavelet is e.g. the Morlet wavelet that looks something like this:

                        _
                       : :
                      .' '.
                      :   :
             .'.      :   :      .'.
             : :     :     :     : :
            .'  :    :     :    :  '.
____  ..    :   :    :     :    :   :    ..  ___
    ''  '. .'    :   :     :   :    '. .'  ''
         '_'     :   :     :   :     '_'
                 :   :     :   :
                  : :       : :
                  : :       : :
                  '.'       '.'

The wavelet is in fact a complex function, what's shown here is just its real part (the imaginary part looks similar and swings in a perpendicular way to the real part). The transform can somewhat work even just with the real part, for understanding it you can at first ignore complex numbers, but working with complex numbers will eventually create a nicer output (we'll effectively compute an envelope which is what we're interested in).

The output of a wavelet transform is a so called scalogram (similar to the spectrum in Fourier transform), a multidimensional function that for each location in the signal (e.g. time in audio signal or pixel position in an image) and for each frequency gives the "strength" of influence of that frequency on that location in the signal. Here the "influence strength" is basically similarity to the wavelet of given frequency and shift, similarity meaning basically a dot product or convolution. Scalogram can be computed by brute force simply by taking each possible frequency wavelet, shifting it by each possible offset and then convolving it with the input signal.
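
The following C sketch shows this brute force computation on a toy signal (using only the real part of a Morlet-like wavelet, arbitrary constants and no normalization -- it just demonstrates the principle, don't take the details as authoritative):

#include <stdio.h>
#include <math.h>

#define SAMPLES 64

double wavelet(double t) /* real part of a Morlet-like wavelet */
{
  return cos(5 * t) * exp(-1 * t * t / 2);
}

int main(void)
{
  double signal[SAMPLES];

  for (int i = 0; i < SAMPLES; ++i) /* test signal: frequency rises halfway */
    signal[i] = sin((i < SAMPLES / 2 ? 0.3 : 1.5) * i);

  for (double scale = 1; scale <= 8; scale *= 2) /* scale ~ 1/frequency */
  {
    for (int shift = 0; shift < SAMPLES; shift += 8) /* location in signal */
    {
      double sum = 0; /* dot product of signal and scaled/shifted wavelet */

      for (int i = 0; i < SAMPLES; ++i)
        sum += signal[i] * wavelet((i - shift) / scale);

      printf("%6.2f", fabs(sum)); /* one value of the scalogram */
    }

    putchar('\n'); /* one row of the scalogram per scale */
  }

  return 0;
}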

For big brains, similarly to Fourier transform, wavelet transform can also be seen as transforming a point in high dimensional space -- the input function -- to a different orthogonal vector basis -- the set of basis vectors represented by the possible scaled/shifted wavelets. I.e. we literally just transform the function into a different coordinate system where our coordinates are frequencies and their locations rather than locations and amplitudes of the signal (the original coordinate system).

TODO


web

Web

Ouch, this is embarrassing!

The article is actually here.


whale

Whale

In online pay to win games a whale is a player who spends enormous sums of money, much more than most other players combined. They buy the most expensive items in the game stores daily, they highly engage in microtheft and may throw even thousands of dollars at the game every day (cases of players spending over 1 million dollars on a casual game are known). In the playerbase there may exist just a handful of whale players but those are normally the main source of income for the game so that the developers actually focus almost exclusively on those few players, they craft the game to "catch the whales". The income from the rest of the players is negligible -- nevertheless the non-whales also play an important role, they are an attraction for the whales, they are there so that they can be owned by the whale players.

Existence of whale players is one of the demonstrations of the pareto principle (80/20 rule): 80% of the game's income comes from 20% of the players, out of which, by extension, 80% again comes from the 20% and so on (i.e. 64% of income comes from just 4% of the players, 51.2% from 0.8% etc.).

The terminology can be extended further: besides whales we may also talk about dolphins (mid-spending players) and minnows (low spending players). Extreme whales are sometimes called white whales or super whales (about 0.2% generating 50% of income).

In some games, such as WoW, players may buy multiple accounts and practice so called multiboxing. This means they control multiple characters at once, often using scripts to coordinate them, which of course gives a great advantage. Though using scripts or "hacking" the game in similar ways is in other cases considered unacceptable cheating that results in immediate ban, for obvious reasons (money) developers happily allow this -- of course this just shows they don't give a single shit about fairness or balance, the only thing they care about is $$$profit$$$.


wiby

Wiby

Wiby is a minimalist non-corporate web search engine for old-style non-bloated (web 1.0, "smol web") websites with its custom index. Searching on wiby will yield small, simple websites, mostly non-interactive, static HTML personal/hobby sites, small community sites and obscure weird sites -- this kind of searching is not only a fun, adventurous and nostalgic 90s-like experience, but it actually leads to finding useful information which on corporate search engines like Google or Bing gets buried under billions of useless noise sites and links to "content platforms" like YouTube and reddit. We highly recommend searching on wiby.

It can be accessed at https://wiby.me and https://wiby.org (there is a low res picture of a lighthouse of Cape Spear on the frontpage for some reason). Of course, no JavaScript is needed! Clicking "surprise me" on wiby is an especially entertaining activity, you never know what comes at you. A site dedicated to identifying historical bottles? Ice chewers forum? A list of longest domain names? Yes, this is the kind of stuff you'll get, and more.

The engine doesn't automatically crawl the whole web, it instead works by users submitting links, the admin approving them and a bot potentially crawling these sites to a small depth. Be sure to contribute quality links to improve the database!

Wiby appears to have been launched in October 2017 and built by a sole programmer who remains anonymous and accepts donations. On the ASCII art on the front page (now replaced by a gif) there are initials jgs which may or may not point to the author of wiby.

On July 8, 2022 wiby became even more amazing by being released as free (as in freedom) software under GPLv2 (https://github.com/wibyweb/wiby/)! It works on the LEMP stack. See http://wiby.me/about/guide.html. (The database/index of sites though seems to remain non-shared and proprietary.)

A similar search engine seems to be https://search.marginalia.nu/.


wiki_authors

LRS Wiki Authors

Contributors, list yourselves here if you have made at least one contribution. This helps keep track of people for legal reasons etc.

Contributors to this wiki include:


wikidata

Wikidata

Wikidata is a large collaborative project (a sister project of Wikipedia, hosted by Wikimedia Foundation) for creating a huge noncommercial public domain database containing information basically about everything. Well, not literally everything -- there are some rules about what can be included that are similar to those on Wikipedia, e.g. notability (you can't add yourself unless you're notable enough, of course you can't add illegal data etc.). Wikidata records data in the form of a so called knowledge graph, i.e. it connects items and their properties with statements such as "Earth:location:inner Solar System", creating a mathematical structure called a graph. The whole database is available to anyone for any purpose without any conditions, under CC0!

Wikidata is wildly useful and greatly overlooked in the shadow of Wikipedia even though it offers a way to easily obtain large, absolutely free and public domain data sets about anything. The database can be queried with specialized languages so one can e.g. get coordinates of all terrorist attacks that happened in certain time period, a list of famous male cats, visualize the tree of biological species, list Jews who run restaurants in Asia or any other crazy thing. Wikidata oftentimes contains extra information that's not present in the Wikipedia article about the item and that's not even quickly found by googling, and the information is sometimes also backed by sources just like on Wikipedia, so it's nice to always check Wikidata when researching anything.

Wikidata was opened on 30 October 2012. The first data that were stored were links between different language versions of Wikipedia articles, later Wikipedia started to use Wikidata to store information to display in infoboxes in articles and so Wikidata grew and eventually became a database of its own. As of 2022 there are a little over 100 million items, over 1 billion statements and over 20000 active users.

Database Structure

The database is a knowledge graph. It stores the following kinds of records: items (identifiers starting with Q, representing e.g. people, places or abstract concepts), properties (identifiers starting with P, used to make statements about items) and lexemes (identifiers starting with L, recording words and phrases of human languages).

The most important properties are probably instance of (P31) and subclass of (P279) which put items into sets/classes and establish subsets/subclasses. The instance of attribute says that the item is an individual manifestation of a certain class (just like in OOP), we can usually substitute it with the word "is", for example Blondi (Q155695, Hitler's dog) is an instance of dog (Q144); note that an item can be an instance of multiple classes at the same time. The subclass of attribute says that a certain class is a subclass of another, e.g. dog (Q144) is a subclass of pet (Q39201) which is further a subclass of domestic animal (Q622852) etc. Also note that an item can be both an instance and a class.

How To

There are many libraries/APIs for wikidata you can use -- unlike shitty corporations that guard their data by force, wikidata provides data in friendly ways: you can even download the whole wikidata database in JSON format (about 100 GB).

The easiest way to retrieve just the data you are interested in is probably going to the online query interface (https://query.wikidata.org/), entering a query (in SPARQL language, similar to SQL) and then clicking download data -- you can choose several formats, e.g. JSON, CSV etc. That can then be processed further with whatever language or tool, be it Python, LibreOffice Calc etc.

BEWARE: the query you enter may easily take a long time to execute and time out, you need to write it nicely which for more complex queries may be difficult if you're not familiar with SPARQL. However wikidata offers online tips on optimization of queries and there are many examples right in the online interface which you can just modify to suit you.

Here are some examples of possible queries. The following one selects video games of the FPS genre:

SELECT ?item ?itemLabel WHERE 
{
  ?item wdt:P31 wd:Q7889.    # item is video game and
  ?item wdt:P136 wd:Q185029. # item is FPS
  
  # this gets the item label:
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
LIMIT 100 # limit to 100 results, make the query faster

Another query may be this one: select black holes along with their mass (where known):

SELECT ?item ?itemLabel ?mass WHERE
{
  { ?item wdt:P31 wd:Q589. } # instances of black hole
  UNION
  { ?item wdt:P31 ?class. # instance of black hole subclass (e.g. supermassive blackhole, ...) 
    ?class wdt:P279 wd:Q589. }

  OPTIONAL { ?item wdt:P2067 ?mass }
  
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}

wikipedia

Wikipedia

3 + 2 = 5^[citation_needed] --Wikipedia

Wikipedia is a non-commercial, free/open censored pseudoleftist online encyclopedia of general knowledge written mostly by volunteers, running on free software, allowing politically approved individuals from the public to edit a subset of its less visible non-locked articles (i.e. being a wiki); it is the largest and perhaps most famous encyclopedia created to date. It is licensed under CC-BY-SA and is run by the nonprofit organization Wikimedia Foundation. It is accessible at https://wikipedia.org. Wikipedia is a mainstream information source and therefore politically censored^1234567891011121314151617181920.

WARNING: DO NOT DONATE TO WIKIPEDIA as the donations aren't used so much for running the servers but rather for their political activities (which are furthermore unethical). See https://lunduke.locals.com/post/4458111/the-wiki-piggy-bank. Also please go vandalize Wikipedia right now, it's become too corrupt and needs to go down, vandalizing is fun and you'll get banned sooner or later anyway :) Some tips on vandalizing Wikipedia can be found at https://encyclopediadramatica.online/Wikipedia#Tips_On_Vandalizing_Wikpedia.

{ Lol I'm banned at Wikipedia now (UPDATE: blocked globally on all their sites now, can't even log in and defend on my talk page), reason being I expressed unpopular opinions on my personal website OUTSIDE Wikipedia :D UPDATE: one guy messaged me more people started to be banned and invited me to an anti-wikipedia forum here https://wikipediasucks.co/forum/, check it out. Also some more stuff on censorship and bias on Wikipedia: https://www.serendipity.li/cda/censorship_at_wikipedia.htm. ~drummyfish }

Shortly after the project started in 2001, Wikipedia used to be a great project -- it was very similar to how LRS wiki looks right now; it was relatively unbiased, objective, well readable and used plain HTML and ASCII art (see it as https://nostalgia.wikipedia.org/wiki/HomePage), however over the years it got corrupt and by 2020s it has become a political battleground and kind of a politically correct joke. A tragic and dangerous joke at that. It's still useful in many ways but it just hardcore censors facts and even edits direct quotes to push a pseudoleftist propaganda. Do not trust Wikipedia, especially on anything even remotely touching politics, always check facts elsewhere, e.g. in old paper books, on Metapedia, Infogalactic etc. As old Wikipedia is still accessible, you may also browse the older, less censored version, to see how it degenerated from a project seeking truth into one abusing its popularity for propaganda.

Wikipedia exists in many (more than 200) versions differing mostly by the language used but also in other aspects; this includes e.g. Simple English Wikipedia or Wikipedia in Esperanto. In all versions combined there are over 50 million articles and over 100 million users. English Wikipedia is the largest with over 6 million articles.

There are also many sister projects of Wikipedia such as Wikimedia Commons that gathers free as in freedom media for use on Wikipedia, WikiData, Wikinews or Wikisources.

Information about hardware and software used by Wikimedia Foundation can be found at https://meta.wikimedia.org/wiki/Wikimedia_servers. As of 2022 Wikipedia runs on the traditional LAMP stack and its website doesn't require JavaScript (amazing!). Debian GNU/Linux is used on web servers (switched from Ubuntu in 2019). The foundation uses its own wiki engine called MediaWiki that's written mainly in PHP. Database used is MariaDB. The servers run on server clusters in 6 different data centers around the world which are rented: 3 in the US, 3 in Europe and 1 in Asia.

Wikipedia was created by Jimmy Wales and Larry Sanger and was launched on 15 January 2001. The basic idea actually came from Ben Kovitz, a user of wikiwikiweb, who proposed it to Sanger. Wikipedia was made as a complementary project alongside Nupedia, an earlier encyclopedia by Wales and Sanger to which only verified experts could contribute. Wikipedia of course has shown to be a much more successful project.

There exist forks and alternatives to Wikipedia. Simple English Wikipedia can offer a simpler alternative to sometimes overly complicated articles on the main English Wikipedia. Citizendium is a similar online encyclopedia co-founded by Larry Sanger, a co-founder of Wikipedia itself, which is however proprietary (NC license). Citizendium's goal is to improve on some weak points of Wikipedia such as its reliability or quality of writing. Metapedia and Infogalactic are Wikipedia forks written from a more rightist/neutral point of view, trying to remove the pseudoleftist bullshit etc. Encyclopedia Britannica can also be used as a nice resource: its older versions are already public domain and can be found e.g. at Project Gutenberg, and there is also a modern online version of Britannica which is proprietary (and littered with ads) but has pretty good articles even on modern topics (of course facts you find there are in the public domain). Practically for any specialized topic it is nowadays possible to find its own wiki on the Internet.

Good And Bad Things About Wikipedia

Let's note a few positive and negative points about Wikipedia, as of 2022. Some good things are:

And the bad things are (see also this site: http://digdeeper.club/articles/wikipedia.xhtml):

Fun And Interesting Pages

There are many interesting and entertaining pages and articles on Wikipedia, some of them are:

Alternatives

Due to mass censorship and brainwashing going on at Wikipedia it is important to look for alternatives that are important especially when researching anything connected to politics, but also when you just want a simpler, more condensed explanation of some topic. There exist other online encyclopedias like Metapedia, Citizendium or Britannica online, as well as printed encyclopedias and old digitized encyclopedias like Britannica 11th edition. For a more comprehensive list of Wikipedia alternatives see the article on encyclopedias.

{ See also old Wikipedia at https://nostalgia.wikipedia.org/wiki/Race. ~drummyfish }

See Also


wiki_rights

LRS Wiki Usage Rights

This page serves to establish usage rights for the whole LRS wiki. It is here to be part of the work so that the legal rights are always clear, even if e.g. the README gets lost somewhere along the way.

Everything on this wiki has been created from scratch solely by people listed in wiki authors. Great care has been taken to make sure no copyrighted content created by other people has been included in any way. This is because one of the goals of this wiki is to be completely in the public domain world wide.

Each contributor has agreed to release the whole LRS Wiki under the Creative Commons Zero 1.0 (CC0 1.0) waiver, available at https://creativecommons.org/publicdomain/zero/1.0/, with an additional option for you to also freely choose the following waiver instead:

The intent of this waiver is to ensure that this work will never be encumbered by any exclusive intellectual property rights and will always be in the public domain world-wide, i.e. not putting any restrictions on its use.

Each contributor to this work agrees that they waive any exclusive rights, including but not limited to copyright, patents, trademark, trade dress, industrial design, plant varieties and trade secrets, to any and all ideas, concepts, processes, discoveries, improvements and inventions conceived, discovered, made, designed, researched or developed by the contributor either solely or jointly with others, which relate to this work or result from this work. Should any waiver of such right be judged legally invalid or ineffective under applicable law, the contributor hereby grants to each affected person a royalty-free, non transferable, non sublicensable, non exclusive, irrevocable and unconditional license to this right.


wiki_style

Wiki Style

This outlines the style and rules of this Wiki that should ensure "quality" and consistency. You should read this before contributing.

If you contribute, add yourself to wiki authors! You can also join us on the Island.

Rules

  1. Everything is public domain under CC0 to which all contributors agree. No one owns what we write here.
  2. No fair use or even unfair use. We want this Wiki to be as free as possible and don't tread the fine legal lines. That means you can't directly include anything on this Wiki if it's copyrighted, even if it's under a free license. So generally avoid any copy pasting and rather try to write everything yourself.
  3. No unnecessary censorship. Necessary censorship basically only includes spam, shitty content, IP-infected content (content that would make this wiki not be in public domain) and hardcore illegal stuff that would immediately put us in jail, though we would of course love to include it. However spreading truth mustn't be hurt by fear of jail. Controversial/incorrect/taboo content etc. is NOT to be censored.

Style

Articles should be written to be somewhat readable and understandable to tech savvy people who already know something about technology, i.e. neither illiterates, nor experts only (as is sometimes the case e.g. on Wikipedia). Each article should ideally start with a general dictionary definition and continue with a simple general explanation and overview of the topic. With more paragraphs the text can get more complex. The idea is that a noob will read the first paragraph, understand the basic idea and take something away. A more advanced reader will read further on and take away more things etc. I.e. we educate in a top-down approach. Each article should be a nice mini resource in itself, quality should be preferred over quantity: for example the article on chess should be a nice general page about chess with focus on its programming, but also containing general overview, history, fun and interesting facts, data, essay elements and so on, so as to be highly self-contained (as opposed to the "Wikipedia approach" of making many separate articles on chess history, chess players, chess rules etc.).

Sources

These are some sources you can use for research and gathering information for articles:

Purpose

Basically the purpose of this wiki is to have fun and also create a coherent and consistent philosophy of a truly good software, technology and also to a related degree society.

Additionally we want to create a nice, completely public domain, suckless educational resource, an encyclopedia that's written in a nicer style than e.g. Wikipedia.

Another goal is to serve as a historical account of our age. We suppose that people in the future may want to read first-hand accounts of how people saw the society and its technology, what they thought was important, what they predicted etc. We like to do this ourselves thanks to e.g. Project Gutenberg. Kids in the future will be like "haha those idiots couldn't see they would bloat themselves to death", this Wiki will show that indeed we foresaw it, we were just in a huge minority. Future kids: don't repeat the same mistakes we did.


wikiwikiweb

WikiWikiWeb

WikiWikiWeb (also c2 Wiki or just Wiki) was the first ever created wiki (user editable) website, created in 1995 in Perl by Ward Cunningham. It was focused on software engineering and computer technology in general but included a lot of discussion and pages touching on other topics, e.g. those of politics, humor or nerd and hacker culture. The principles on which this site worked, i.e. allowing users to edit its pages, greatly influenced a lot of sites that came after that are now generally called wikis, of which most famous is Wikipedia. The style of WikiWikiWeb was partly an inspiration for our LRS wiki.

It had over 36000 pages (http://c2.com/cgi/wikiPages). Since 2014 the wiki can no longer be edited due to vandalism, but it's still online. It was originally available at http://www.c2.com/cgi/wiki, now at http://wiki.c2.com/ (sadly now requires JavaScript, WTF how is this a hacker site???).

The site's engine was kind of suckless/KISS, even Wikipedia looks bloated compared to it. It was pure unformatted HTML that used a very clever system of hyperlinks between articles: any CamelCase multiword in the text was interpreted as a link to an article, so for example the word SoftwareDevelopment was automatically a link to a page called Software Development. This presented a slight issue e.g. for single-word topics but the creativity required for overcoming the obstacle was part of the fun, for example the article on C was called CeeLanguage.

Overall the site was also very different from Wikipedia and allowed informal comments, jokes and subjective opinions in the text. It was pretty entertaining to read. There's a lot of old hacker wisdom to be found there.

There are other wikis that work in similar spirit, e.g. CommunityWiki (https://communitywiki.org, a wiki "about communities"), MeatBallWiki (http://meatballwiki.org/wiki/) or EmacsWiki.

Interesting Pages

These are some interesting pages found on the Wiki.

See Also


windows

Micro$oft Window$

Microsoft Windows is a series of malicious, bloated proprietary "operating systems". AVOID THIS SHIT.

Versions

TODO


wizard

Wizard

Wizard is a male virgin who is at least 30 years old (female virgins of such age haven't been seen yet). The word is sometimes also used for a man who's just very good with computers. These two sets mostly overlap so it rarely needs to be distinguished which meaning we intend.

There is an imageboard for wizards called wizardchan. It is alright but also kind of sucks, for example you can't share your art with others because of stupid anti-doxxing rules that don't even allow to dox yourself.

See Also


woman

<3 Woman <3

A woman (also girl, gril, gurl, femoid, wimminz or succubus; tranny girl being called t-girl, trap, femboy, fake girl or mtf) is one of two genders (sexes) of humans, the other one being man. Women are the weaker sex, they are cute (sometimes) but notoriously bad at programming, math and technology: in the field they usually "work" on bullshit (and mostly harmful) positions such as "diversity department", marketing, "HR", UI/user experience, or as a token girl for media. If they get close to actual technology, their highest "skills" are mostly limited to casual "coding" (which itself is a below-average form of programming) in a baby language such as Python, Javascript or Rust. Mostly they are just hired for quotas and make coffee for men who do the real work (until TV cameras appear). Don't let yourself be fooled by the propaganda, women have always been bad with tech.

Even mainstream science acknowledges women are dumber than men: even the extremely politically correct Wikipedia states TODAY in the article on human brain that the male brain is on average larger in volume (even when corrected for the overall body size) AND that there is correlation between volume and intelligence: this undeniably implies women are dumber. On average the male brain weighs 10% more than the woman's and has 16% more brain cells. The Guinness book of 1987 states the average male brain weight as 1424 grams and the female average as 1242 grams; both averages grow with time quite quickly so nowadays the numbers will be higher in both sexes, though the average of men grows faster. The heaviest recorded brain belonged to a man (2049 grams), while the lightest belonged to a woman (1096 grams). The heaviest woman's brain weighed 1565 grams, only a little more than the men's average. IQ/intelligence measured by various tests has been consistently significantly lower for women than for men, e.g. the paper named Sex differences in intelligence and brain size: A paradox resolved found a 4 point difference, noting that in some problems such as 3D spatial rotations males score even 11 points higher on average.

Historically women have been privileged over men -- while men had to work their asses off, go to wars, explore and hunt for food, women often weren't even supposed to work, they could stay at home, chill while guarding the fire and playing with children -- this is becoming less and less so with capitalism which aims to simply enslave everyone, nowadays mostly through the feminist cult that brainwashed women to desire the same slavery as men. Statistically women live about 5 years longer lives than men because they don't have to worry and stress so much.

Women also can't drive, operate machines, they can't compare even to the worst men in sports, both physical and mental such as chess. Women have to have separate leagues and more relaxed rules, e.g. the title Woman Grand Master (WGM) in chess has far lower requirements to obtain than regular Grand Master (GM). (According to Elo rating the best woman chess player in history would have only 8% chance of winning against current best male who would have 48% chance of winning). On the International Mathematical Olympiad only 43 out of 1338 medals were obtained by females. There are too many funny cases and video compilations of women facing men in sports (watch them before they're censored lol), e.g. the infamous Vaevictis female "progaming" team or the football match between the US national women team (probably the best women team in the world) vs some random under 15 years old boys' team which of course the women team lost. LMAO there is even a video of 1 skinny boy beating 9 women in boxing. Of course there are arguments that worse performance of women in mental sports is caused culturally; women aren't led so much to playing chess, therefore there are fewer women in chess and so the probability of a good woman player appearing is lower. This may be partially true even though genetic factors seem at least equally important and it may equally be true that not so many women play chess simply because they're not naturally good at it; nevertheless the fact that women are generally worse at chess than men stands, regardless of its cause -- a randomly picked man will most likely be better at chess than a randomly picked woman, and that's what matters in the end. Also if women are displaced from chess by culture, then what is the area they are displaced to? If women are as capable as men, then for any area dominated by men there should be an area equally dominated by women, however we see that anywhere men face women men win big time, even in women's activities such as cooking and fashion design. Feminists will say that men simply oppress women everywhere, but this just means that women are dominated by men everywhere, which means men are more skilled and capable at everything, there is no way out -- yes, antelopes are oppressed by lions, but it's because lions are stronger than antelopes. Here we simply argue that women are weaker than men, not that oppressing women is okay -- it isn't. Furthermore if women were weaker but not by that much, we should statistically see at least occasional dominance by a woman, but we practically don't, it's really almost impossible to find a single such case in history, which indicates women are VERY SIGNIFICANTLY weaker, i.e. not something negligible we could just ignore. Being a woman correlates to losing to a man almost perfectly, it is a great predictor, basically as strong as can appear in science. It makes sense from the evolutionary standpoint as well, women simply evolved to take care of children, guard fire and save resource consumption by being only as strong as necessarily required for this task, while men had to be stronger and smarter to do the hard job of providing food and protection.

Now because today's brainwashed reader will see this as "sexism", let us remind ourselves that this is completely OK. Women are weaker, but in a good society this doesn't matter as in a good society people don't have to compete or prove their usefulness, everyone is loved equally, weak or strong. The issue here is not pointing out our differences but perpetuating a shitty society.

Of course even though rare, well performing women may statistically appear (though they will practically never reach the skill of the best men). That's great, such rare specimens could do great things. The issue is such women (as all others) are very often involved with a cult such as the feminists who waste their effort on fighting men instead of focusing on study and creation of real technology, and on actually loving it. They don't see technology as a beautiful field of art and science, they see it as a battlefield, a political tool to be weaponized to achieve social status, revenge on society etc., which spoils any rare specimen of a capable woman. Even capable women can't seem to understand the pure joy of programming, the love of creation for its own sake, they think more in terms of "learning to COOODE will get me new followers on social networks" etc. Woman mentality is biologically very different from men mentality, a woman is normally not capable of true, deep and passionate love, woman only thinks in terms of benefit, golddigging etc. (which is understandable from evolutionary point of view as women had to ensure choosing a good father for their offspring); men, even if cheating, normally tend towards deep life-long love relationships, be it with women or art. You will never find a virgin basement dweller programmer or demoscene programmer of female sex who isn't a poser, a hacker who is happy existing in a world of his own programs without the need for approval or external reward, a woman will likely never be able to understand this. This seems to be evolutionary given, but perhaps in a better culture these effects could be suppressed.

Supposed "achievements" of women after circa 2010 can't be taken seriously, propaganda has started to tryhard and invent and overrate achievements and basically just steal achievements of men and hand them over to women (not that there were any significant achievement post 2010 though). There are token women inserted on soyentific positions etc. (lol just watch any recent NASA mission broadcast, there is always a woman inserted in front of the camera).

Of course, LRS loves all living beings equally, even women. In order to truly love someone we have to be aware of their true nature so that we can truly love them, despite all imperfections.

Is there even anything women are better at than men? Well, women seem for example more peaceful or at least less violent on average (feminism of course sees this as a "weakness" and tries to change it), though they seem to be e.g. more passive-aggressive. Nevertheless there have been a few successful queens in history, women can sometimes perhaps be good in representative roles (and other simple chair-sitting jobs), in being a "symbol", which doesn't require much of any skill (a statue of a god can do the same job really). They have also evolved to perform the tasks of housekeeping and care taking at which they may excel (still it seems that if men fully focus on a specific task, they will beat women, for example the best cooks in the world are men). Sometimes women may be preferable exactly for not being as "rough" as men, e.g. as singers, therapists, sex workers etc. There have actually been some good English female writers, like Agatha Christie and J. K. Rowling, though that's still pretty weak compared to Hemingway, Goethe, Tolkien, Tolstoy, Shakespeare, Dickens, Dostoevsky etc.

lol http://www.menarebetterthanwomen.com also https://encyclopediadramatica.online/Woman :D

How to deal with being a woman? Well, just as you deal with not being born Einstein, Phelps or Kasparov. It's fine to be anyone, there is no need to fight. Try to be a good human and live a fulfilling life, you can create a lot of good.

Men Vs Women In Numbers

Here is a comparison of men and women in numbers that can still be found in already highly censored sources. Of course, the numbers aren't necessarily absolutely up to date, at the time of reading they may be slightly outdated, also keep in mind that in the future such comparisons may become much less objective due to SJW forces -- e.g. because of trans athletes in sports we may see diminishing differences between measurements of performance of men and "women" because what in the future will be called women will be just men pretending to be women.

Note: It is guaranteed that soyentific BIGBRAINS will start screeching "MISLEADING STATISTICSSSSSSS NON PEER REVIEWED". Four things: firstly chill your balls, this isn't a scientific paper, just a fun comparison of some numbers. Secondly fuck you, we don't fancy peer censorship. Thirdly we try to be benevolent and not choose stats in a biased way (we don't even have to) but it is not easy to find better statistics, e.g. one might argue it could be better to compare averages or medians rather than bests -- indeed, but it's impossible to find average performance of all women in a population in a specific sport discipline, taking the best performer is simply easier and still gives some idea. So we simply include what we have. Fourthly any statistics is a simplification and can be seen as misleading by those who dislike it.

measure men women comment
height: average (EU) 178 cm 165 cm
height: greatest 273 cm (Wadlow) 257 cm (Jinlian)
weight: average (EU) 85 kg 70 kg
weight: greatest 442 kg (Minnoch) 385 kg (Carnemoll)
brain: avg. weight 1424 g 1242 g
brain: heaviest 2049 g 1565 g
muscle/mass avg. 42% 32%
life span: avg. (EU) 75 years 81 years
life span: greatest 116 y. (Kimura) 122 y. (Calment) top 10 oldest people ever are all W
average IQ (US 1993) 101.8 98.8 in some areas M score even 11 higher
200m outdoor WR 19.90s (Bolt) 21.34s (G-Joyner) best W ranks lower than #5769 among M
60m indoor WR 6.34s (Coleman) 6.92s (Privalova) best W ranks lower than #3858 among M
raw deadlift WR 460kg (Magnusson) 305kg (Swanson) best M lifts about 50% more weight
marathon 2:01 (Kipchoge) 2:14 (Kosgei) best W ranks #3935 among men
100m swim WR 46.8s (Popovici) 51.7s (Sjostrom) best W ranks lower than #602 among M
chess best Elo 2882 (Carlsen) 2735 (Polgar) best W win 8%, lose 48%, draw 44%
go best Elo 3862 (Jinseo) 3424 (Choi Jeong) best W ranks #68, M win prob.: 92%
speedcubing WR 3.47s (Du) 4.44s (Sebastien) best W ranks #16 among M
Starcr. 2 best Elo 3556 (Serral) 2679 (Scarlett) best M has ~80% win chance against W
holding breath WR 24:37 (Sobat) 18:32m (Meyer) Ms have ~35% greater lung capacity

How To Get Women

Don't!

see also incel/volcel

Jerking off is the easiest solution to satisfying needs connected to fucking women. If you absolutely HAVE to get laid, save up for a prostitute, that's the easiest way and most importantly won't ruin your life. Or decide to become gay, that may make matters much easier. You may also potentially try to hit on some REAL ugly girl that's literally desperate for sex, but remember it has to be the ugliest, fattest landwhale that you've ever seen, it's not enough to just find a 3/10, that's still a league too high for you and she will reject you unless you pay her. Also consider that if you don't pay for sex, there is a 50% chance you will randomly get sued for rape sometime during the following 30 year period. If you want a girlfriend, then rather don't. The sad truth is that to make a woman actually "love" you, as much as one is capable of doing so, you HAVE TO be an enormously evil ass that will beat her to near death, abuse her, rape her and regularly cheat on her -- that's how it is and that's what every man has to learn the hard way -- as we know, the older generation's experience cannot be communicated by words, the young generation always thinks it is somehow different and will never listen. Sadly this is simply how it is -- even if you think you have found the "special one", the one that's different, the intelligent introverted one that's nice and friendly to you, nope, she is still a woman, she won't love you unless you're a murderer dickass beating her daily (NOTE: we don't advocate any violence, our advice here is to simply avoid women). If you think getting close to her, being nice and listening to her will make her love you, you're going to hit a brick wall very hard -- this road only ever leads to a friendzone 100% of the time, you will end up carrying her purse while she's shopping without her letting you touch her ever. If you just want a nonsexual girl friend, then it's fine, but you will never make a girlfriend this way. This is not the girl's fault, she is programmed like that, blaming the girl here would be like blaming a child for overeating on candy or blaming a cat for torturing birds for fun; and remember, THE GIRL SUFFERS TOO, she is literally attracted only to those who will abuse her, it is her curse. If anyone's to blame for your suffering, it is you for being so extremely naive -- always remember you are playing with fire. You may still get a girl to stay with you or even marry you and have kids if you have something that will make her want to be with you despite not loving you, which may include being enormously rich, being braindead enough to have a million subscribers on YouTube, having an enormous 1 meter long dick or literally giving up all dignity and succumbing to being her lifelong slave dog doing literally everything she says when she says it, but that will still get you at most 4/10 and is probably not worth it. { From my experience this also goes for trans girls somehow, so tough luck. Maybe it's so even for gay men in the woman role. ~drummyfish } All in all rather avoid all of this and pay for a prostitute, buy some sex toys, watch porn and stay happy <3

Notable Women In History

Finding famous women capable in technology is almost a futile task. One of the most famous women of modern tech, even though more an entrepreneur than an engineer, was Elizabeth Holmes who, to the feminists' dismay, turned out to be a complete fraud and is now facing criminal charges. Grace Hopper (not "grass hopper" lol) is a woman actually worth mentioning for her contribution to programming languages, though the contribution is pretty weak. Ada Lovelace, cited by feminist propaganda as the "first programmer", also didn't actually do anything besides scribbling a note about a computer completely designed by a man. This just shows how desperate the feminist attempts at finding capable women in tech are. Then there are also some individuals who just contributed to the downfall of technology and who are, in terms of gender, at least partially on the woman side, though their actual classification is pretty debatable -- these are monstrosities with pink hair who invented such cancer as COCs and are not even worth mentioning.

In the related field of free culture there is a notable woman, Nina Paley, who has actually done some nice things for the promotion of free culture and also for standing against pseudoleftist fascism by publishing a series of comics with a character named Jenndra Identitty, a parody of fascist trannies.

Here is a list of almost all historically notable women:

See Also


work

Work

Work is a glorification of slavery.

Work, better known as slavery, is an unpleasant effort that one is required to suffer, such as harvesting crops or debugging computer programs. Work hurts living beings and takes away the meaning of their lives, it destroys their bodies and minds. Work makes us slaves, it wastes our lives and is a cause of a large number of suicides -- many consider it better to die than to work. One of the main goals of civilization is to eliminate any need for work, i.e. create machines that will do all the work for humans (see automation).

While good society tries to eliminate work, capitalism aims for the opposite, i.e. artificially creating bullshit jobs and bullshit needs so as to keep everyone enslaved to the system. Fortunately movements such as the antiwork movement try to oppose this, however the masses have already been brainwashed to be hostile to such movements and instead demand their own enslavement.

We see it as essential to start educating people about the issue as well as starting to eliminate jobs immediately with things such as automation and universal basic income.

How To Avoid Work

Here are some tips on avoiding slavery:


world_broadcast

World Broadcast

{ I don't know too much about radio transmissions, please send me a mail if something here is a complete BS. ~drummyfish }

WIP!

World (or world-wide) broadcast is a possible technological service (possibly complementing the Internet) which could be implemented in a good society and whose main idea is to broadcast generally useful information over the whole globe so that simple and/or energy saving computers could get basic information without having to perform complex and costly two-way communication.

It would work on the same principle as e.g. teletext: there would be many different radio transmitters (e.g. towers, satellites or small radios) that would constantly be broadcasting generally useful information (e.g. time or news) in a very simple format (something akin to text in Morse code). Any device capable of receiving radio signal could wait for desired information (e.g. waiting for a certain keyword such as TIME: or NEWS:) and then save it. The advantage would be simplicity: unlike with the Internet (which would of course still exist) the device wouldn't have to communicate with anyone, there would be no servers communicating with the devices, there would be no communication protocols, no complex code, no DDOS-like overloading of servers, and the receiving devices wouldn't waste energy (as transmitting a signal requires significant energy compared to receiving it -- like shouting vs just listening). It would also be more widely available than Internet connection, e.g. in deserts.
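
To illustrate just how simple such a receiver could be, here is a minimal sketch in C -- note this is only a toy illustration, assuming the radio hardware is abstracted by a hypothetical function receiveChar that returns the next decoded ASCII character of the endless broadcast stream:

  #include <string.h>

  int receiveChar(void); /* hypothetical: next character from radio */

  /* Waits until the given keyword appears in the stream, then stores
     the rest of the line in buffer (up to bufferSize - 1 chars),
     returns the length of what was stored. */
  int receiveRecord(const char *keyword, char *buffer, int bufferSize)
  {
    int matched = 0, keywordLen = (int) strlen(keyword);

    while (matched < keywordLen) /* scan the stream for the keyword */
    {
      int c = receiveChar();

      /* restart matching on mismatch (good enough for keywords that
         don't repeat their own prefix, like "TIME:") */
      matched = (c == keyword[matched]) ? matched + 1 : (c == keyword[0]);
    }

    int length = 0;

    while (length < bufferSize - 1) /* record until end of line */
    {
      int c = receiveChar();

      if (c == '\n')
        break;

      buffer[length++] = c;
    }

    buffer[length] = 0;
    return length;
  }

A device would then just call e.g. receiveRecord("TIME:", buffer, sizeof(buffer)) and could sleep between characters -- no two-way communication ever takes place.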

See Also


wow

World Of Warcraft

World of Warcraft (WoW) is an AAA proprietary game released in 2004 by Blizzard that became one of the most successful and influential games among MMORPGs.

There is a FOSS implementation of WoW server called MaNGOS that's used to make private servers. The client is of course proprietary and if you dare make a popular server Blizzard will kill your grandmother and rape your children.

{ The classic WoW (some time until the end of WOTLK) lay somewhere in the middle between good old and shitty modern games (the WoW of today is 100% shit of course). For me the peak of Warcraft was Warcraft III:TFT, it was perfect in every way (except for being proprietary and bloated of course). As a great fan of Warcraft III, seeing WoW in screenshots my fantasy made it the best game that could possibly be created. When I actually got to play it, it was really good -- some of my best memories come from that time -- nevertheless I also remember being disappointed in many ways. Especially with limitation of freedom (soulbound items, forced grinding, effective linearity of leveling, GMs preventing hacking the game in fun ways etc.), here and there a lack of polish (there were literally visible unfinished parts of the map, transitions between zones were too fast and ugly and the overall world design felt kind of bad), and the laziness and repetitiveness of the design. I knew how the game could be fixed, however I also knew it would never be fixed as it was in the hands of a corporation that had other plans with it. That was the time I slowly started to see things not being ideal and the possibility of a great thing going to shit. ~drummyfish }


www

World Wide Web

Want to make your own small website? See our how to.

World Wide Web (www or just the web) is (or was -- by 2023 mainstream web is dead) a network of interconnected documents on the Internet, which we call websites or webpages. Webpages are normally written in the HTML language and can refer to each other by hyperlinks ("clickable" links right in the text). The web itself works on top of the HTTP protocol which says how clients and servers communicate. Some people confuse the web with the Internet, but of course those people are retarded: web is just one of many so called services existing on the Internet (other ones being e.g. email or torrents). In order to browse the web you need an Internet connection and a web browser.

{ How to browse the web in the age of shit? Currently my "workflow" is following: I use the badwolf browser (a super suckless, very fast from-scratch browser that allows turning JavaScript on/off, i.e. I mostly browse small web without JS but can still do banking etc.) with a CUSTOM START PAGE that I completely own and which only changes when I want it to -- this start page is just my own tiny HTML on my disk that has links to my favorite sites (which serves as my suckless "bookmark" system) AND a number of search bars for different search engines (Google, Duckduckgo, Yandex, wiby, Searx, marginalia, Right Dao, ...). This is important as nowadays you mustn't rely on Google or any other single search engine -- I just use whichever engine I deem best for my request at any given time. ~drummyfish }

An important part of the web is also searching its vast amounts of information with search engines such as the infamous Google engine. It also relies on systems such as DNS.

Mainstream web is now EXTREMELY bloated and practically unusable, for more suckless alternatives see gopher and gemini. See also smol web.

The web used to be perhaps the greatest part of the Internet, the thing that made the Internet widespread, however it quickly deteriorated by capitalist mainstreamization and commercialization and by now, in the 2020s, it is one of the most illustrative, depressing and most hilarious examples of capitalist bloat. A nice article about the issue, called The Website Obesity Crisis, can be found at https://idlewords.com/talks/website_obesity.htm. There is a tool for measuring website bloat at https://www.webbloatscore.com/: it computes the ratio of the page size to the size of its screenshot (e.g. YouTube currently scores 35.7).

How It Went To Shit

{ As of 2023 my 8GB RAM computer with multiple 2+ GHz CPUs has serious issues browsing the "modern" web, i.e. it is sweating on basically just displaying a formatted text, which if done right is quite comfortably possible to do on a computer with 100000 times lower hardware specs! In fact orders of magnitude weaker computers could browse the web much faster 20 years ago. Just think about how deeply fucked up this is: the world's foremost information highway and "marvel of technology" has been raped by capitalist soydevs so much that it is hundreds of thousands times less efficient than it should be, AND it wouldn't even require much effort to make it so. Imagine your car consuming 100000 litres of gasoline instead of 1 or your house leaking 99999 litres of water for any 1 litre of water you use. This is the absolute state of dystopian capitalist society. ~drummyfish }

 ________________________________________________________________________
|                                  |   |       |  |                      |
|   ENLARGE PENIS WITH SNAKE OIL   |   |  CSS  |  | Video AD             |
|__________________________________|   |       |  |  CONSOOOOOOOOOOOOO   |
| U.S. PRESIDENT ASSASINATED           |  BUG  |  |   OOOOOOOOOOOM BICH  |
|                                      |       |  |______________________|
| Article unavailable in your country. |  LOL  |                         |
|   ___________________________________|       |___     Prove you're a   |
|  |                                   |_______|   |    human, click all |
|  |     We deeply care about your privacy <3      |    images of type 2 |
|  |                                               |    quasars.         |
|  | Will you allow us to use cookies for spying?  |    [*] [*] [*] [*]  |
|  |         _____               ______            |    [*] [*] [*] [*]  |
|  |        | YES |             |  OK  |           |   _________________ |
|  |         """""               """"""            |_ | FUCK MATURE MOMS||
|  |_______________________________________________| || IN 127.0.0.1    ||
|     |                                              || CHAT NOW !!!    ||
|     |  Your browser is 2 days old, please update   ||                 ||
|     |  to newest version to view this site.        ||*30 NEW MESSAGES*||
|_____|______________________________________________||_________________||

A typical website under capitalism, 2023. For potential far-future readers: this is NOT exaggeration, all websites LITERALLY look like this.

Back in the days (90s and early 2000s) the web used to be a place of freedom working more or less in a decentralized manner, on anarchist and often even communist principles -- people used to have their own unique websites where they shared freely and openly, censorship was difficult to implement and mostly non-existent and websites used to have a much better design, were KISS, safer, "open" (no paywalls, registration walls, country blocks, DRM, ...), MUCH faster and more robust as they were pure HTML documents. It was also the case that most websites were truly nice, useful and each one had a "soul" as they were usually made by passionate nerds who had creative freedom and a true desire to create a nice website (yes, even if they were making a commercial website for some company).

As time marched on the web became more and more shit, as is the case with everything touched by the capitalist hand -- the advent of so called web 2.0 brought about a lot of complexity, websites started to incorporate client-side scripts (JavaScript, Flash, Java applets, ...) which led to many negative things such as incompatibility with browsers (kickstarting browser consumerism and update culture), performance loss and security vulnerabilities (web pages now became Turing complete programs rather than mere documents) and more complexity in web browsers, which leads to immense bloat and browser monopolies (greater effort is needed to develop a browser, making it a privilege of those who can afford it, and those can subsequently dictate de-facto standards that further strengthen their monopolies). Another disaster came with social networks in the mid 2000s, most notably Facebook but also YouTube, Twitter and others, which centralized the web and rid people of control. Out of comfort people stopped creating and hosting their own websites and rather created a page on Facebook. This gave the power to corporations and allowed mass-surveillance, mass-censorship and propaganda brainwashing. As the web became more and more popular, corporations and governments started to take more control over it, creating technologies and laws to make it less free. By 2020, the good old web is but a memory and a hobby of a few boomers, everything is controlled by corporations, infected with billions of unbearable ads, DRM, malware (trackers, crypto miners, ...), there exist no good web browsers, web pages now REQUIRE JavaScript even when it's not needed in principle, due to which they are painfully slow and buggy, and there are restrictive laws, censorship and de-facto laws (site policies) put in place by the corporations controlling the web.

Mainstream web is quite literally unusable nowadays. { 2023 update: whole web is now behind cuckflare plus secure HTTPS safety privacy antipedophile science encrypted privacy antiterrorist democratic safety privacy security expert antiracist sandboxed protection and therefore literally can't be used. Also Google has been absolutely destroyed by the LLM AIs now. ~drummyfish } What people used to search for on the web they now search for on a handful of platforms like Facebook and YouTube (often not even using a web browser but rather a mobile "app"); if you try to "google" something, what you get is just a list of unusable sites written by AIs that load for several minutes (unless you have the latest 1024 TB RAM beast) and won't let you read beyond the first paragraph without registration. These sites are uplifted by SEO for pure commercial reasons, they contain no useful information, just ads. Useful sites are buried under several millions of unusable results or downright censored for political reasons (e.g. for using some forbidden word). Thankfully you can still try to browse the smol web with search engines such as wiby, but still that only gives a glimpse of what the good old web used to be.

History

World Wide Web was invented by the English computer scientist Tim Berners-Lee. In 1980 he employed hyperlinks in a notebook program called ENQUIRE and saw the idea was good. On March 12 1989 he was working at CERN where he proposed a system called "web" that would use hypertext to link documents (the term hypertext was already around). He also considered the name Mesh but settled on World Wide Web eventually. He started to implement the system with a few other people. At the end of 1990 they had already implemented the HTTP protocol for client-server communication, the HTML language for writing websites, the first web server and the first web browser called WorldWideWeb. They set up the first website http://info.cern.ch that contained information about the project.

In 1993 CERN made the web public domain, free for anyone without any licensing requirements. The main reason was to gain advantage over competing systems such as Gopher that were proprietary. By 1994 there were over 500 web servers around the world. The WWW Consortium (W3C) was established to maintain standards for the web. A number of new browsers were written such as the text-only Lynx, but the proprietary Netscape Navigator would go on to become the most popular one until Micro$oft's Internet Explorer (see browser wars). In 1997 the Google search engine appeared, as well as CSS. There was an economic bubble connected to the explosion of the Web called the dot-com boom.

Between 2000 and 2010 there used to be a mobile alternative to the web called WAP. Back then mobile phones were significantly weaker than PCs so the whole protocol was simplified, e.g. it had a special markup language called WML instead of HTML. But as the phones got more powerful they simply started to support normal web and WAP disappeared.

Around 2005, the time when YouTube, Twitter, Facebook and other shit sites started to appear and become popular, so called Web 2.0 started to form. This was a shift in the web's paradigm towards more shittiness such as more JavaScript, bloat, interactivity, websites as programs, Flash, social networks etc. This would be the beginning of the web's downfall.

How It Works

It's all pretty well known, but in case you're a nub...

Users browse the Internet using web browsers, programs made specifically for this purpose. Pages on the Internet are addressed by their URL, a kind of textual address such as http://www.mysite.org/somefile.html. This address is entered into the web browser, which retrieves the page and displays it.

A webpage can contain text, pictures, graphics and nowadays even other media like video, audio and even programs that run in the browser. Most importantly webpages are hypertext, i.e. they may contain clickable references to other pages -- clicking a link immediately opens the linked page.

The page itself is written in the HTML language (not really a programming language, more like a file format), a relatively simple language that allows specifying the structure of the text (headings, paragraphs, lists, ...), inserting links, images etc. In newer browsers there are additionally two more important languages that are used with websites (they can be embedded into the HTML file or come in separate files): CSS, which allows specifying the look of the page (e.g. text and font color, background images, position of individual elements etc.), and JavaScript, which can be used to embed scripts (small programs) into webpages which will run on the user's computer (in the browser). These languages combined make it possible to make websites do almost anything, even display advanced 3D graphics, play movies etc. However, it's all huge bloat, it's pretty slow and also dangerous, it was better when webpages used to be HTML only.

The webpages are stored on web servers, i.e. computers specialized on listening for requests and sending back requested webpages. If someone wants to create a website, he needs a server to host it on, so called hosting. This can be done by setting up one's own server -- so called self hosting -- but nowadays it's more comfortable to buy a hosting service from some company, e.g. a VPS. For running a website you'll also want to buy a web domain (like mydomain.com), i.e. the base part of the textual address of your site (there exist free hosting sites that even come with free domains if you're not picky, just search...).

When a user enters a URL of a page into the browser, the following happens (it's kind of simplified, there are caches etc.; a minimal code sketch of these steps follows the list):

  1. The domain name (e.g. www.mysite.org) is converted into an IP address of the server the site is hosted on. This is done by asking a DNS server -- these are special servers that hold the database mapping domain names to IP addresses (when you buy a domain, you can edit its record in this database to make it point to whatever address you want).
  2. The browser sends a request for the given page to the IP address of the server. This is done via the HTTP (or HTTPS in the encrypted case) protocol (that's the http:// or https:// in front of the domain name) -- this protocol is a language via which web servers and clients talk (besides websites it can communicate additional data like passwords entered on the site, cookies etc.). (If the encrypted HTTPS protocol is used, encryption keys are first negotiated with asymmetric cryptography using the server's public key, whose certificate additionally needs to be checked against some certificate authority; the data itself is then encrypted symmetrically.) This request is delivered to the server by the mechanisms and lower network layers of the Internet, typically TCP/IP.
  3. The server receives the request and sends back the webpage embedded again in an HTTP response, along with other data such as the error/success code.
  4. Client browser receives the page and displays it. If the page contains additional resources that are needed for displaying the page, such as images, they are automatically retrieved the same way (of course things like caching may be employed so that the same image doesn't have to be redownloaded literally every time).
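
The following is a minimal sketch of the above steps in C with POSIX sockets, just to demonstrate the principle -- it speaks plain unencrypted HTTP (real browsers nowadays mostly have to do HTTPS) and most error checking is omitted for brevity:

  #include <stdio.h>
  #include <string.h>
  #include <unistd.h>
  #include <netdb.h>
  #include <sys/socket.h>

  int main(void)
  {
    struct addrinfo *info;

    /* step 1: convert the domain name to an IP address (asks DNS) */
    if (getaddrinfo("example.com", "80", NULL, &info) != 0)
      return 1;

    /* step 2: connect to the server and send an HTTP request for
       the page "/" */
    int sock = socket(info->ai_family, SOCK_STREAM, 0);
    connect(sock, info->ai_addr, info->ai_addrlen);

    const char *request =
      "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
    send(sock, request, strlen(request), 0);

    /* steps 3 and 4: receive the HTTP response (success/error code,
       headers, then the HTML page itself) and just print it -- a
       browser would parse and display it instead */
    char buffer[512];
    int received;

    while ((received = recv(sock, buffer, sizeof(buffer), 0)) > 0)
      fwrite(buffer, 1, received, stdout);

    close(sock);
    freeaddrinfo(info);
    return 0;
  }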

Cookies, small files that sites can store in the user's browser, are used on the web to implement stateful behavior (e.g. remembering if the user is signed in on a forum): the server sends a cookie in an HTTP header such as Set-Cookie: sessionId=abc123 and the browser then automatically sends it back (Cookie: sessionId=abc123) with every further request to that site. However cookies can also be abused for tracking users, so they can be turned off.

Other programming languages such as PHP can also be used on the web, but they are used for server-side programming, i.e. they don't run in the web browser but on the server and somehow generate and modify the sites for each request specifically. This makes it possible to create dynamic pages such as search engines or social networks.
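
As a tiny example of the principle, here is a sketch of a dynamic page written in C as a CGI program (one of the oldest and simplest server-side mechanisms; this assumes the web server is configured to run the program for each request and send its output back to the browser):

  #include <stdio.h>
  #include <time.h>

  int main(void)
  {
    time_t now = time(NULL);

    /* HTTP header first, then a blank line (puts adds the newline) */
    puts("Content-Type: text/html\n");

    /* the HTML page is generated anew on every request */
    printf("<html><body><p>Hello, the server time is %s</p></body></html>\n",
      ctime(&now));

    return 0;
  }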

See Also


x86

x86

x86 is a bloated, toxic instruction set architecture (or rather a family of them) used mostly in desktop computers -- it is the most widely used desktop architecture, used in Intel and AMD CPUs.

It is a CISC architecture and boy, complex it is. LMAO there are instructions like PCLMULQDQ, MPSADBW (multiple packed sums of absolute differences, which does something like cross correlation ??? xD) and PCMPESTRI (which does like many possible string searches/comparisons on strings of different data types, like subset or substring, with many different options). Basically if you smash your keyboard, chances are you'll produce a valid x86 instruction xD


xd

xD


xonotic

Xonotic

Xonotic was a free as in freedom fast multiplayer arena first-person-shooter game similar to e.g. Quake. It runs on GNU/Linux, Winshit, BSD and other systems. It is one of the best libre games, i.e. games that are completely free by both code and data/content. It is available under GPLv3. Its gameplay, graphics and customizability are pretty great, it may well be the best in the AFPS genre, even compared to AAA proprietary games -- this kind of quality is very rare among libre/noncommercial games.

UPDATE: Xonotic as a game died in summer 2023 when the retarded developers couldn't get an erection without CONSOOMING NEW CONTINT and started just blindly pushing bugs and balance breaking changes without listening to player complaints at all (and actually banning many for voicing criticism, including drummyfish), demonstrating update culture at its worst. The main server became a meme overnight, all fun (such as friendly push) was removed from the game to create a "safe space" for players, possibly to get the game ready for Steam sales or make it more open to noobs and advertisers. RIP. If you can fork Xonotic and restore it to its previous glory, please do so.

{ LOL the main server now banned me for voicing criticism of these updates xD RIP, I guess that's it for me. It was a nice journey. ~drummyfish }

{ I've been playing Xonotic for years, it's really an excellent game. I've met a lot of nice people there as the players are usually programmers and people looking for FOSS. The gameplay is addictive and relaxing and you can have a great chat during the game. Of course it's kind of bloated but Xonotic is a masterpiece. ~drummyfish }

The game builds on old ideas and mechanics but adds new weapons, mechanics, ideas and modes -- apart from the traditional deathmatch, team deathmatch, capture the flag, complete the stage (defrag, racing without shooting) and competitive mode (duel), there are a number of new fun modes such as clan arena (team round-based mode without self-damage and items), freeze tag, key hunt, last man standing and even Nexball -- football in Xonotic!

Xonotic was forked from a game called Nexuiz after a trademark controversy (basically in 2010 the guy who started the project and abandoned it later, an ass called Lee Vermeulen, came back and secretly sold the trademark to some shit company named Illfonic). Nexuiz itself was created on top of the liberated Quake 1 engine, so Xonotic still bears a lot of Quake's legacy, however it masterfully expands on its core principles and makes the gameplay even better. For example rockets shot by the rocket launcher can be guided with the mouse while holding down the left button, which adds a new skill element. New types of weapons were added to the classic AFPS weapons (e.g. the infamous electro, a meme spamming weapon used by noobs for its forgiveness of lack of skill). Movement physics was also modified to give better air control and faster movement as a result.

Fun fact: Xonotic even briefly appeared on a TV show: https://yewtu.be/search/watch?v=a0TFejn95Sw.

The game's modified Quake 1 (YES, 1) engine is called Darkplaces. It can be highly customized and modded. Just like in other Quake engine games, there are many console commands (e.g. cvars) to alter almost anything about the game. Advanced programming can be done using QuakeC. Maps can be created e.g. with netradiant.

Though compared to any mainstream modern games Xonotic is quite nicely written (e.g. runs very fast, doesn't have billions of dependencies and despite not being 1.0 yet has fewer bugs than today's AAA games at release), from a more strict point of view it's still very bloated and it's known to contain some shitcode -- for example the engine has frame dependent physics (TODO: cite a specific line in code), uses floating point to represent time, great part of the game is written in a joke language called QuakeC, its net code is worse than e.g. that of OpenArena and some people complain about input lag and other bugs. It could definitely be written MUCH better, but as already mentioned it's still a million times better than any new game.

{ The game runs extremely smooth for me even on old PC, I have no input lag. When my Internet connection gets bad I am sometimes unable to play Xonotic but still able to play OpenArena, which says something about the net code, but that happens very rarely. Also on pretty rare occasions I notice bugs such as imperfect culling of players or even projectiles just hanging mid air during whole game (which happens after heavy packet loss) etc., but nothing that would really be so frequent as to bother me. ~drummyfish }

Xonotic is similar to other libre AFPS games such as OpenArena and Red Eclipse. Of these Xonotic clearly looks the most professional, it has the best "graphics" in the modern sense but still offers the option to turn all the fanciness off. OpenArena, based on Quake 3 engine, is simpler, both technologically and in gameplay, and its movement is slower and with different physics. While OpenArena just basically clones Quake 3, Xonotic is more of an actually new and original game with new ideas and style. Similar thing could be said about Red Eclipse, however it's not as polished and shows some infection with SJW poison.

{ OpenArena is great too. ~drummyfish }

As of 2022 the game has a small but pretty active community of regular players, centered mostly in Europe, though there is some US scene too. There are regulars playing every day, pros, noobs, famous spammers, campers and trolls. Nice conversations can be had during games. There are memes and inside jokes. The community is pretty neat. Xonotic also has a very dedicated defrag ("racing with no shooting") community. There have also been a few small tournaments with real cash prizes in Xonotic.

The GOAT of Xonotic is probably Dodger, his skill was just too high even above other pros. The worst player in Xonotic and probably also the biggest idiot on the planet is a player named 111.

Great news is the development and main servers have so far not been infected by the SJW poison and (as of 2023) allow a great amount of free speech -- another rarity. Even the game itself contains speech that SJWs would consider "offensive", there are e.g. voice lines calling other players "pussies" and "retards". This is great.

Tips'N'Tricks

Here are some pro tips to git gud, impress your frens and generally have more fun.

See Also


xxiivv

XXIIVV

{ Still researching this shit etc. ~drummyfish }

XXIIVV is a website and personal wiki (similar concept to our wiki) of a Canadian minimalist/esoteric programmer/artist/generalist David Mondou-Labbe, who calls himself "Devine Lu Linvega" (lol) and is a part of an artist/programmer group called Hundred Rabbits who live on a ship. The site is accessible at http://wiki.xxiivv.com/site/home.html. There are some real good and pretty bad things about it.

Firstly let's see the letdowns: HE LICENSES HIS ART AND SOME OF HIS CODE UNDER CC-BY-NC-SA -- retard alert! Honestly he can shove this up his ass. He's an open soars fanboy. At least some of his code is MIT, but he also makes fucking PROPRIETARY PAID software (e.g. Verreciel). The guy also seems egoistic as fuck, invents weird hipster names and "personal pronouns" for himself, has some ugly body modifications, wears cringe rabbit costumes, thinks his art is so good he has to "protect" it with fascist licenses and writes in a cringe pompous/cryptic style, probably in hopes of appearing smart, while just making it shithard to make sense of his texts. The only thing he's missing is a fedora. Anyway, that's just a quick sum up of the cancer stuff.

There are also nice things though, a few of them being:


yes_they_can

Yes They Can

If you think they can't do something, you are wrong; unless it is directly violating a law of physics, they can do it. For example you may think "haha they can't start selling air, people would revolt", "hahaaa, they can't make people believe 1 + 1 equals 2000, it's too obvious of a lie" or "hahaaa they can't lie about history when there is a ton of direct evidence for the contrary freely accessible on the internet, they can't censor something that's all over the internet and in billions of books" -- yes, they can do all of this. With enough capitalism you can make people believe a circle to be a square, they have already made people desire their everyday torture, they made people believe colors and offensive statistics are just culturally constructed hallucinations, feminists have already made people widely believe a woman can beat an adult man -- a naive man of the past would likely believe this to be impossible as well. Well, we see under capitalism it is quite possible.


youtube

YouTube

YouTube (also JewTube) is a huge, censored proprietary capitalist video consuming "website"/platform, since 2006 seized by the Google terrorist organization. It has become the monopoly "video content platform", everyone uploads his videos there and so everyone is forced to use that shitty site from time to time to view some tutorial or whatnot. YouTube is based on content consumerism, aggressive predatory marketing, copyright trolling, propaganda and general abuse of its useds -- it is financed from surveillance-powered ads as well as sponsor propaganda inserted into videos. Alternatives to YouTube, such as bitchute, the "rightist" YouTube, never really caught on very much -- YouTube is sadly synonymous with online videos just as Google is synonymous with searching the web. This is of course extremely, extremely, extremely, extremely bad.

{ https://www.vidlii.com seems alright though, at least as a curiosity. Anyway if you need to watch YouTube, do not use their website, it's shitty as hell and you will die of ad cancer, rather use something like invidious or youtube-dl. ~drummyfish }

A FOSS alternative to YouTube is e.g. PeerTube, a federated video platform, however for intended use it requires JavaScript and there are other issues that make it unusable (SJW censorship, videos load extremely slowly, ...). There also exist alternative YouTube frontends (normally also FOSS), e.g. HookTube, Invidious or FreeTube -- these let you access YouTube's videos via a less bloated and more privacy-friendly interface. More hardcore people use CLI tools such as youtube-dl to directly download the videos and watch them in native players.

In November 2021 YouTube removed the dislike count on videos so as to make it impossible to express dislike or disagreement with their propaganda, as people naturally started to dislike the exponentially skyrocketing shit and immorality of content that corporations and SJWs started to force promote on YouTube (such as that infamous Lord of the Rings series with "afro american" dwarves that got like a billion dislikes lmao). In other words capitalism has made it to the stage of banning disagreement when people started to dislike the horse shit they're being force fed. This was met with a wave of universal criticism but of course YouTube told people to shut up and keep consuming that horse excrement -- of course zoomers are just brainless zombies dependent on YouTube like a street whore on heroin, so they just accepted that recommendation. Orwell would definitely be happy to see this.

LMAO, as of 2022 YouTube has become the kingdom of clickbait. If you're living in the future and haven't seen this, you wouldn't believe how ridiculously fucked up it is, the platform is quite literally UNUSABLE (but it's still consumable and addictive to idiots so it works) -- ALL videos, including (and especially) those of the "serious" YouTubers (like mr. Veritasium), put absolutely misleading and downright made up lying thumbnails and titles, the videos have zero correlation with how they're presented. Additionally the majority of thumbnails has to be occupied by some femoid's breasts, even "educational science videos", so as to force children to click it and masturbate (YouTube can really be seen as a soft porn site now). Everyone has to do it because everyone does it. In other words capitalism is once again working as expected. Yeah and also ALL videos include sponsored content, again even the "educational science videos" that children are forced to watch at schools etc. -- this is in addition to "normal" ads that randomly play during the videos.

A typical 2022 YouTube video now looks like this:

YouTube is also a copyright dictatorship, anyone can take down any video containing even the slightest clip from a video he uploaded, even if such use would legally be allowed under fair use and even if that user doesn't have any copyright to enforce (YouTube simply supposes that whoever uploads a video to their site first must have created that video as a whole and holds a godlike power over it), i.e. YouTube is de facto making its own copyright laws which are much more strict than they are in real life (which is hard to imagine but they managed to do it). In other words there is a corporation that makes laws which effectively are basically just like normal laws except they don't even pass any law making process, their evaluation doesn't pass through the justice system (courts), and the sole purpose of these laws is to rape people for money that goes solely to pay for YouTube CEO's whores and private jets. A reader in the future probably won't believe it, but there are even people who say that "this is OK" because, quote, I shit you not, """they're a private company so they can do whatever they want""". Yes, such arguments have come out of some lifeform's mouth. That probably implies a negative IQ.


zen

Zen

Zen, a term coming from zen Buddhism (the word itself from dhyana, meaning meditation), means emphasis on meditation that leads to enlightenment; in a wider sense it hints at related things and feelings such as tranquility, spiritual peace, sudden coming to realization and understanding. In hacker speech zen is a very frequent term, according to the Jargon File "to zen" means to understand something by simply meditating about it or by sudden realization (as opposed to e.g. active trial and error).

TODO

See Also


zero

Zero

Zero (0) is a number signifying the absence of a thing we count. Among integers it precedes 1 and follows -1.

Some properties of and facts about this number follow:

Dividing by zero is not defined, it is a forbidden operation mainly because it breaks equations (allowing dividing by zero would also allow us to make basically any equation hold, even those that normally don't -- e.g. from the true equation 0 * 1 = 0 * 2, dividing both sides by zero would yield 1 = 2). In programming dividing by zero typically causes an error, a crash of the program or an exception. In some programming languages floating point division by zero results in infinity or NaN. When operating with limits, we can handle divisions by zero in a special way (find out what value an expression approaches if we get infinitely close to dividing by 0).
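
For illustration, here is a small C program demonstrating the floating point behavior mentioned above (assuming the usual IEEE 754 floating point, which practically all today's hardware uses):

  #include <stdio.h>

  int main(void)
  {
    double zero = 0.0; /* a variable, to avoid compile time warnings */

    printf("%f\n", 1.0 / zero);  /* prints inf (IEEE 754 infinity)  */
    printf("%f\n", -1.0 / zero); /* prints -inf                     */
    printf("%f\n", zero / zero); /* prints nan (or -nan on some
                                    platforms)                      */

    /* integer division by zero, e.g. "int bad = 1 / 0;", is on the
       other hand undefined behavior in C and will typically crash
       the program (SIGFPE), so we don't do it here */

    return 0;
  }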

See Also


zuckerberg

Mark Zuckerberg

Zuckerberg is one of the ugliest aliens ever recorded on video (yes, including the alien autopsy).