Structuralism as a Philosophy of Mathematics (infinitelymore.xyz)
127 points by FillMaths 7 months ago | 71 comments



I’m not sure what a mathematical approach _without_ structuralism would look like. Like, if you consider the operations created by combining the rotations and flips of a triangle, and the set of permutations of the letters A, B, and C, it’s pretty obvious they’re isomorphic and also obvious that they’re different. My question is: is there a mathematically useful way of expressing that difference?
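
(A minimal sketch of that "obvious" isomorphism in Python, assuming the triangle's vertices are labelled 0, 1, 2 and relabelled A, B, C; the relabelling itself is the isomorphism:)

  from itertools import permutations

  # Symmetries of the triangle, written as permutations of the vertex indices 0,1,2
  # (the image of vertex i is sym[i]): three rotations and three reflections.
  rotations = [(0, 1, 2), (1, 2, 0), (2, 0, 1)]
  flips     = [(0, 2, 1), (2, 1, 0), (1, 0, 2)]
  triangle  = rotations + flips

  def compose(p, q):
      """p after q, both written as image tuples over 0,1,2."""
      return tuple(p[q[i]] for i in range(3))

  letters = ('A', 'B', 'C')

  def phi(p):
      """Relabel: vertex i becomes letters[i]."""
      return tuple(letters[p[i]] for i in range(3))

  # phi is a bijection onto the permutations of A, B, C ...
  assert set(map(phi, triangle)) == set(permutations(letters))

  # ... and it respects composition, so it is a group isomorphism.
  def compose_letters(P, Q):
      return tuple(P[letters.index(Q[i])] for i in range(3))

  assert all(phi(compose(p, q)) == compose_letters(phi(p), phi(q))
             for p in triangle for q in triangle)

Of course this only exhibits the isomorphism; what mathematically useful thing distinguishes the two presentations is exactly the question.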

Or to put it a different way, I’m not sure anything interesting is being said here.


Well, given that structuralism as a program didn't exist until basically the 19th century, you can look at the entire history of mathematics to see what mathematics without structuralism looks like. It's sort of hard to argue that the isomorphism between symmetries and permutations is "obvious" when group theory wasn't invented until the 18th century, and symmetry groups weren't formalized until the 19th century. Your comment is basically this:

One fish says to another fish, "The water's nice today," and swims off; the other fish says, "What's water?" Your entire mathematical worldview is so permeated with the language of structuralism that you can't see it any more.


I think sometimes obvious concepts like symmetries are hard to distill into the appropriate mathematical language. I’m willing to bet the isomorphism there is obvious to most folks, but the expression of its mathematical essence is not.


This is really an important point for mathematical philosophy. It is a single enterprise going back to ancient Babylon, China, India, Greece, Egypt, and before. The elementary meta-theory needs to be syntonic with mathematical activity and knowledge predating (or otherwise practiced without) formalism, structuralism, categoricity, Platonism, etc.


The point is that if you are interested in the structure of finite sets, then there are structure-preserving isomorphisms between the vertices of a triangle and the set {A, B, C} so that (for example) permutations of the set and reflections/rotations of the triangle are coincident.

But if you're interested in the structure of plane geometry then there is no such structure-preserving isomorphism because any map into a finite set will "delete" information about side length and angles.

The structuralist idea is that interesting mathematics can be "most easily" found by considering structure-preserving isomorphisms and not the structures themselves. In particular, dealing with the structures directly can obscure the mathematics you are trying to discover, e.g. dealing with the full Euclidean group when the dihedral group is all the problem requires.


If you want, it's even possible to fully commit to the isomorphism point of view by saying that you'll represent the structures themselves by their identity isomorphisms, but then if you "delete" the little tag telling you which particular endo was the identity (would a physicist say "up to phase"?), you might discover other interesting things...


There are two kinds of mathematicians, or one can say, two cultures of mathematics: (a) problem solving, proving conjectures, etc.; (b) building theories. Alexander Grothendieck belongs to (b), Paul Erdős to (a).

You can read Gowers's paper "The Two Cultures of Mathematics" at: https://www.dpmms.cam.ac.uk/~wtg10/2cultures.pdf


Mathematical structuralism was developed to explain the ontology of mathematical objects, and is often contrasted with Platonism, which is the position that numbers are real things like you or me.


A funny insight about Platonism (if it's not funny, treat this as a bad joke)

I think, aka "∅"; therefore I know I thought, aka "{∅}"; therefore I know I knew I had thought, aka "{{∅}}"

and ...

boom! The entire system of mathematics is imported. (BTW, the limitations studied in computability theory are also introduced.)
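
(A minimal Python sketch of the construction being gestured at, using frozensets for sets: 0 = ∅ and succ(n) = {n}, i.e. the Zermelo-style numerals ∅, {∅}, {{∅}}, ...:)

  # Zermelo-style numerals: 0 is the empty set, succ wraps the previous
  # number in a singleton, reproducing the chain ∅, {∅}, {{∅}}, ...
  zero = frozenset()

  def succ(n):
      return frozenset({n})

  one, two = succ(zero), succ(succ(zero))
  assert one == frozenset({zero})     # {∅}
  assert two == frozenset({one})      # {{∅}}
  assert zero != one != two           # each level is a genuinely new set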

So maybe we are actually mathematical beings on a manifold named "the real world". To me this is more concise and profound than "philosophy". As we are real, so are all mathematical objects.


The working mathematician is a Platonist on weekdays, a formalist on Sundays.

Reuben Hersh


I'm not real. I'm not sure about you.


Found the co-solipsist.


The problem is: do numbers exist? One group says they don't. Another says they do exist, in the Platonic world, but this Platonic world is accessible from the world we inhabit. Next time, look at the folks who look for Platonic love :)


Lawvere theories should be fine as well:

"The rough idea is to define an algebraic theory as a category with finite products and possessing a “generic algebra” (e.g., a generic group), and then define a model of that theory (e.g., a group) as a product-preserving functor out of that category."

https://ncatlab.org/nlab/show/Lawvere+theory#the_theory_of_g...



So for example, do programs 1 and 2 compute the same thing? In general this is quite difficult, indeed undecidable. But then, if so, is the complexity of 1 less than that of 2? What is the minimum complexity of any program that computes the thing? These are relevant problems we want to solve. They involve considering isomorphism, orderings, limits, etc. - structuralism - but in many cases the maps are not there to support it, or there is no way to find them. Consider the relationship between two complexity classes, or even two binaries.
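
(A small Python illustration of the first question, with hypothetical function names: two syntactically different programs for the same function. The spot check below proves nothing beyond the tested range, which is the point — extensional equality of programs has no general decision procedure:)

  # Two different programs for the same function: a loop and a closed form.
  def sum_to_n_loop(n):
      total = 0
      for i in range(n + 1):
          total += i
      return total

  def sum_to_n_formula(n):
      return n * (n + 1) // 2          # Gauss's formula

  # Checking equality on finitely many inputs is easy; deciding extensional
  # equality of arbitrary programs is not.
  assert all(sum_to_n_loop(n) == sum_to_n_formula(n) for n in range(1000))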

At an elementary level, what we deal with are representations. In CS this problem is severe: for example, almost all Boolean functions have exponential circuit complexity, but we cannot offer up even a single explicit example. It's an open question how hard graph isomorphism even is.

I don't mean to knock the structuralist insight, it is a powerful "imperative" as the article says. There is just incompleteness everywhere, in less mature fields especially, where they work in spaces that can barely be classified as of yet. The knowledge is generated there, and then organized within a higher scheme. Topology sprang partly out of Euler's attack on the Königsberg bridge problem (I suppose structuralism overall does too). It is a revealing perspective, but it seems the insight comes after the large amount of work, not usually before.


"Categoricity is central to structuralism because it shows that the essence of our familiar mathematical domains, including ℕ, ℤ, ℚ, ℝ, ℂ, and so on, are determined by structural features that we can identify and express."

I do not buy this. I feel it is the other way around. Structural features are essential. Categoricity may be nice to have, but why should I care so much about it?


I think you are onto something. Study mathematical objects and relations among these objects, without bringing in categoricity. That's what structuralism in the philosophy of science is: "In the philosophy of science, structuralism (also known as scientific structuralism or as the structuralistic theory-concept) asserts that all aspects of reality are best understood in terms of empirical scientific constructs of entities and their relations, rather than in terms of concrete entities in themselves" [1]

[1] https://en.wikipedia.org/wiki/Structuralism_(philosophy_of_s...


The fact that Peano Arithmetic isn't categorical, and that if you want to nail it down you need either bigger theories (within which you can of course nail down a single model) or to move to theories which don't have proof theories, like second-order arithmetic, suggests all might not be as easy as it looks here.

Being an ex-proof theorist, I'm a bit dubious of not having one.


What does it mean that Peano Arithmetic isn't categorical?


It has (lots of) non-standard models. Or, to put it another way, there are statements that can neither be proved nor disproved. This extends to any theory containing "enough" arithmetic and certainly any containing Peano Arithmetic. So ZF set theory is also incomplete in that sense.

You can "solve" this in Second Order logic, because you have a more powerful induction axiom, but how exactly you define that logic is tricky. There's no proof system that defines it completely, so you have to do this via a semantics that relies on knowing which models are or are not OK.

I don't think it solves the problem that you can't define all the "truths" (as most logicians would put it) of Peano Arithmetic.


Pure terminology. In some areas "Peano Arithmetic" is short for "first-order Peano arithmetic". In first-order logic, Peano's induction axiom can't be properly formalized, which means the resulting theory isn't categorical. For other people "Peano arithmetic" just means the usual Peano axioms in natural language, which include the full induction axiom; that can only be formalized using at least second-order logic. That theory is categorical.


The quote says why you should care about it. Without categoricity the axioms of a theory don't define a specific structure.


So what? We can still only prove those theorems that also hold in all of those other structures that we don't mean, but sweep up anyway.


It's not clear what "the structure we mean" means without a way of definitely identifying it in our mind. Categorical axioms just make this explicit.


Yes, that's true. I think it is good to have a logic with a semantics which allows you to clearly say what the "standard models" are. For those models categoricity should be achievable. But at the same time it is also clear that there will always be other models, for which categoricity doesn't hold.


Categoricity isn't a property of a model, it's a property of a set of axioms which only have "one model up to isomorphism", or one "structure" in the sense of structuralism.


As commonly defined, you are right, but if you have the ability to select among your models, then you can use that to sharpen your notion of categoricity by considering only a certain class of models instead of all models.

You can say, for example, categorical with respect to a certain cardinality of the model; that is, you are only considering models of that cardinality.

And just like that, if your semantics allows it, you can also say categorical with respect to standard models. But for that you of course need a clear definition of what "standard" means. First-order logic doesn't provide such a clear definition.


Categories are the main and most developed tool built around relations, which are the building blocks of structure.


Is there any attempt to organize mathematics kind of like the OSI model? At the bottom we might start with the physical layer, which could be paper with pencil marks on it, or a sound wave; the next could be a shape; next, what the shape stands for; and so on. Further up we could find things like isomorphisms.


Yes, typically the base layer is set theory (with ZFC being the "standard" formulation).

https://en.wikipedia.org/wiki/Foundations_of_mathematics


There are other foundations for mathematics. Dependent Type Theory has a lot of support from people studying Proof Theory. It is based on typed lambda calculus. It is currently being researched and implemented in mathematical provers, like Coq and Lean.


I understand what you mean, but in my context I would say that you need layers below this. You need, for example, paper, ink, glyphs, and other things before you can start defining ZFC.


You don't need paper or ink to start working with math. You don't need symbols. You can just do it in your head. That's the beauty of math. The physical instantiation of it doesn't matter.


True. But the physical instantiation is necessary. That is what I am exploring: What is necessary to start working with math, and can it be described like the OSI model?


Discreteness, finiteness, causality.


If you want to see an organization of mathematical ideas, I'd recommend digging into https://us.metamath.org/. Great intro to formal systems, though many more layers of definitions than the OSI model.

Though there may be canonical structures, any universal structure is elusive if not non-existent.


Or start with the morphisms, then look for the idempotents, so further up the structures fall out naturally?

(this program may have an advantage in that it motivates passing from the continuous to the discrete?)


In a sense computer and electrical engineers/scientists did largely map out the base layers, over ~150 years from the 19th century through the mid 20th. I think equipment (broadly speaking) is foundational for mathematics. The "stack", as far as I interpret it, is something like

1. Being

2. Communication

3. Equipment -- a device you can put marks on and read off of

4. Discipline -- ability to reliably and skillfully manipulate the device

5. Submission

The stage is set at this point for some "elementary mathematics" -- think back to elementary school.

6. Symbolism -- the equipment is not just equipment. Mathematical relevance springs here.

7. Geometry -- from vision we see area and edges, objects of perception; they are interpreted as mathematically relevant and hence symbolized.

8. Algebra -- manipulating our equipment with discipline, an equivalence is perceived between different sequences of operations.

9. Proposition -- conviction the relevant facts of geometry and algebra can be formulated clearly in declarations of the sort "if ... then ..., and ... (... and etc)".

Higher level mathematics

10. Refinement

11. Proof

12. Application

13. Theory

14. Computer science, engineering, and design

...


The computer and computer science are, in a sense, the result of the somewhat failed attempt to put mathematics on solid logical grounds, in a way that resembles a stack like OSI. Gödel's incompleteness theorems and Turing's undecidability results showed that it's not that easy. Notably, the Turing machine itself was introduced in the course of proving the undecidability of the halting problem.


From his comments:

> In my view, there is a way of viewing structuralism as undermining the success of the indispensibility argument, because no particular mathematical structure is ever indispensible, since it can be interpreted via alternative structure

I can see that, but on the other hand that alternative structure is equivalent up to isomorphism so you haven't really eliminated the structure. At best, you've probably shown that alternate theories that explain the same observations necessarily exist because our knowledge of reality's structure is incomplete and so we can't eliminate incorrect theories by the extra structure beyond the isomorphism. It does not at all undermine the idea that reality itself has some kind of structure.


Could you please briefly explain the indispensability argument? (this being the first time I've run across the term; although if it's some philosophical thing I'm afraid I'm only interested in it as far as I'd be in the theological creation of the universe*)

My view of the relation between maths and physics is set out in https://news.ycombinator.com/item?id=39220159 , and because we go from physics to maths (f) and then back from maths to physics (f^-1), we can conjugate with any g,g^-1 pair, meaning that on the maths side we might as well mod out by isomorphics (only care about the partial order of equivalence classes of the preorder of structures)

(Similarly, we should find that we only care about the equivalence class [singular] of isomorphic realities [plural]?)

---

* this is not to say that I'm not interested; see https://news.ycombinator.com/item?id=39886966

Assuming a God who creates Creations, do we wind up with a best possible (principal) Creation? all possible (perhaps trivially so) Creations? if there is a set of Creations created, are they directed? Theologically, what happens if we have an infinite number of finite Creations, each an appropriate approximation, and we pass to their supremum?


> Could you please briefly explain the indispensability argument?

An extremely rough caricature:

- Scientific realists claim that we should believe the entities that are indispensable to our best scientific theories (electrons, for instance) exist.

- Mathematical entities are indispensable to our best scientific theories

- So scientific realists should be realists about mathematical entities.


Thanks. I guess I'm an instrumentalist then (due to having a strong pragmatist streak?) because I'm unconvinced that "existence" in this caricature even denotes.


Is there any mathematics field that views mathematical objects as patterns of multiple numbers and considers order as a fundamental property? E.g., 4 isn't just "4" but is either 1,1,1,1 or 3,1 or 1,3 or 2(a), 2(b) or 2(b), 2(a)? These all represent real-life abstractions and I feel a lot of detail is lost when we only think about the structure of an object as its total count, and not its composition/order.


In some sense, basically every field of math makes this kind of distinction. e.g. it's acknowledged that for a vector space V, the tensor products V⊗(V⊗V) and (V⊗V)⊗V are not equal, but they are equivalent in the sense of e.g. a linear isomorphism, or maybe in the sense of an algebra isomorphism or something, and mathematicians do have machinery to track in what sense they consider two non-equal things to be equivalent, and they take the time to prove that operations they want to define on equivalent but non-equal things give equivalent results. Category theory is partly concerned with this kind of machinery.

So they do care about this, but then invented ideas like equivalence relations and quotients so that they don't have to constantly care about it.

With programming languages or a proof assistant like Lean, by default non-equal things are not the same, and you need to do some bookkeeping to carry around proofs that things are equivalent in the way you need them to be. The tricky thing is actually making it ergonomic to be able to think of (1+1)+1 and 1+(1+1) as "the same" without constantly needing to think about it. It's hard to do math in most programming languages partly because they lack the machinery necessary to have contextual notions of equivalence (but also because they lack dependent types).
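
(A toy Python analogue of that bookkeeping, using nested pairs to stand in for the two bracketings; not real tensor products, just the shape of the problem:)

  u, v, w = 1, 2, 3

  left  = ((u, v), w)       # stands in for (V ⊗ V) ⊗ V
  right = (u, (v, w))       # stands in for V ⊗ (V ⊗ V)
  assert left != right      # not equal as raw data...

  def reassociate(t):
      """The canonical map ((a, b), c) -> (a, (b, c))."""
      (a, b), c = t
      return (a, (b, c))

  assert reassociate(left) == right   # ...but equivalent via an explicit map,
                                      # which is exactly what must be carried around.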

In some fields, that extra bookkeeping is really important. E.g. in geometry you might be able to choose coordinate systems at each point of some space, but perhaps not in a way that is globally defined and consistent with the types of equivalence that you need, so you need to track everything through local coordinate changes as you move around.

Or a simpler example is a vector space and its dual: they're isomorphic, so in some sense equivalent, but not canonically so, so you need to track the equivalence you're using, and maps on one are "mirrored" when applied to the other. More generally, any finite dimensional vector space is characterized (in the sense of equivalence up to linear isomorphism) by a single number (its dimension), so in some sense it "is" that number, but we also think of them as each being different, depending on context. So f.d. vector spaces can in some sense be thought of as "numbers that remember which vector space they are".


Numbers have properties beyond their order, right, e.g. factorization.

Maybe the field you are looking for is Number theory? Or maybe algebra and group theory?

But I don't quite understand your sentence:

> [...] I feel a lot of detail is lost when we only think about the structure of an object as its total count, and not its composition/order.

Maths is concerned a lot with how numbers can be constructed or composed from other numbers, or other mathematical structures.

Still, there is a difference between the abstract notion of 4 and objects where we assign some measurable quantity (e.g. counting similar objects, let's say turtles, and saying in total it's "4 turtles").

Numbers are not concerned with counting alone, but that's the easiest way to construct numbers.


What I mean is that there are multiple structures that are equivalently "4": 1,3 or 2,2, etc. 1 AND 3 depicts two distinct objects (or one object split into 2 parts) and is fundamentally a different idea from the structure represented by 4. However, in all the math I've seen, these are simply reduced and we just care about the end result.

I am saying this equivalence isn't a fundamental property, but one merely useful toward a purpose. I am interested in mathematics where one can reason about and do operations on ordered sets of discrete numbers or booleans.

I suppose matrices, Boolean algebra, or category theory are the closest to what I am thinking of, but I need to learn more about that.

I am aware of vector spaces, but keep in mind that by 1,1 or 1,3 I am not talking about points in a multidimensional coordinate space.


So you are thinking of the set of all integer pairs whose sum is equal to an integer n?

in other words

  a + b = n
?

The size and structure of this set (or these sets, or more precisely, the function A(n) that returns all such 2-element sets of integers for a given n) is trivial and well understood.

Even more precisely, all naturals, but I always get these terms mixed up in English. Including negative numbers just leads to infinitely many boring solutions of course.

I feel I may be missing something in your line of thought.

What about factorization? Do you find that interesting?

Or maybe the set of all sets of size 1..n whose members are integers and sum up to some other integer n?

These are all trivial examples of course.

But I feel you'll find plenty of material to study in maths if you get to be able to articulate the properties of numbers that interest you.

> What I mean is that there are multiple structures that are equivalently "4": 1,3 or 2,2, etc. 1 AND 3 depicts two distinct objects (or one object split into 2 parts) and is fundamentally a different idea from the structure represented by 4. However, in all the math I've seen, these are simply reduced and we just care about the end result.

The first sentence of my answer describes the possible ways to construct integers by adding pairs of integers as you describe. I'm talking about sets here (instead of vectors) because addition is commutative and the ordering is irrelevant.
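
(A small Python sketch of the two readings we may be talking past each other on: ordered sums, where 1,3 and 3,1 differ, versus unordered ones, where they don't; in standard combinatorial terminology these are compositions and partitions:)

  # Ordered "compositions" of n: 1,3 and 3,1 count separately.
  def compositions(n):
      if n == 0:
          yield ()
          return
      for first in range(1, n + 1):
          for rest in compositions(n - first):
              yield (first,) + rest

  # Unordered "partitions" of n: parts listed in non-increasing order.
  def partitions(n, max_part=None):
      if max_part is None:
          max_part = n
      if n == 0:
          yield ()
          return
      for first in range(min(n, max_part), 0, -1):
          for rest in partitions(n - first, first):
              yield (first,) + rest

  print(sorted(compositions(4)))  # 8 ordered ways; (1, 3) and (3, 1) both appear
  print(sorted(partitions(4)))    # 5 unordered ways; only (3, 1)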

It might be you are interested more in ontology than in maths?

But I'm not very proficient in maths.

Very complex structures can emerge from simple structures.

Consider Fermat's Last Theorem, which also deals with sums of integers (although the powers make this not as trivial as what I understood from your examples).


This is something I've been thinking about on my own for a while, so I don't have the common mathematical or philosophical language to communicate it properly, but I appreciate your thoughts.

Effectively, I am interested in a mathematical representation of any conceivable structure in space-time, with operations that allow for transformations of those structures in space-time. Imagine for example a screen of a 1D grid of binary objects, where W is white/on and B is black/off.

I can have WWWB, which I can think of (by mentally breaking it down) as WWW,B or as WW,WB, or W,WWB, etc. I can also think of a structure like X,X where X is any other structure (effectively, encapsulation, and recursion). I can do operations like put something as X or remove something as X. I can take X,X,X and combine it with WWW to get WWW,WWW,WWW. Moreover, I am interested in applying these operations through time, so for example I can do X,Y -> X,X,Y and apply this abstract transformation to WWW,BBB to get WWW,WWW,BBB. So first and foremost I am interested in a data structure that can handle this information (having spatial relations, where any single node can be the sum of any other node, like a matrix where each point can be a matrix, or better yet a 2D linked list, where each node can be a 2D linked list).
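
(A minimal Python sketch of how I picture such a rule, with a hypothetical apply_rule helper: the left side names the slots, the right side rebuilds from them:)

  def apply_rule(lhs, rhs, blocks):
      """lhs/rhs are tuples of slot names; blocks is a list of structures."""
      if len(blocks) != len(lhs):
          raise ValueError("pattern does not match")
      bindings = dict(zip(lhs, blocks))          # e.g. X -> 'WWW', Y -> 'BBB'
      return [bindings[name] for name in rhs]

  print(apply_rule(('X', 'Y'), ('X', 'X', 'Y'), ['WWW', 'BBB']))
  # ['WWW', 'WWW', 'BBB']

  print(apply_rule(('X', 'Y'), ('Y', 'X'), ['square1', 'square2']))  # the swap rule
  # ['square2', 'square1']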

This is my interest in thinking about the composition of numbers. I am more interested in math that focuses on properties of the compositions, their inverses, operations defined on them, their relation to the whole number, etc. The number 4 in my line of thinking is simply XXXX, but this is far too limiting for what I want to do.

It's a little more interesting with 2D spaces, because my structure becomes something like X(right)X(right)X and I can apply a X(UP)Y to get a structure like:

YYY

XXX

I've been playing around with this idea and I can take some simple patterns and create much more complex 2D images. I can create two squares for example, one to the right of the other, and apply a pattern like XY->YX to switch them.

What's interesting also is that the square in my system is made of simpler constructions that effectively define the idea of a square (X1(up)X2(right)X3(down)X4(left)X1, which can be broken down to its simpler components of being X1(up)X2, X1(down)X2, etc., which gets us very close to the human-level notion of a square being "point going up x amount then right x amount then..." or "two sets of (two points in direction A with a corresponding two points in the direction A(inverse))"). I want to define a single formal system by which, with simple rules, I can manipulate structures in a manner analogous to human thought.

I find it really interesting but I am kind of stuck on where to go with this.


Aren't you basically talking about lisp? You're suggesting two core primitives, an atom and a pair constructor, and you're defining an equivalence across pairs, e.g. WWWB modelled as (cons W (cons W (cons W B))) = (cons (cons W W) (cons W B)) = (cons (cons (cons W W) W) B).

I assume you can do this with set theory and a suitable definition of equality.
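
(A quick Python stand-in for that equivalence, reading tuples as cons pairs: two groupings count as "the same" iff they flatten to the same atom sequence:)

  def flatten(x):
      """Left-to-right sequence of atoms in a nested pair structure."""
      return sum((flatten(p) for p in x), ()) if isinstance(x, tuple) else (x,)

  a = ('W', ('W', ('W', 'B')))      # (cons W (cons W (cons W B)))
  b = (('W', 'W'), ('W', 'B'))      # (cons (cons W W) (cons W B))
  assert a != b and flatten(a) == flatten(b)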


I would never ask this in a trolling way, but I'm a bit floored (ha), so I'm asking it seriously:

> Effectively, I am interested in a mathematical representation of any conceivable structure in space-time, with operations that allow for transformations of those structures in space-time.

Are you serious about that?

If so, this is not really about maths. It's the classical "theory of everything" question in physics.

But you can't say anything meaningful about space and time using only maths. It makes no sense.


Haha yeah I am very serious. I spent a ridiculous amount of time the last 3-4 years thinking about this. It might all be junk but I find it very interesting.

>But you can't say anything meaningful about space and time using only maths. It makes no sense.

Why do you think that? I am talking about entirely abstract mental conceptions. I am not saying anything about how real physical processes work. There are several mathematical formulations of space-time, so I am confused as to what your bewilderment is.

Also keep in mind this is not too different from the Turing machine, which works on a row (or a multidimensional grid) of squares, with operations that allow for manipulating any one starting configuration into another.


> > But you can't say anything meaningful about space and time using only maths. It makes no sense.

> Why do you think that? I am talking about entirely abstract mental conceptions. I am not saying anything about how real physical processes work.

When considering if something is "real", maths and physics are on the one hand deeply related (physics describes the world using maths) and on the other hand fundamentally different.

Maths builds systems of logical deduction based on self-evident axioms. "Pure logic", ideally, if you will. The point is whether the deduction is sound and how far logic can take you regarding abstract structure.

The simplest example is indeed the natural numbers, defined by the unit 1 and the rule that every natural number has a successor.

Physics, like other sciences, uses maths as a tool, for example statistics.

Physics is special in a way, because it applies stricter rules to what counts as a sound and falsifiable theory.

But the basics are the same in every scientific discipline excluding Maths and/or Philosophy: a theory is only worth considering if it allows itself to be proven wrong. And the benefits of a theory or a theoretical model of physical reality lie in "predictions". A theory that does not make any prediction is not science, and it's not really a theory either. It's non-falsifiable; at best it's called metaphysics or religion.


Hmm, I am not really sure how any of that relates. I am not proposing a model that explains all of physical reality, just one that can formally represent mental observations. Different mathematics aim to do this in different ways.


Well, there's the fundamental theorem of arithmetic: every natural number greater than 1 has exactly one factorization into prime factors (up to ordering). https://wikipedia.org/wiki/Fundamental_theorem_of_arithmetic
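
(A minimal trial-division sketch in Python; the theorem is what guarantees that, ordering aside, no other list of primes multiplies out to the same n:)

  def prime_factors(n):
      """Prime factorization of n > 1 by trial division, in non-decreasing order."""
      factors, d = [], 2
      while d * d <= n:
          while n % d == 0:
              factors.append(d)
              n //= d
          d += 1
      if n > 1:
          factors.append(n)
      return factors

  assert prime_factors(360) == [2, 2, 2, 3, 3, 5]   # 360 = 2^3 * 3^2 * 5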


That’s how integers are defined in set theory.


OK, the confusing thing is that Dedekind proved that any two second-order models of arithmetic are equivalent/unique, but Gödel essentially proved that there are infinitely many first-order models of arithmetic. But it seems logical that a second-order model of arithmetic would contain a first-order model, and that you couldn't say "which" model it contained.

I probably phrased that wrong but I think the question is clear


The distinction between "first order" and "second order" is in the first instance a distinction at the level of formal languages. A second order language has more complicated syntax and semantics: it allows variables in predicate position, which (in the standard semantics) take values from the entire powerset of the underlying domain.

This makes second order languages, including the language of arithmetic, much more expressive: they can distinguish models that first order languages can't. Those infinitely many non-isomorphic models of arithmetic expressed in a first order language can be distinguished, and excluded, as models of arithmetic expressed in a second order language. That's why second order arithmetic is categorical: all of its models are isomorphic.

Yes, a model of second order arithmetic contains a model of first order arithmetic, but within the second order language, you can say "which model it is" (up to isomorphism). It's only if you restrict yourself to a first order language that you can no longer say anything which will be true in that model, but false in any non-isomorphic one.


I think we need a philosophy of the philosophy of mathematics to build an understanding around this...


see Pierre Bordintello, "La Démonstration" (1984):

> Mathematics entails, and it entails the entailer.


I mean you cannot quote him without also quoting Robert Feuerstern's 1985 essay "Logique de la Tangence" in which he famously wrote :

"Pierre Bordintello is a fucking moron."


Luckily the indefinite article "a" means "a fucking moron" is unambiguously a denoting phrase*.

For less fortunate wordings, allow me to present part of the transcript of Mouse v. Mouse (122 Mag.Kin.3d 871)

— Plaintiff, Ms Mouse, asks the Court to grant her a divorce on the basis of the insanity of her ... yes, Ms Owl?

— Begging your pardon, Your Honor, but the Court may have misapprehended the nature of this case. My client, Minnie, was very clear in her sworn statement. If it please the Court, please refer to that document on ... page 3, line 20. When asked what was the nature of the impediment to continuation of their marriage, she said of her husband, Mickey, not "he's insane", but —and I quote– "he's fucking Goofy".

* https://www.uvm.edu/%7Elderosse/courses/lang/Russell(1905).p...


Urgh another rabbit hole. I should thank you but I'm not sure I will :)


Unfortunately this rabbit hole's reality is in the equivalence class of the holes belonging to Hazel and Fiver.


Haven't read that for years!


What does Watership down have to do with this?


For a Platonist, "La Démonstration", like El-ahrairah, exists. (I prefer Borges style to AMS style citations)


Sorry, but if there's one thing the last forty years taught us, it's the fact that Bordintillism is a collective failure.


https://ncatlab.org/nlab/show/structuralism

>In the humanities

>In the 20th century, structuralism in the humanities is associated with Emile Durkheim and Georg Simmel in sociology, Ferdinand de Saussure (and later Roman Jakobson) in linguistics, and Claude Lévi-Strauss in anthropology.

Ferdinand de Saussure, Écrits de linguistique générale:

>The notion of identity will be, in all these orders, the necessary basis, the one that serves as an absolute basis: it is only through it and in relation to it that we can then determine the entities of each order, the primary terms that the linguist can legitimately believe to have before them.

>(Vocal Order) Flow of ideas: Everything that is declared identical by form, in opposition to what is not identical, is a finite term, which is not yet defined and can be arbitrary but represents for the first time a knowable object, while the observation of specific vocal facts outside the consideration of identity represents no object. A certain vocal being is thus constituted and recognized in the name of an identity that we establish, and then thousands of others are obtained using the same principle, we can begin to classify these identity patterns of all sorts that we take, and are obliged to take, for the primary and specific and concrete facts, although they are each in their infinite diversity only the result of a vast prior operation of generalization.

>Couldn't we limit ourselves to implying this great fundamental operation? Isn't it obvious from the outset that as soon as we talk about a group, for example, we mean the generality of cases where a group exists, so there is little subtle interest in recalling that this entity is fundamentally and primarily based on an identity?

>We will immediately see that it is not allowed to substitute abstract entities for the fact of the identity of certain concrete facts with impunity because we will deal with other abstract entities, and the only pole in the middle of this will be identity or non-identity.


The basic problem with computation is that identity among the inputs is not necessarily identity among the outputs, which explains why we often bring in bigger guns than Boolean logic.



