Science Forums

Algorithms beyond programming



It was fairly clear to me that the problem was a "counting issue." Counting issues often lead to claims like the one credited to Disraeli: "Lies, damned lies, and statistics." For instance, the literature on the number of neurons in our brains has reported figures ranging from 500 million up to 500 billion. That broad range was due to counting issues.

 

In counting, step 1 is deciding what to count; step 2 is deciding how to count.

 

Maddog points well to this issue by describing how counting steps does not reflect cost. One solution is to count each step in a weighted manner. Another is to consider all solutions with a constant ratio to each other as equal. By this I mean a solution that takes 3n steps is the same as a solution involving 10n steps. But a solution requiring log(n) steps is not equivalent.
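As a rough illustrative sketch (the step counts below are hypothetical models, not measurements), the constant-ratio idea can be put in code:

```python
import math

def count_steps_linear(n, per_item_steps):
    """Model an algorithm that does a fixed number of steps per input item."""
    return per_item_steps * n

def count_steps_logarithmic(n):
    """Model an algorithm that halves the problem each step (e.g., binary search)."""
    return max(1, math.ceil(math.log2(n)))

n = 1_000_000
# 3n vs 10n: the ratio is the constant 10/3 no matter how big n gets,
# so under the constant-ratio convention they count as "the same".
print(count_steps_linear(n, 10) / count_steps_linear(n, 3))
# log2(n) steps is a genuinely different (smaller) order of growth.
print(count_steps_logarithmic(n))  # -> 20
```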



The word transducer has a meaning which is not archaic. It is a term describing some of the mathematical machines that can be used to describe computations.

Or, in the immortal words of Richard O’Brien’s Planet, Schmanet, Janet:

Tha’ transducer

Will seduce ‘ya

It’s something you’ll get used to

Language is mutable. Here, technically, it should have been “transmuter”, but that wouldn’t rhyme with “seduce ya”.

 

Even in computer science (which these lyrics are unambiguously not), one shouldn’t discount the importance of ring and rhyme.


Let’s define algorithms as a sub-collection of the collection of all programs: programs that usually end, and produce useful data.

Cool, but I'll still argue this is insufficient, since it still allows for my stop-light "algorithm" posted earlier:

  • It's indubitably a program based on endless extant implementations.
  • It produces "useful data" insofar as it prevents crashes between moving hunks of steel
  • It stops occasionally when the power goes out, or the highway patrol is handling a special event, or it's taken out for service, but even then, the restriction is limited to "usually"...

 

So what are we gonna do about that "usually"? Is it a requirement or not as we define our "consensus terms?"

 

I have opinions of my own -- strong opinions -- but I don't always agree with them, :phones:

Buffy


Buffy, I would argue no, because your point 3 makes the finite nature of the program dependent on events that are not part of the program. I would also propose that your point 2 is weak, since the usefulness is not part of the output of the program, but rather a secondary consequence of it. The actual output of the program is signaling in the form of lights of different colors.

 

Your program would be similar to generating 30 Rs, then 4 Ys, then 30 Gs, and then repeating this process endlessly. A Turing machine could be made to perform this action (although not behaving as a transducer). The output of the Turing machine never ends. It isn't an algorithm. It is a program.

 

The generation of one cycle (30 Rs, 4 Ys, and 30 Gs) can be an algorithm.
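The distinction can be sketched in Python (the R/Y/G counts come from the post above; `cycle` and `islice` just let us sample the endless program without running forever):

```python
from itertools import cycle, islice

def one_cycle():
    """One cycle -- 30 Rs, 4 Ys, 30 Gs -- terminates, so generating it
    fits the 'algorithm' definition."""
    return "R" * 30 + "Y" * 4 + "G" * 30

def stop_light():
    """The endless program: repeat the cycle forever. Its output never
    ends, so by the thread's definition it is a program, not an algorithm."""
    return cycle(one_cycle())

print(len(one_cycle()))                     # 64 symbols, then it stops
sample = "".join(islice(stop_light(), 70))  # sampling the endless output safely
```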

 

I mean, the main reason for these definitions is to be able to evaluate a computation. Is it correct? Is it as fast as possible? Is there a way that uses less memory?


By your definition then, 90-odd percent of all programs in existence are not algorithms.

 

That's fine, but riddle me this: what is the use of such an exclusive definition?

 

In other words, "what problem are you trying to solve?"

 

An undefined problem has an infinite number of solutions, :phones:

Buffy


Perhaps my advice has been carried all too far! The essential point was to exercise a smidgen of elasticity, without everyone clinging to their own convention and maybe not even stating which it is. ;) So, it seems discussion can now proceed. :phones:

 

By your definition then, 90-odd percent of all programs in existence are not algorithms.
Here's another reason why rather few programs are algorithms (perhaps far fewer than even your complaint suggests):

 

We've been neglecting the further fact that an algorithm is understood as acting on an initial input; IOW, it's all there right from the start. Your program can use argv and argc and still be an algorithm. It can even be designed to read from a file, which must already be sitting there waiting. But make it interactive and it isn't really an algorithm!

 

Nitpicking: you'll say that one can write a program that reads stdin but then run it with < infile.txt on the command line, or put it in a pipeline! Now, I don't think we should be talking so much about coding issues (except illustratively) and the like, hence I also disagree that the language dependence of the number of steps is a problem; a "step" can even be a whole sub-algorithm! It remains that a program which is basically conceived to just sit there till the next user input, HTTP request or RFID event, whatever the heck, can scarcely be called an algorithm. It may, however, be executing algorithms for each of those; but call it a GUI, call it a service, call it a 'bot or call it a spade.
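A minimal sketch of that contrast (the function names here are invented for illustration): a batch filter whose whole input is present up front, versus a loop conceived to sit waiting for the next event:

```python
def batch_upper(lines):
    """All input is present from the start; on finite input this always
    terminates, so it can qualify as an algorithm."""
    return [line.upper() for line in lines]

def service_loop(get_event):
    """Conceived to wait for the next event indefinitely. It runs an
    algorithm per event, but as a whole it only stops on an external
    shutdown signal -- call it a service, not an algorithm."""
    handled = []
    while True:
        event = get_event()
        if event is None:   # shutdown comes from outside the 'problem'
            return handled
        handled.extend(batch_upper([event]))
```

The point of the sketch: `batch_upper` terminates by construction, while `service_loop`'s termination depends on something outside the program, which is exactly Buffy's stop-light objection.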

 

What's that ol' Wirth said? Algorithms + Data Structures = Programs...


By your definition then, 90-odd percent of all programs in existence are not algorithms.

 

That's fine, but riddle me this: what is the use of such an exclusive definition?

 

In other words, "what problem are you trying to solve?"

 

At the end of the post before yours I stated:

 

I mean, the main reason for these definitions is to be able to evaluate a computation. Is it correct? Is it as fast as possible? Is there a way that uses less memory?

 

I gave three reasons for separating out algorithms from programs:

1. correctness

2. speed (measure of time cost)

3. resource usage (memory for example)
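A toy illustration of those three criteria, comparing two ways of summing 1..n (this example is mine, not from the thread):

```python
def sum_loop(n):
    """Correct, but takes about n additions (and constant extra memory)."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    """Gauss's closed form: the same answer in a constant number of steps."""
    return n * (n + 1) // 2

# 1. correctness: both return the same value
# 2. speed: the formula's step count does not grow with n
# 3. resource usage: both need only constant extra memory
print(sum_loop(1000), sum_formula(1000))  # 500500 500500
```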

 

It has been shown that finite state transducers cannot compute everything that a Turing machine can. A pushdown transducer (a transducer based on a PDA) is stronger than a finite transducer but still weaker than a Turing machine; it takes two pushdown stacks to reach Turing-machine power. The definitions restrict the possibilities, but create a series of analyzable mathematical machines.

 

Qfwfq, I don't think you are nitpicking, but rather pointing out how algorithms are used to construct larger entities. I like the way you constructed real-life examples of how costs vary across steps, and of how steps can be nontrivial.


A transducer is a simple device that takes an input and converts it to a different form at the output. So a windmill is a transducer, converting wind into the pumping of water. A better example is an electric motor converting electricity into motion.

I have an even better example: we human beings, members of the animal kingdom, can be transducers, as we consume foodstuffs and produce excrement. It is just a different form.

In CS a transducer converts an input into a different form. The mathematical representation is a "recognizer which emits an output string during each move made. The finite transducer is obtained by taking a finite automaton and permitting the machine to emit a string of output symbols on each move."

This quote is from p. 223 of Aho and Ullman.
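The quoted definition can be sketched directly, assuming a transition table of the form (state, input symbol) -> (next state, output string); the duplicate-squeezing machine below is an invented toy example, not from Aho and Ullman:

```python
def finite_transducer(transitions, start, text):
    """A finite automaton that emits an output string on every move,
    per the Aho-Ullman description quoted above."""
    state, out = start, []
    for symbol in text:
        state, emitted = transitions[(state, symbol)]
        out.append(emitted)
    return "".join(out)

# Toy machine over {a, b}: squeeze each run of 'a's down to a single 'a'.
T = {
    ("q0", "a"): ("qa", "a"),   # first 'a' of a run: emit it
    ("qa", "a"): ("qa", ""),    # later 'a's in the run: emit nothing
    ("q0", "b"): ("q0", "b"),
    ("qa", "b"): ("q0", "b"),
}
print(finite_transducer(T, "q0", "aaabaab"))  # -> abab
```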

You missed my point. It may be, imho, that in 200 years or so the use of the word transducer as you describe will be found to be totally useless. In that vein, I will start now by NEVER using it in that way. Excuse me. I will find better words, like "parse", which I find works satisfactorily, as it has for the last 30 or so years. :doh:

The word transducer has a meaning which is not archaic. It is a term describing some of the mathematical machines that can be used to describe computations.

I never said archaic, unless you consider the '50s "archaic" -- why, that's the time period of the origin of computer languages like FORTRAN, et al. :D

 

maddog


Cool, but I'll still argue this is insufficient, since it still allows for my stop-light "algorithm" posted earlier:

  • It's indubitably a program based on endless extant implementations.
  • It produces "useful data" insofar as it prevents crashes between moving hunks of steel
  • It stops occasionally when the power goes out, or the highway patrol is handling a special event, or it's taken out for service, but even then, the restriction is limited to "usually"...

 

So what are we gonna do about that "usually"? Is it a requirement or not as we define our "consensus terms?"

I bow to your greater wisdom. The Stop Light is an excellent example. Yet even the algorithm within (it does have one) is quite simple.

 

(For each side, proportioned so as not to have accidents)

1. Go = Green.

2. Caution = Yellow.

3. Stop = Red.

4. Return to step 1.

 

The "rules" of when to go to the next state can be tuned and are "usually" a timing sequence.
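Those steps can be sketched as a tiny state machine; the durations below are illustrative "tuning" values of the kind described, not real traffic data:

```python
# maddog's steps 1-3 as a transition table; step 4 is the endless repeat.
NEXT = {"Green": "Yellow", "Yellow": "Red", "Red": "Green"}
DURATION = {"Green": 30, "Yellow": 4, "Red": 30}   # tunable timing (illustrative)

def simulate(start, n_transitions):
    """Run the timing rules for a fixed number of transitions.
    Bounding the run is what makes this a terminating procedure."""
    state, elapsed, history = start, 0, [start]
    for _ in range(n_transitions):
        elapsed += DURATION[state]
        state = NEXT[state]
        history.append(state)
    return history, elapsed

print(simulate("Green", 3))  # (['Green', 'Yellow', 'Red', 'Green'], 64)
```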

 

I have opinions of my own -- strong opinions -- but I don't always agree with them, :doh:

I love that!!! :D

 

maddog


You missed my point. It may be, imho, that in 200 years or so the use of the word transducer as you describe will be found to be totally useless. In that vein, I will start now by NEVER using it in that way. Excuse me. I will find better words, like "parse", which I find works satisfactorily, as it has for the last 30 or so years.

 

Qfwfq, I suggest it is time to lock this thread. Apparently, it is pointless to try to introduce someone to topics they were not aware of.

 

BTW, I have been doing this for 40 years. I've been aware of the proper meaning of parse for 40 years.


Qfwfq. I don't think you are nitpicking, but rather pointing out how algorithms are used to construct large entities.
I addressed the hypothetical nitpicking myself. :doh:

 

In that vein, I will start now by NEVER using it in that way. Excuse me. I will find better words, like "parse", which I find works satisfactorily, as it has for the last 30 or so years.
Seems you've been misunderstanding: he isn't using it as a synonym for parse, but actually as a way of indicating something that does more than parsing (not just reporting OK or bad syntax), such as a compiler.

Seems you've been misunderstanding: he isn't using it as a synonym for parse, but actually as a way of indicating something that does more than parsing (not just reporting OK or bad syntax), such as a compiler.

I commend you, Qfwfq, for capturing that nuance. Yes, I am using it in the context of extraction of data from an input stream (be it keyboard, file, camera, sensor, etc.), and interpretation of the same.

 

Stereologist, I must admit I was not formally trained in Computer Science (though I did take it as a minor), so I may not have taken "all" the required courses. My major was originally Astrophysics, transferring to Physics when I graduated (I did take all the required Astrophysics courses, though). I learned the bulk of my CS experience on the job in Aerospace/Defense.

 

40 years would put you at just before we landed on the moon; the HP-35 had not yet come out, and no actual implication of the existence of black holes had yet been discovered. The language C did not even exist yet (except as research at AT&T Bell Labs). The GoF had not yet published their book on Design Patterns. So in what context was "transducer" used? I guess in AI; Snobol was available then, though I am not sure if Lisp was.

 

So please close this thread. We have already beaten this horse to death a few times already. :doh:

 

maddog


The moon landing was 1969, 40 years ago. FORTRAN, BASIC, COBOL, RPG, and assembler existed. The IBM 1130 was used to simulate the heating of the LEM. Colossus: The Forbin Project used IBM 620s. C was not around, but B and BCPL were. APL and BAL were around. The IBM 360 was 1964. I'll bet the early incarnations of Forth were available in 1969.

 

What passes as computer speak is nothing short of ebonics or should I say e-bonics.


OK guys, let's try to get over it. I myself would have said that assemblers and compilers could be called translators (yup, just like from German to Russian), but if the term transducer has long been used, let's get on with the topic:

I recently learned that algorithm is a word being used without a definite definition, as seems to be the case for a lot of terms I have been analyzing.

 

Anyway, I am wondering about this passage, which says that algorithms are seemingly exercised in fields beyond computer programming.

 

Most algorithms are intended to be implemented as computer programs. However, algorithms are also implemented by other means, such as in a biological neural network (for example, the human brain implementing arithmetic or an insect looking for food), in an electrical circuit, or in a mechanical device.

 

There is no reference, anybody able to direct me toward any?

May we then say:

 

  • An algorithm is a procedure. One could say program although it is not necessarily a computer program. It may be defined by constructs such as a flowchart or in a language having constructs such as described in the Böhm-Jacopini theorem. This makes it a deterministic procedure even though it might incorporate making a random choice at some steps (this doesn't make the procedure itself randomly defined).
  • An algorithm has the purpose of finding (one of) the solutions of a given problem, when executed for a given case. If it outputs a solution, it must be certain that this is a valid one. If necessary, the problem must specify a range for quantities in order to make it feasible to reach solutions.
  • An algorithm can be performed on an input: a set of data which amounts to specifying a case or instance of the problem. It must also be designed to terminate within finite "time", if necessary outputting that no valid solution has been reached.
  • The term sometimes gets used in ways that slacken the above two points.


Qfwfq,

 

Your four bullets, I feel, address a definition of Algorithm well enough, with the following provisos:

 

 

  • That the said Algorithm is already considered a valid one.

  • Restricting Algorithm to finite steps is fine enough. Consider the calculation of Primes: the Algorithm is only calculating one Prime at a time. Doing so for each one is definitely a finite set of steps, no matter how big the Prime is.
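That point can be sketched as follows: trial division always halts for a fixed n, and finding the next prime is a finite search for each call (Bertrand's postulate bounds it), even though the primes themselves never run out:

```python
def is_prime(n):
    """Trial division: for any fixed n this loop always terminates."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def next_prime(n):
    """Finite for each call: Bertrand's postulate guarantees a prime
    strictly between n and 2n (for n > 1), so the search below halts."""
    candidate = n + 1
    while not is_prime(candidate):
        candidate += 1
    return candidate

print(next_prime(100))  # -> 101
```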

 

 

maddog

