Strong Typing?
Nov. 17th, 2005 08:34 am

Geek question:
What is the feature of a language that you consider "strong typing"? How does strong typing manifest itself (or not) in your favourite language or two?
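For concreteness, here is one common manifestation, sketched in Python (a dynamically but strongly typed language): mixed-type operations raise an error instead of silently coercing, and any conversion has to be asked for explicitly.

```python
x = "3"

# Strong typing: no implicit coercion between str and int.
try:
    x + 4
except TypeError as e:
    print("rejected:", e)

# Conversions must be requested explicitly.
print(int(x) + 4)    # explicit cast to number: prints 7
print(x + str(4))    # explicit cast the other way: prints 34
```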
(no subject)
Date: 2005-11-17 02:20 pm (UTC)

About to start using Python (well, I've been about to for a year or three...), but back in Perl years I wasn't too happy with results in large projects.
(no subject)
Date: 2005-11-17 02:23 pm (UTC)

ASP, which i spend more time with, seems more strongly typed than PHP - or even than ColdFusion, my other main scripting language - and requires that i pay much more attention to how variables are understood as i pass them around... frequently requiring that i convert types (character/integer conversions are quite common when processing HTTP requests against SQL-retrieved data) to get things to behave the way i expect them to.
that's why i so much prefer PHP to ASP these days, though i actually do so little of it. it feels to me like a language that was written by people that expected to use it.
of course, i'm talking about things a few levels down from where you usually work, i think. but the prejudices i reveal go back a long way... for very much the same reason, i always preferred C to Pascal, back in the very early days.
(no subject)
Date: 2005-12-03 08:42 pm (UTC)

And ASP does not have to be strongly typed. VB, which it was based on, would always let you just "Dim X" and X would be of type Variant. That's much less strongly typed than, say, Java.
(no subject)
Date: 2005-11-17 03:16 pm (UTC)

foo="a"
foo=foo+1
print foo
Strong typing will cause that code to fail at the second line, because you can't do math on strings. To do the same thing in a strongly-typed language, you need to jump through a lot of hoops: either with a second variable (extract the ASCII value of foo into bar, add 1 to it, push the result back into a string, and set foo to that) or with "casting" (a bit of rigmarole where you tell the compiler that THIS ONE TIME, I REALLY MEAN to treat this string as a number).
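The "hoops" described above look like this in Python, a sketch of both routes: going through the character code, and explicit casting.

```python
foo = "a"

# Route 1: go through the character code (the "second variable" route).
code = ord(foo)          # 97, the ASCII value of "a"
foo = chr(code + 1)      # push the result back into a string: "b"
print(foo)               # prints b

# Route 2: explicit casting, for the "treat this string as a number" case.
n = int("41")            # you must say you REALLY MEAN a number
print(n + 1)             # prints 42
```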
Strong typing is a pain in the ass most of the time; I don't make the sorts of errors it is designed to catch. I make other errors.
(no subject)
Date: 2005-11-17 04:16 pm (UTC)

foo="a"
foo=foo+1
print foo
Or possibly "a1", depending. That's what makes weak typing so interesting!
(no subject)
Date: 2005-11-17 04:46 pm (UTC)

String foo = "a";
System.out.println(foo+1);
generates a1.
In Perl, the output from polyfrog's script is 1.
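The two behaviours can be imitated side by side in Python; the to_num helper below is hypothetical, emulating Perl's rule that a non-numeric string evaluates to 0 in numeric context.

```python
foo = "a"

# Java-style "+": coerce the number to a string and concatenate.
print(str(foo) + str(1))      # prints a1

# Perl-style "+": coerce the string to a number; non-numeric -> 0.
def to_num(s):
    try:
        return int(s)
    except ValueError:
        return 0

print(to_num(foo) + 1)        # prints 1
```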
(no subject)
Date: 2005-11-17 04:51 pm (UTC)

or...
um.
(no subject)
Date: 2005-11-17 05:30 pm (UTC)

well, don't that just beat all...
(no subject)
Date: 2005-11-17 07:44 pm (UTC)

It should either fail or do what I said (increment the character value by one). Doing what you're saying, it has to be implicitly casting my 1 into "1" and then concatenating. Which in turn implies that there's some typing happening in the background.
On the other hand, your example and mine are not the same: you are displaying the result of foo+1; I am incrementing foo and then displaying. At the end of yours, foo is still "a"; at the end of mine it's... well, in any weakly-typed language I've used, it's "b".
(no subject)
Date: 2005-11-17 08:42 pm (UTC)

$ cat test.pl
$foo="a";
$foo=++$foo;
print $foo;
print "\n";
$ perl test.pl
b
But that reveals a parallel ambiguity in expressing "increment" versus "add 1" in most languages. Perl (and other languages borrowing heavily from C) distinguish between them; other languages don't, and some use partial typing to "do what I mean". Javascript doesn't like incrementing a string at all:
javascript: var foo = "a"; document.write(++foo);
NaN
which kind of indicates that it's doing some DWIM with regard to the "+" operator.
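Perl's "magic" string increment, which turned "a" into "b" in the transcript above, can be sketched in Python; str_incr is a hypothetical helper covering lowercase letters only.

```python
def str_incr(s):
    # Minimal sketch of Perl-style magic string increment,
    # lowercase letters only: "a" -> "b", "z" -> "aa", "az" -> "ba".
    if not s:
        return "a"
    head, last = s[:-1], s[-1]
    if last != "z":
        return head + chr(ord(last) + 1)
    return str_incr(head) + "a"   # carry into the next position

print(str_incr("a"))    # prints b
print(str_incr("az"))   # prints ba
```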
At the opposite end of things, we've got bourne shell scripting where EVERYTHING is a string operation, unless specified otherwise:
$ foo=1+1
$ echo $foo
1+1
$
If you specify the operation to be a numerical calculation,
$ foo=$((1+1))
$ echo $foo
2
$ foo=$((a+1))
$ echo $foo
1
$ echo $a
$ a=a
$ echo $a
a
$ foo=$((a+1))
/bin/ksh: a: expression recurses on parameter `a'
$
(no subject)
Date: 2005-11-17 08:44 pm (UTC)

$ foo=$(($a+1))
/bin/ksh: a: expression recurses on parameter `a'
$
(no subject)
Date: 2005-11-17 08:19 pm (UTC)

Either I'm misremembering the multiple layers of casting as in your example (it was 25 years ago, after all), or maybe I was doing peek-poke things to get around it... or Commodore BASIC was different.
All are equally likely.
I know that TUTOR (the language I worked in professionally in the 80s) and original K&R C both do it my way. C especially I remember transitioning from K&R to ANSI C and running up against the typing restrictions.
(no subject)
Date: 2005-12-03 08:47 pm (UTC)

If your character is represented as a char/byte, then yes, incrementing it shifts the letter up. But if it's a string then it depends on how "+" is interpreted: either it will concatenate a "1", or, horrors, it will increment your pointer and...
Some BASICs let you work with character codes. Commodore BASIC had tons of poke/peek/call things that were basically invoking magic machine code, so you would probably have been passing char/byte values around.
(no subject)
Date: 2005-11-17 03:27 pm (UTC)

And, of course, the program itself can watch for runtime errors, either by checking each statement for an error or by monitoring for exceptions. Because of the strict checking the OS already does on data elsewhere, it's often best *not* to do either for data that can't be handled *completely* within the program. In a lot of cases, handling a typing error isn't something one really wants this part of an application to do: a typing error is a symptom of a different, and probably much deeper, problem. So unless a program is known to be processing "wild data" (not yet handled by the application), the programmer usually wants the error thrown instead of the data fixed on the fly.
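That policy - raise on type errors unless you know you are holding wild data - is easy to express in Python; the function and names here are hypothetical.

```python
def total(qty, price):
    # Deliberately no coercion: a str qty here means upstream
    # parsing failed, and masking it would hide the real bug.
    return qty * price

# Trusted, already-validated data: let any TypeError propagate.
print(total(3, 2.5))          # prints 7.5

# Known "wild data" (e.g. straight from an HTTP request): only
# here do we catch the error and repair the value on the fly.
try:
    print(total("3", 2.5))
except TypeError:
    print(total(int("3"), 2.5))
```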
(no subject)
Date: 2005-11-17 03:33 pm (UTC)

I like polyfrog's example, though.
So, characteristics of 'strong typing':
You have to declare your variables before you can use them.
You can only do certain things with the variables.
You will spend hours and hours trying to figure out what the he** is going on with a rounding error because you have accidentally divided a real number by an integer at some point deep in the code and everything from there on is buggered.
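The classic form of that rounding bug is integer division truncating silently. A Python sketch (Python at least makes the truncating operator, //, explicit):

```python
# Integer divided by integer truncates; deep inside a formula
# this silently turns 0.7 into 0.
print(7 // 10)          # truncating division: prints 0
print(7 / 10)           # true division: prints 0.7

# The usual fix in languages that truncate by default: force one
# operand to be a real number before dividing.
print(float(7) / 10)    # prints 0.7
```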
The scripting language I use most frequently (Miva-Script, which is an offshoot of HTML-Script) has no typing whatsoever... you can play fast-and-loose with what you do with the variables. If you want to put any restrictions on them you have to specify that within the bit of code you are currently working on.
Pluses - If (variable_name) works just fine as a boolean test
Um... it appeals to my slap-happy nature.
Minuses - You can get different results on the same data if you aren't selectively keeping track of how you are currently using the variable. The output is unpredictable, because it has some history (if you last did a string operation and then switch to integers without telling the program, you will wind up with something other than what you expected.) That is, the language seems to have some typing implicit in it... it would probably be better if that were explicit.
Other possibility - I'm not really a programmer and I'm making consistent mistakes with my variables.
(no subject)
Date: 2005-11-17 11:36 pm (UTC)

>>> foo = "a"
>>> foo += 1
Traceback (most recent call last):
File "<stdin>", line 1, in ?
TypeError: cannot concatenate 'str' and 'int' objects
I use python because I Am Not a Programmer and it's the only thing I can remember enough of when I go back to my hacky code six months later to not waste huge squads of time figuring out what I was doing.
I did program in C once, and didn't do too badly. And I even wrote some stuff in PostScript.
(no subject)
Date: 2005-11-18 05:12 am (UTC)

For the record, the compiler would barf on your shoes if you tried to add "a" and 1.
(no subject)
Date: 2005-12-03 08:58 pm (UTC)

(forgive the nested quotes; the <em> should disambiguate...)
Given a class Base, a derived class Derived, and a variable Var declared to be of type Base: if I store a reference to a Derived in Var, then the only methods and members accessible via "Var.whatever" will be those of Base, not Derived.
That's a very clinical answer, but generally, where you store a reference determines what you can do to it. (this also assumes that you cannot store references in invalid locations)
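For contrast, a Python sketch of the same situation: at runtime Python resolves attributes on the object itself, so a Base-typed declaration restricts nothing, though a static checker such as mypy would flag the call exactly as described. The class and method names are hypothetical.

```python
class Base:
    def ping(self):
        return "base"

class Derived(Base):
    def extra(self):
        return "only on Derived"

var: Base = Derived()   # the annotation says Base...
print(var.extra())      # ...but this runs: lookup happens on the object
# A static checker (e.g. mypy) would reject the call above:
#   "Base" has no attribute "extra"
```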
Have you ever run across gBeta (http://www.daimi.au.dk/~eernst/gbeta/)? It has very interesting notions of classes, objects, and inheritance. I think of it as almost the antithesis of strong typing because it allows you to morph just about any type into any other type (with probable loss of data, of course).