Programming


Moderators: kiore, The_Metatron, Blip

Re: Programming

#121  Postby mizvekov » Mar 21, 2012 4:18 pm

VazScep wrote:@mizvekov Do you have any thoughts about "concepts" in C++?

Well I mentioned them briefly a few posts back. :grin:

They were supposed to have made it into the latest standard, but they were scrapped because of one issue that popped up, and the committee didn't want to delay the whole shebang just because of that. That issue has since been solved, and I expect them to be up again for the next standard, possibly to be released in five years.

Now, their purpose is to specify constraints on the types templates can accept. Currently, the constraints are implicit in the code itself, and a mismatch shows up as a compilation error at the offending piece of code. This leads to the infamous longish (to put it mildly) compiler error messages when you pass a nonsensical type to a template. :yuk:

Concepts instead let you state the constraints on a type beforehand: what operations it supports, which functions you can call with it, which members it has, and so on. The next time you try to instantiate a template with an incompatible type, the error will be much more comprehensible, specific, and hopefully shorter.

They also open up some optimization opportunities, since you can use concepts to specify template specializations.
For example, you could define a special implementation of a generic class that exploits some particular feature of one of the possible type parameters, and separately define a more general implementation that works with any type.

One feature I find *very* cool about all of this is that the concept proposal also includes something else I haven't mentioned, called axioms.
You can use them to specify further semantic properties of concepts. For example, you could use one to state that some operation is commutative:
Code: Select all
axiom Commutativity(Op op, T x, T y)  {  op(x, y) == op(y, x);  }

And this allows for some kinds of optimizations which the standard simply did not permit before, under any circumstances.
mizvekov
 
Posts: 314
Age: 37
Male

Brazil (br)

Re: Programming

#122  Postby VazScep » Mar 22, 2012 10:44 am

mizvekov wrote:
VazScep wrote:@mizvekov Do you have any thoughts about "concepts" in C++?

Well I mentioned them briefly a few posts back. :grin:
Whoops! Sorry.

Now, their purpose is to specify constraints on the types templates can accept. Currently, the constraints are implicit in the code itself, and a mismatch shows up as a compilation error at the offending piece of code. This leads to the infamous longish (to put it mildly) compiler error messages when you pass a nonsensical type to a template. :yuk:

Concepts instead let you state the constraints on a type beforehand: what operations it supports, which functions you can call with it, which members it has, and so on. The next time you try to instantiate a template with an incompatible type, the error will be much more comprehensible, specific, and hopefully shorter.
Right. Sounds cool to me.

It seems related to type-classes, except that concepts can either be instantiated explicitly using concept maps or left implicit, which is a nice property.

There are interesting tradeoffs in the C++ approach to this stuff. Can concept requirements be inferred? So, if I write (I'm rusty, so I'm not sure if this is the right syntax):
Code: Select all
template<typename T> requires FooConcept
T foo() {
}

template<typename T>
T bar() {
   foo();
}
Can the FooConcept requirement on bar be inferred?

Anyway, the fact that C++ templates are required to be fully expanded gives you quite a bit of expressive power over Haskell here. Do you have any thoughts on this? Full template expansion, in the worst case, is going to lead to code bloat, and longer compilation times. But you gain because the compiler can make much more aggressive optimisations.

The tradeoff is still debated in the functional world. Ocaml doesn't fully expand calls to polymorphic functions at compile time, because they want to allow separate compilation of modules. However, the Standard ML compiler MLton does fully expand (it's called "monomorphisation"), and they seem to think that the increased build times and code size isn't too big a deal. Then there's F#, which has separate compilation at the level of bytecode, and then monomorphises in JIT compilation.

They also open up some optimization opportunities, since you can use concepts to specify template specializations.
For example, you could define a special implementation of a generic class that exploits some particular feature of one of the possible type parameters, and separately define a more general implementation that works with any type.

One feature I find *very* cool about all of this is that the concept proposal also includes something else I haven't mentioned, called axioms.
You can use them to specify further semantic properties of concepts. For example, you could use one to state that some operation is commutative:
Code: Select all
axiom Commutativity(Op op, T x, T y)  {  op(x, y) == op(y, x);  }

And this allows for some kinds of optimizations which the standard simply did not permit before, under any circumstances.
That's cool. I take it that the compiler doesn't try to prove this sort of thing? It's just there as a context sensitive rewrite rule?

This ability to state formal properties of one's code is, of course, something I absolutely love. You should be able to carve out whole hierarchies of algebras based on these concepts. And maybe we can start making these axioms more expressive and shoving in some formal verification and theorem proving :whistle:
Here we go again. First, we discover recursion.
VazScep
THREAD STARTER
 
Posts: 4590

United Kingdom (uk)

Re: Programming

#123  Postby VazScep » Mar 22, 2012 12:45 pm

Thinking about this again, I think I might have understood how concept checking works.

I guess the idea is this: when I come to instantiate bar with something that is not a FooConcept, it will fail during instantiation of foo and report this. If I put the requirement on the bar template, then it will fail during instantiation of bar with the same error?
Here we go again. First, we discover recursion.
VazScep
THREAD STARTER
 
Posts: 4590

United Kingdom (uk)

Re: Programming

#124  Postby newolder » Mar 22, 2012 2:41 pm

VazScep wrote:
mizvekov wrote:That premise is worded quite strongly, but I can't really disagree with a weaker version of it.
It's quite difficult to match all those pesky parentheses to their closing ones :smile:
I just never really bought that. For starters, no serious Lisp user would dream of Lisp hacking without a facility to highlight matching parentheses. (go to Options->Paren Match Highlighting in Emacs). And with decent syntax highlighting and appropriate indentation, I never found readability an issue in Lisp. That's not cheating: every language cares about appropriate indentation and syntax highlighting, and I always have paren-match highlighting turned on when hacking in ML or Haskell.

In fact, when I'm tallying my shopping lists or budgeting, I tend to use Lisp so that I can write in this style:

Code: Select all
(+ (* 2 37)  ; beans
   400       ; detergent
   127       ; meat
   1000      ; extremely expensive cuisine that helps make this look less of a student's shopping list
)

...snip

hi VazScep, wat r teh magic numbers 2 37 400 127 1000 in ur code, plz ? :ask:
I am, somehow, less interested in the weight and convolutions of Einstein’s brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops. - Stephen J. Gould
newolder
 
Name: Albert Ross
Posts: 7405
Age: 1
Male

Country: Feudal Estate number 9

Re: Programming

#125  Postby mizvekov » Mar 22, 2012 6:56 pm

VazScep wrote:Whoops! Sorry.
Nah, don't worry.

VazScep wrote:Right. Sounds cool to me.

It's related to type-classes it seems, except that concepts can both be instantiated explicitly using concept maps, and they can be left implicit, which is a nice property.

There are interesting tradeoffs in the C++ approach to this stuff. Can concept requirements be inferred? So, if I write (I'm rusty, so I'm not sure if this is the right syntax):
Code: Select all
template<typename T> requires FooConcept
T foo() {
}

template<typename T>
T bar() {
   foo();
}
Can the FooConcept requirement on bar be inferred?

Nope, it's just impossible to do. C++ templates are far too powerful: they are a Turing-complete sublanguage inside of C++ itself, and in fact there are whole libraries for metaprogramming built on top of them (Boost's MPL, for example).
Consider that you can do something like this:
Code: Select all
#include <iostream>
template<int N> struct factorial { enum { result = factorial<N - 1>::result * N }; };
template<> struct factorial<0> { enum { result = 1 }; };
int main() { std::cout << factorial<5>::result << std::endl; }

And this will calculate the factorial of 5 at compile time.
Now if you tried to write code that analyzes a template and derives the requirements on its types, I think you would run into some variant of the halting problem :grin:

PS: The right syntax is
Code: Select all
template<typename T> requires FooConcept<T>

And by the way, don't worry, this stuff isn't even in the standard. There is currently no compiler supporting it. There was a patch for gcc a while ago that implemented it partially, but its development ceased when the standards committee dropped the feature.

PS2: The reverse is possible. If you declare a concept as 'auto', then the compiler can infer which types conform to that concept.
What cannot be done is the opposite: the compiler cannot infer which concepts a template type parameter needs to conform to.

VazScep wrote:Anyway, the fact that C++ templates are required to be fully expanded gives you quite a bit of expressive power over Haskell here. Do you have any thoughts on this? Full template expansion, in the worst case, is going to lead to code bloat, and longer compilation times. But you gain because the compiler can make much more aggressive optimisations.
Like I said above, templates are far too powerful, and using that power is not necessarily easy. Not only can (shitty) compilers crash if you recurse your templates too deeply, but consider a hypothetical case where a compilation error happens at recursion depth 50: can you imagine the size of the error message you would get? :grin:
The template expansion doesn't necessarily give you code bloat, though. Compilers will normally implement a final pass that merges the code of template instances which turn out similar, but that is just a heuristic anyway.

This excessive power of templates tends to be looked down on by the designers of supposed C++ successors like Java, C#, and D.
I know that D's templates are severely castrated, and you can't do with them the sort of stuff you can with the MPL.
They do this so that templates are more tractable and it's easier to write code that analyzes their properties.

VazScep wrote:The tradeoff is still debated in the functional world. Ocaml doesn't fully expand calls to polymorphic functions at compile time, because they want to allow separate compilation of modules. However, the Standard ML compiler MLton does fully expand (it's called "monomorphisation"), and they seem to think that the increased build times and code size isn't too big a deal. Then there's F#, which has separate compilation at the level of bytecode, and then monomorphises in JIT compilation.

Well F# sounds cooler by the hour.
But anyway, C++ allows for separate compilation of 'modules' even where templates are concerned, though it requires manual intervention. Basically, you can declare a template instantiation as extern, and then force it to be instantiated in whichever compilation unit you want.

VazScep wrote:
That's cool. I take it that the compiler doesn't try to prove this sort of thing? It's just there as a context sensitive rewrite rule?

Yeah, that's probably right, although I have never tried them myself. I would love to, though.

VazScep wrote:
This ability to state formal properties of one's code is, of course, something I absolutely love. You should be able to carve out whole hierarchies of algebras based on these concepts. And maybe we can start making these axioms more expressive and shoving in some formal verification and theorem proving :whistle:

Hah, hold your horses right there. We are still many years away from being able to even try this :)

VazScep wrote:Thinking about this again, I think I might have understood how concept checking works.

I guess the idea is this: when I come to instantiate bar with something that is not a FooConcept, it will fail during instantiation of foo and report this. If I put the requirement on the bar template, then it will fail during instantiation of bar with the same error?

It will fail both times, but not with the same error. Like I said before, if there are no concepts involved, the error happens at the point in the code that tries to do something illegal with the type in question. This can be many layers deep in a more complex template, and will spew a huge error message that also exposes the user to implementation details.
Whereas if you use concepts, the error message will be a lot simpler, maybe even a one-liner, informing you that the type in question does not conform to the required concept.
mizvekov
 
Posts: 314
Age: 37
Male

Brazil (br)

Re: Programming

#126  Postby VazScep » Mar 22, 2012 9:23 pm

mizvekov wrote:Nope, it's just impossible to do. C++ templates are far too powerful: they are a Turing-complete sublanguage inside of C++ itself, and in fact there are whole libraries for metaprogramming built on top of them (Boost's MPL, for example).
Hmm...could you restrict the inference to the use of concepts though?

I've read (since my last post) that the way you would manually typecheck something like "bar" is to construct an archetype that minimally implements the concept. Hopefully, that could be done automatically, by automatically constructing the archetype. But I'm tempted to think this should just be an option for compiler writers.

And by the way, don't worry, this stuff isn't even in the standard. There is currently no compiler supporting this. There was a patch for gcc a while ago that implemented it partially, but it's development was ceased when the standard committee dropped it.
There's ConceptGCC! It sounds like it's very much a prototype at the moment, though.

PS2: The reverse is possible. If you declare a concept as 'auto', then the compiler can infer which types conform to that concept.
What cannot be done is the opposite: the compiler cannot infer which concepts a template type parameter needs to conform to.
As I understand, if you declare a concept as auto, then no inference actually happens. It just means you don't need a concept map, and so concept-satisfaction will happen at template-expansion time. Is that right?

I don't expect C++ to infer arbitrary concepts without explicit concept annotations, though, since presumably, there could be multiple concepts that a type could satisfy. You can have concepts that overlap right?

mizvekov wrote:Like I said above, templates are far too powerful.
I'm not yet convinced :P Many of the GHC extensions give you a Turing-complete type theory too.

And using that power is not necessarily easy. Not only can (shitty) compilers crash if you recurse your templates too deeply, but consider a hypothetical case where a compilation error happens at recursion depth 50: can you imagine the size of the error message you would get? :grin:

The template expansion doesn't necessarily give you code bloat, though. Compilers will normally implement a final pass that merges the code of template instances which turn out similar, but that is just a heuristic anyway.
Ah, this is interesting. Can you point me to some documentation about this?

Well F# sounds cooler by the hour.
It doesn't have the Ocaml module system unfortunately. But it has lots of other cool stuff, such as type-inferred dimensional types, which are awesome. I've had the compiler reject my F# code, and point me to a logic error where I was trying to add a distance to a velocity because of a careless placement of brackets. Without the type-checking, I'm not sure how long I would have spent tracking the bug.

But anyway, c++ allows for separate compilation of 'modules' even where templates are concerned, but it requires manual intervention. Basically you can declare a template instantiation as extern, and then force it to be instantiated at any compilation unit you want.
Again, this is interesting. Do you know what the tradeoffs are? How does use of "extern" mesh with dynamically linked libraries? If there are issues, are they even that big a deal?

VazScep wrote:This ability to state formal properties of one's code is, of course, something I absolutely love. You should be able to carve out whole hierarchies of algebras based on these concepts. And maybe we can start making these axioms more expressive and shoving in some formal verification and theorem proving :whistle:

Hah hold your horses right there. We are just too many years away from being able to even try this :)
Hey, it's a significant advance. Axiomatic constraints are normally just given in documentation, which is not machine-readable. That's the most significant barrier to formal verification.
Here we go again. First, we discover recursion.
VazScep
THREAD STARTER
 
Posts: 4590

United Kingdom (uk)

Re: Programming

#127  Postby mizvekov » Mar 23, 2012 3:28 am

VazScep wrote:Hmm...could you restrict the inference to the use of concepts though?

By the way, I have read your example again, and now I think I understand what you want to do.
I will try to rewrite it as something that would compile; tell me if this is what you meant:
Code: Select all
template<typename T> requires FooConcept<T>
T foo() {
}

template<typename T>
T bar() {
   return foo<T>();
}


And even though it might seem simple here, I think there is no way to do this kind of thing for all templates.
I think you are saying that, from that example, it would seem obvious that bar also requires FooConcept<T>.
But suppose foo also has a specialization for another type, like here:
Code: Select all
template<typename T> requires FooConcept<T>
T foo() {
}

template<>
char foo<char>() {
}

template<typename T>
T bar1() {
   return foo<T>();
}

template<typename T> requires FooConcept<T>
T bar2() {
   return foo<T>();
}

And now bar1 accepts T as char, but bar2 does not.

VazScep wrote:I've read (since my last post) that the way you would manually typecheck something like "bar" is to construct an archetype that minimally implements the concept. Hopefully, that could be done automatically, by automatically constructing the archetype. But I'm tempted to think this should just be an option for compiler writers.

But automatically build the archetypes from what information?
The bodies of the templates that use the type? I don't think that would be workable, as I said above. Besides the problems I mentioned for convoluted templates, I think there could be a lot of ambiguities too. Consider the code below:
Code: Select all
template<typename T> void foo(T a) { bar(a, 2.0); }

Now, which of the concepts below should the compiler require T to satisfy:
Code: Select all
concept C1<typename T> { void bar(T, double); }
concept C2<typename T> { void bar(T, float); }
concept C3<typename T> { void bar(T, int); }
concept C4<typename T> { int bar(T, double); }
concept C5<typename T> {
   void bar(T, double);
   void bar(T, float);
   void bar(T, int);
   int bar(T, double);
}

And I could go on all day.

VazScep wrote:There's ConceptGCC! It sounds like it's seriously prototype at the moment, though.

Yay! I thought the patch implementing it had just been discontinued, but they decided to fork GCC instead!
So I suppose we will be able to try it out a lot sooner :)

VazScep wrote:As I understand, if you declare a concept as auto, then no inference actually happens. It just means you don't need a concept map, and so concept-satisfaction will happen at template-expansion time. Is that right?

Yes, I think that's right.

VazScep wrote:I don't expect C++ to infer arbitrary concepts without explicit concept annotations, though, since presumably, there could be multiple concepts that a type could satisfy. You can have concepts that overlap right?

Yes, of course. You can also have a concept which enforces a restriction on any number of types. For example:
Code: Select all
concept C1<typename T, typename U> {
  void bar(T, U);
}

template<typename T, typename U> requires C1<T, U>
void f(T t, U u) {
  bar(t, u);
}


VazScep wrote:I'm not yet convinced :P Many of the GHC extensions give you a Turing complete type-theory too.
:grin:

VazScep wrote:
The template expansion doesn't necessarily give you code bloat though. Compilers will normally implement a final step of merging code of template instances which are similar, but it's just an heuristic anyway.
Ah, this is interesting. Can you point me to some documentation about this?

Well, that's hard to do, as this is not mandated by the standard. It is one of the many tricks compilers do; I came to know about it through experimentation, talking to others, and reading articles, and I simply cannot find proper documentation for it.
One thing you can certainly expect of any compiler is that it will at least remove duplicated instantiations that happened in different compilation units: when they are linked together into an executable or whatever, the linker discards the duplicates and keeps only one copy.
Another thing some compilers implement is that if a function has exactly the same code in two different instantiations of the same template (as in, different template parameters), the duplicated code is discarded and only one copy is used.

VazScep wrote:It doesn't have the Ocaml module system unfortunately. But it has lots of other cool stuff, such as type-inferred dimensional types, which are awesome. I've had the compiler reject my F# code, and point me to a logic error where I was trying to add a distance to a velocity because of a careless placement of brackets. Without the type-checking, I'm not sure how long I would have spent tracking the bug.

Cool :)
How does that work? Do you have an example?

VazScep wrote:Again, this is interesting. Do you know what the tradeoffs are?

Well, by declaring an instantiation as extern, the compiler does not have to instantiate it again in every compilation unit, only to have the linker discard the extra copies later. This results in huge compilation-time savings in some cases.
One possible downside is that the compiler then has to rely on link-time optimization in order to do things such as inlining.
VazScep wrote:How does use of "extern" mesh with dynamically linked libraries? If there are issues, are they even that big a deal?

DLLs are not directly supported by either the C or the C++ standard, and how to implement them is very platform dependent.

It does not change anything anyway, since there is no standard ABI for C++ code; when you write DLLs your only real option is to export functions as C (with extern "C"). You could do otherwise, but then if you tried to create an executable in Visual Studio which uses a DLL built by g++, it would just not work; most likely it would segfault.

If you need this kind of functionality in C++ code, then you would probably need something like Microsoft COM or CORBA.
I have never used those two, but I don't think it makes sense to say you would export a generic class or generic function through them. I suppose you would instead have a generic 'thing' and then use whatever mechanism these systems provide to export an instance of that thing.
In that case, I also don't see how 'extern' changes anything significantly.

VazScep wrote:Hey, it's a significant advance. Axiomatic constraints are normally just given in documentation, which is not machine-readable. That's the most significant barrier to formal verification.

I see :)
Well, yes, that's indeed very cool; I just meant that I didn't think we would see any compilers implementing it so soon.
mizvekov
 
Posts: 314
Age: 37
Male

Brazil (br)

Re: Programming

#128  Postby VazScep » Mar 24, 2012 12:42 pm

mizvekov wrote:And now bar1 accepts T as char, but bar2 does not.
Ack, yes, of course!

mizvekov wrote:But automatically build the archetypes from what information?
The concept definition. But now I see that with template specialisation, this isn't going to be possible.

The bodies of the templates that use the type? I don't think that would be workable, as I said above. Besides the problems I mentioned for convoluted templates, I think there could be a lot of ambiguities too. Consider the code below:
Code: Select all
template<typename T> void foo(T a) { bar(a, 2.0); }

Now, which of the concepts below should the compiler require T to satisfy:
Code: Select all
concept C1<typename T> { void bar(T, double); }
concept C2<typename T> { void bar(T, float); }
concept C3<typename T> { void bar(T, int); }
concept C4<typename T> { int bar(T, double); }
concept C5<typename T> {
   void bar(T, double);
   void bar(T, float);
   void bar(T, int);
   int bar(T, double);
}

And I could go on all day.
Yes, this is the same as in Standard ML/Ocaml. You can have overlapping signatures, so you can't generally infer which one you are interested in without a structural type-system such as Ocaml's objects and polymorphic variants. But they have a whole load of tradeoffs themselves, and besides, one of the reasons why you want the explicit annotation is for encapsulation/information-hiding.

mizvekov wrote:Yes, of course. You can also have a concept which enforces a restriction on any number of types. For example:
Code: Select all
concept C1<typename T, typename U> {
  void bar(T, U);
}
I believe this is analogous to multiple parameter type-classes. There seem to be some interesting comparisons to be made between type-classes, SML style modules and structural types.

VazScep wrote:It doesn't have the Ocaml module system unfortunately. But it has lots of other cool stuff, such as type-inferred dimensional types, which are awesome. I've had the compiler reject my F# code, and point me to a logic error where I was trying to add a distance to a velocity because of a careless placement of brackets. Without the type-checking, I'm not sure how long I would have spent tracking the bug.

Cool :)
How does that work? Do you have an example?
Formally, the idea is that units of measure are just types in a free Abelian group generated by base units. Less pompously, you create a dimension type by stringing together base units and identifying types such as m s^2 s^-1 and s m m^-1 m.

This is fairly trivial so far. Things get really interesting in F# because dimension types are inferred and can contain type variables. This means we can (optionally) define some base types/derived types, such as:

Code: Select all
> [<Measure>] type m
> [<Measure>] type s


And then start writing functions such as

Code: Select all
let f (x : float<m>) (y : float<s>) = x * y


The interpreter (or compiler) infers that the return type of f is float<m s>. However, I can also write a polymorphic function:

Code: Select all
let f (x : float<'u>) (y : float<'v>) (z : float<'w>) = x * z + 3.0<m>*z / y
Warning: This construct causes code to be less generic than indicated by the type annotations. The unit-of-measure variable 'u has been constrained to be measure m/'v

val f : float<'u> -> float<m/'u> -> float<'w> -> float<'u 'w>


Here, the interpreter infers a more specific type than the one I specified. The second argument must be such that when multiplied by the first argument, the result has unit m. Thus:

Code: Select all
> f 1.0<m s> 5.0</s> 10.0<s^2>
val it : float<m s ^ 3> = 16.0

> f 1.0<m s> 5.0<s> 10.0<s^2>
                        ^^^^^
Type mismatch. Expecting a float</s> but given a float<s>


And of course, when we come to write functions which call f, we don't need any type annotations at all:

Code: Select all
> let g x y = f x y y;;
val g : float<'u> -> float<m/'u> -> float<m>

> let g x = f x x // Note that this is a partial application
                ^
Type mismatch. Expecting a float<m/'u> but given a float<'u>


All this is particularly sexy in Visual Studio, where hovering the cursor over an identifier as you write a function will bring up a tooltip with its inferred type, including any unit of measure.

Finally, I should point out that unit of measure variables can be type constructor arguments. Thus, it is possible to define, say, a polymorphic 3d vector type:

Code: Select all
type vector<[<Measure>] 'a> = float<'a> * float<'a> * float<'a>


The only weaknesses of units of measure I've encountered arise from type erasure, but I can't remember the particular issues off-hand. I remember them being relatively easy to circumvent, though. There is also the obvious problem that a function such as "sqrt", as implemented in any programming language, is never going to have the correct unit of measure. So in these cases, you have to cast.

mizvekov wrote:DLLs are not directly supported by either the C or the C++ standard, and how to implement them is very platform dependent.
Err...I had no idea about that. Damn.

It does not change anything anyway, since there is no standard ABI for C++ code; when you write DLLs your only real option is to export functions as C (with extern "C"). You could do otherwise, but then if you tried to create an executable in Visual Studio which uses a DLL built by g++, it would just not work; most likely it would segfault.
Seriously did not know this. I've been complaining about the difficulties of binding to a C++ DLL, and the pain of having to wrap everything up as extern "C" functions. I had no idea C++ programmers are in the exact same situation!
Here we go again. First, we discover recursion.
VazScep
THREAD STARTER
 
Posts: 4590

United Kingdom (uk)

Re: Programming

#129  Postby mizvekov » Mar 25, 2012 10:59 pm

VazScep wrote:...
The only weaknesses of units of measure I've encountered arise from type erasure, but I can't remember the particular issues off-hand. I remember them being relatively easy to circumvent, though. There is also the obvious problem that a function such as "sqrt", as implemented in any programming language, is never going to have the correct unit of measure. So in these cases, you have to cast.

That is very cool :). I need to bump F# up my priority list.
By the way, I think something similar could be implemented in C++ with a new feature called "user defined literals":
Code: Select all
double operator "" _yard(long double val)    { return val*0.9144; } // converts to meters
int main() { std::cout << 12.3_yard << std::endl; }

Now I think that with a good type hierarchy and some overloading of the arithmetic operators, it should be possible to implement something similar, though perhaps not as powerful with regard to type inference.

VazScep wrote:Seriously did not know this. I've been complaining about the difficulties of binding to a C++ DLL, and the pain of having to wrap everything up as extern "C" functions. I had no idea C++ programmers are in the exact same situation!
Yes, unfortunately so :)
It is especially hard on Linux and other Unix systems, where there is no commonplace equivalent to COM.

By the way, still haven't forgotten about the propositional logic system we were discussing, it's just that I have been busy with more pressing matters :whistle:
mizvekov
 
Posts: 314
Age: 37
Male

Brazil (br)
Print view this post

Re: Programming

#130  Postby mizvekov » Apr 03, 2012 10:04 am

Still working on that goddamn example! :grin:
By the way, I found that having a theorem and a function like this helps a lot:
Code: Select all
-- |- (p ==> q) ==> (q ==> r) ==> p ==> r
hs :: Theorem String
hs = let
   step1  = Assume p
   step2  = Assume $ p :=>: q
   step3  = MP step2 step1
   step4  = Assume $ q :=>: r
   step5  = MP step4 step3
   step6  = discharge p step5
   step7  = discharge (q :=>: r) step6
   in verify step7

hyp_sylo :: Proof -> Proof -> Proof
hyp_sylo a b = case (concl a, concl b) of
   (p :=>: q1, q2 :=>: r)
      | q1 == q2  -> MP (matchMP (instL [p, q1, r] hs) a) b
      | otherwise -> error "hyp_sylo unmatched"
   (_ :=>: _, _) -> error "hyp_sylo second"
   _             -> error "hyp_sylo first"

Am I on the right track here? :)

Re: Programming

#131  Postby epepke » Apr 03, 2012 10:41 am

VazScep wrote:Formally, the idea is that units of measure are just types in a free Abelian group generated by base units. Less pompously, you create a dimension type by stringing together base units and identifying types such as m·s²·s⁻¹ and s·m·m⁻¹·m.


This is reasonably easy to do in C++ by overloading the multiplication operator. I've done a units package based on this. A simple example:

Code: Select all
Distance d = 3 * Meter;


This is pretty useful.

There are a few problems, though.

1) Some units are funny, such as degrees Fahrenheit and Celsius, which require an offset. The HDF system expressed units as a multiplier and an offset, which gets rid of this particular problem, though it cannot handle units that have, say, logarithms. Most of these problems can be kluged in C++ by overloading conversion functions.

2) It doesn't really distinguish very well between absolute and relative units. So, for example,

Code: Select all
2 * DegreesCelsius + 5 * DegreesCelsius

will be understood by a person as meaning 7 degrees Celsius, but that isn't obvious from the definition. Still, that can be kluged reasonably well by overloading addition and subtraction. You cannot just fix this by doing all calculation in, say, DegreesKelvin and converting back.

3) It doesn't easily handle all the conventions that people have.  For example,

Code: Select all
5 * Dollars + 20 * Percent


would be understood by a person as (5 * (1 + 0.20)) * Dollars. It isn't obvious how to do it, except by overloading + and -. To do it requires writing a lot of functions. I'd do this by writing some code in Perl to emit C++ code for all (expected) possibilities.

4) Getting an automatic type for a complex expression pretty much relies on the optimizations from the compiler. These vary. Also, for debugging, one sometimes has to drop to a lower level of optimization, which results in a lot of swearing.

5) Related to 4, the compiler cannot generally assume an Abelian group. For example, you might want to overload * for something like tensors, the multiplication of which is not commutative.

Still, it's pretty useful, and can avoid problems like the Mars probe that was lost because one team worked in imperial units and another in metric. It can also be set either to do conversions automatically, for debugging, or to report unit mismatches as compiler errors for zero-cost deployment.
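The absolute-versus-relative distinction in point 2 can also be pushed into the type system; here is a minimal Haskell sketch (toy names, not epepke's package) where only the sensible combinations typecheck:

```haskell
-- Toy sketch of point 2: separate types for absolute temperatures and
-- temperature differences. All names are made up for illustration.
newtype TempC  = TempC Double  deriving (Show, Eq)  -- absolute
newtype DeltaC = DeltaC Double deriving (Show, Eq)  -- relative

-- Shifting an absolute temperature by a difference is meaningful...
shift :: TempC -> DeltaC -> TempC
shift (TempC t) (DeltaC d) = TempC (t + d)

-- ...and so is the difference between two absolute temperatures.
diff :: TempC -> TempC -> DeltaC
diff (TempC a) (TempC b) = DeltaC (a - b)

main :: IO ()
main = do
  print (shift (TempC 2) (DeltaC 5))  -- TempC 7.0
  print (diff (TempC 25) (TempC 20))  -- DeltaC 5.0
  -- shift (TempC 2) (TempC 5)        -- rejected: adding two absolutes
```

Adding two absolute temperatures simply has no overload, which is exactly the property the prose asks for.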
User avatar
epepke
 
Posts: 4080

Country: US
United States (us)
Print view this post

Re: Programming

#132  Postby VazScep » Apr 05, 2012 2:27 pm

epepke wrote:This is reasonably easy to do in C++ by overloading the multiplication operator. I've done a units package based on this. A simple example:

Code: Select all
Distance d = 3 * Meter;

...
But the achievement of F# lies in getting a full type-inference algorithm for polymorphic values where the dimensions can be variable.

Re: Programming

#133  Postby VazScep » Apr 05, 2012 2:35 pm

mizvekov wrote:Still working on that goddamn example! :grin:
By the way, I found that having a theorem and a function like this helps a lot:
I think I can see that working.

Unfortunately, I don't have my code to hand at the moment. But I can throw a lemma your way if you want a hint.

Re: Programming

#134  Postby mizvekov » Apr 05, 2012 4:53 pm

VazScep wrote:
mizvekov wrote:Still working on that goddamn example! :grin:
By the way, I found that having a theorem and a function like this helps a lot:
I think I can see that working.

Unfortunately, I don't have my code to hand at the moment. But I can throw a lemma your way if you want a hint.

Okay, these are the lemmas I have so far:
Code: Select all
-- |- ~p ==> p ==> q
lemma1 = let
   step1  = Assume $ Not p
   step2  = instL [p, Not q] axiom1
   step3  = matchMP step2 step1
   step4  = matchMP axiom3 step3
   in verify step4

-- |- ~~p ==> p
lemma2 = let
   step1  = Assume $ Not $ Not p
   step2  = instL [p, Not $ Not p] lemma1
   step3  = matchMP step2 step1
   step4  = instL [p, Not $ Not p] axiom3
   step5  = matchMP step4 step3
   step6  = MP step5 step1
   in verify $ step6

-- |- p ==> ~~p
lemma3 = let
   step1  = instL [Not p] lemma2
   step2  = instL [Not $ Not p, p] axiom3
   step3  = mp step2 step1
   in head $ step3

-- |- (p ==> q) ==> ~q ==> ~p
lemma4 = let
   step1  = Assume $ p :=>: q
   step2  = UseTheorem lemma2
   step3  = hyp_sylo step2 step1
   step4  = UseTheorem $ instL [q] lemma3
   step5  = hyp_sylo step3 step4
   step6  = instL [Not p, Not q] axiom3
   step7  = matchMP step6 step5
   in verify $ step7

-- |- (~p ==> q) ==> ~q ==> p
lemma5 = let
   step1  = Assume $ Not p :=>: q
   step2  = matchMP (instL [Not p, q] lemma4) step1
   step3  = Assume $ Not q
   step4  = MP step2 step3
   step5  = matchMP lemma2 step4
   step6  = discharge (Not q) step5
   in verify $ step6

I think I should be able to find a proof for it now. I will try to finish this off tomorrow.

epepke wrote:
This is reasonably easy to do in C++ by overloading the multiplication operator. I've done a units package based on this. A simple example:
...

By the way, I was curious and checked if boost already supported something like this, and yes indeed it does!
Based on an example in: http://www.boost.org/doc/libs/1_49_0/do ... Start.html
Code: Select all
#include <iostream>
#include <boost/units/systems/si.hpp>
#include <boost/units/io.hpp>

using namespace boost::units;
using namespace boost::units::si;

int main() {
        quantity<force>     F(2.0 * newton);
        quantity<length>    dx(2.0 * meter);
        auto E = F * dx;

        std::cout << "F  = " << F << std::endl;
        std::cout << "dx = " << dx << std::endl;
        std::cout << "E  = " << E << std::endl;
}

Prints:
Code: Select all
F  = 2 N
dx = 2 m
E  = 4 J

Re: Programming

#135  Postby VazScep » Apr 05, 2012 5:36 pm

mizvekov wrote:
VazScep wrote:
mizvekov wrote:Still working on that goddamn example! :grin:
By the way, I found that having a theorem and a function like this helps a lot:
I think I can see that working.

Unfortunately, I don't have my code to hand at the moment. But I can throw a lemma your way if you want a hint.

Okay, these are the lemmas I have so far:
Code: Select all
-- |- ~p ==> p ==> q
lemma1 = let
   step1  = Assume $ Not p
   step2  = instL [p, Not q] axiom1
   step3  = matchMP step2 step1
   step4  = matchMP axiom3 step3
   in verify step4
:grin: I'm seriously happy about this! I've no idea whether my solution is optimal, but you seem to have picked the same strategy, so that's cool. Basically, we want the fact that contradictions imply *anything*. If you have p and you have its negation, then you have arbitrary q.

Code: Select all
-- |- ~~p ==> p
lemma2 = let
   step1  = Assume $ Not $ Not p
   step2  = instL [p, Not $ Not p] lemma1
   step3  = matchMP step2 step1
   step4  = instL [p, Not $ Not p] axiom3
   step5  = matchMP step4 step3
   step6  = MP step5 step1
   in verify $ step6

-- |- p ==> ~~p
lemma3 = let
   step1  = instL [Not p] lemma2
   step2  = instL [Not $ Not p, p] axiom3
   step3  = mp step2 step1
   in head $ step3
Yeah, you're damn close at this point. And you don't want these as mere lemmas. These two give you the double-negation elimination conversion.

I think I should be able to find a proof for it now. I will try to finish this off tomorrow.
Easily. Cheers for getting into this stuff. I'm seriously impressed.

By the way, I was curious and checked if boost already supported something like this, and yes indeed it does!
I *believe* a templated dimensional analysis system is one of the case-studies in Alexandrescu's "Modern C++ Design". A friend of mine who worked in the physics department for a major game developer said they used the pattern a lot, but he said he also found it so verbose as to be almost useless. The big sell of the F# solution is in the fact that you write virtually no unit annotations and you get polymorphism. However, it's not going to be a serious shock to me to hear that the Boost folk figured that out too.

Re: Programming

#136  Postby mizvekov » Apr 06, 2012 7:33 pm

VazScep wrote::grin: I'm seriously happy about this! I've no idea whether my solution is optimal, but you seem to have picked the same strategy, so that's cool. Basically, we want the fact that contradictions imply *anything*. If you have p and you have its negation, then you have arbitrary q.

Yeah, I have no idea if it's optimal either. Frankly, it was the first move I could see; I can't even think of another way.

Yeah, you're damn close at this point. And you don't want these as mere lemmas. These two give you the double-negation elimination conversion.
Yeah, I still have to try that conversions language you posted though.

Easily. Cheers for getting into this stuff. I'm seriously impressed.

Well, so here it is, finally :thumbup:
Code: Select all
-- |- (~p ==> q) ==> (~p ==> ~q) ==> p
main = let
   step1  = Assume $ Not p :=>: q
   step2  = matchMP lemma5 step1
   step3  = Assume $ Not p :=>: Not q
   step4  = hyp_sylo step3 step2
   step5  = let
      step5_1 = instL [p, Not (q :=>: p)] lemma1
      step5_2 = instL [Not p, p, Not (q :=>: p)] axiom2
      in head $ mp step5_2 step5_1
   step6  = matchMP step5 step4
   step7  = instL [p, q :=>: p] axiom3
   step8  = matchMP step7 step6
   step9  = matchMP axiom3 step3
   step10 = MP step8 step9
   step11 = discharge (Not p :=>: Not q) step10
   in print $ printThm $ verify $ step11

Now, damn it, that took a while. :)
I am curious though about how much our proofs differ.

I *believe* a templated dimensional analysis system is one of the case-studies in Alexandrescu's "Modern C++ Design". A friend of mine who worked in the physics department for a major game developer said they used the pattern a lot, but he said he also found it so verbose as to be almost useless. The big sell of the F# solution is in the fact that you write virtually no unit annotations and you get polymorphism. However, it's not going to be a serious shock to me to hear that the Boost folk figured that out too.

Oh, it's a shame I haven't yet read that book, I definitely should have and there are no excuses.
I have no idea how complicated the Boost library is if you need to create custom base units, but it seems simple enough if all you need are the common ones. Guess I'll have to try it.

Re: Programming

#137  Postby VazScep » Apr 07, 2012 7:55 am

mizvekov wrote:I am curious though about how much our proofs differ.
So here's mine:

Code: Select all
-- ⊢ P → P
truth :: Theorem String
truth = let step1 = instL [p, p :=>: p] axiom1
            step2 = instL [p, p :=>: p, p] axiom2
            step3 = head $ mp step2 step1
            step4 = instL [p, p] axiom1 in
        head $ mp step3 step4

-- ⊢ ¬P → P → ⊥
lemma1 :: Theorem String
lemma1 = let step1 = UseTheorem (instL [Not p, Not f] axiom1)
             step2 = Assume (Not p)
             step3 = MP step1 step2
             step4 = matchMP axiom3 step3
         in verify step4

-- ⊢ (¬P → Q) -> (¬P → ¬Q) → P
-- This is Mendelson's axiom3.
mendelson :: Theorem String
mendelson = let step1  = Assume (Not p :=>: Not q)
                step2  = Assume (Not p :=>: q)
                step3  = Assume (Not p)
                step4  = MP step1 step3
                step5  = MP step2 step3
                step6  = matchMP lemma1 step4
                step7  = MP step6 step5
                step8  = discharge (Not p) step7
                step9  = matchMP axiom3 step8
                step10 = UseTheorem (instL [q] truth)
                step11 = MP step9 step10
                step12 = discharge (concl step2) step11
            in verify step12


My lemma isn't quite the same as yours. I basically introduced the notion of false (⊥ or f) as the negation of the tautology "P → P". But I can then easily use this to derive an arbitrary proposition.

Oh, it's a shame I haven' t yet read that book, I definitely should have and there are no excuses.
It's definitely going on my reading list if I ever land a C++ job. I'm thinking I'll go through Scott Meyers again, and then probably Sutter and Coplien too.

Re: Programming

#138  Postby mizvekov » Apr 07, 2012 5:10 pm

VazScep wrote:My lemma isn't quite the same as yours. I basically introduced the notion of false (⊥ or f) as the negation of the tautology "P → P". But I can then easily use this to derive an arbitrary proposition.

I see :) .
Well I was set on a longer strategy from the start. I saw that I could arrive on (~p ==> p) from the premises
(~p ==> q), (~p ==> ~q) by transforming it into (~p ==> q), (q ==> p) and then applying hypothetical syllogism. After that I just had to prove the tautology "(~p ==> p) ==> p", but that one is much harder. It seems in fact easier to prove it starting from Mendelson's third axiom instead.

It's definitely going on my reading list if I ever land a C++ job. I'm thinking I'll go through Scott Meyers again, and then probably Sutter and Coplien too.

By the way, you should definitely check out the D language sometime. It's a huge improvement over C++ IMHO. I think it's the project Alexandrescu is most involved with right now.

Re: Programming

#139  Postby mizvekov » Apr 13, 2012 5:47 pm

VazScep wrote:
Code: Select all
-- Basic conversions and conversionals

module Conversions where

import Bootstrap hiding (matchMP)
import Propositional
import Utils
import Control.Monad.Error

-- Conversions wrap functions which take terms and derive equations. Each
-- equation has an lhs which is the original term.
newtype Conv a = Conv (Term a -> [Theorem a])

applyC :: Conv a -> Term a -> [Theorem a]
applyC (Conv c) term = c term

-- Use a conversion to rewrite a theorem
convMP :: Conv String -> Theorem String -> [Theorem String]
convMP c thm = do eq  <- applyC c (theoremTerm thm)
                  imp <- matchMP eqMP eq
                  mp imp thm

-- Apply a conversion to the antecedent of a conditional
antC :: Conv String -> Conv String
antC (Conv c) = Conv f
  where f (p :=>: q) = c p >>= matchMP substLeft'
          where substLeft' = instM (Var . avoid (vars q)) [("p",q)] substLeft
        f _          = []
 
-- Apply a conversion to the consequent of a conditional       
conclC :: Conv String -> Conv String
conclC (Conv c) = Conv f
  where f (p :=>: q) = c q >>= matchMP substRight'
          where substRight' = instM (Var . avoid (vars p)) [("p",p)] substRight
        f _          = []

-- Apply a conversion to the body of a negation
notC :: Conv String -> Conv String
notC (Conv c) = Conv f
  where f (Not p) = c p >>= matchMP substNot
        f _       = []
       
-- Attempt the left conversion, and if it fails, attempt the right
orC :: Conv a -> Conv a -> Conv a
orC (Conv c) (Conv c') = Conv f
  where f t = c t `mplus` c' t
       
-- Try all conversions which succeed
tryC :: [Conv a] -> Conv a
tryC = foldr orC failC

-- Apply a conversion based on the term to be converted
ifC :: (Term a -> Bool) -> Conv a -> Conv a
ifC p c = Conv (\t -> if p t then applyC c t else [])

-- Apply one conversion after another
thenC :: Conv String -> Conv String -> Conv String
thenC c c' = Conv f
  where f t = do thm   <- applyC c t
                 r     <- rhs (theoremTerm thm)
                 thm'  <- applyC c' r
                 thm'' <- matchMP trans thm
                 matchMP thm'' thm'
        rhs (Not ((p :=>: q) :=>: Not (r :=>: s))) | p == s && q == r = return q
        rhs _  = []
       
-- The zero for orConv and identity for thenConv: always succeeds
allC :: Conv String
allC = Conv (\t -> return $ instL [t] reflEq)

-- The identity for orConv and zero for thenConv: always fails
failC :: Conv a
failC = Conv (const [])

-- Sequence conversions
everyC :: [Conv String] -> Conv String
everyC = foldl thenC allC
       
isImp :: Term a -> Bool
isImp (p :=>: q) = True
isImp _          = False

isNeg :: Term a -> Bool
isNeg (Not p) = True
isNeg _       = False

-- Apply a conversion to the first applicable term on a depth first search
depthConv1 :: Conv String -> Conv String
depthConv1 c = tryC [ c
                    , ifC isNeg (notC c')
                    , ifC isImp (antC c' `orC` conclC c') ]
  where c' = depthConv1 c
This is a pretty rich combinator language already, with some nice algebraic structure, but the goal here is to be able to write lots of new more sophisticated conversions by composing the primitive conversions. It's kind of overkill for propositional logic, but when you're working in higher-order theories, you want to use this sort of language to build powerful rewrite based automation and domain specific rewriting automation.
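The combinator algebra is easier to see in a stripped-down setting. Here is a self-contained toy version (only the combinator names are borrowed from the post; the Int-rewriting "conversions" are made up) where a conversion maps a value to the list of its possible rewrites:

```haskell
-- Toy analogue of the Conv combinators: a "conversion" maps an Int to
-- its possible rewrites, with the empty list meaning failure.
newtype Conv a = Conv (a -> [a])

applyC :: Conv a -> a -> [a]
applyC (Conv c) = c

-- Collect the results of both conversions (mplus on lists is (++)).
orC :: Conv a -> Conv a -> Conv a
orC (Conv c) (Conv c') = Conv (\t -> c t ++ c' t)

-- Apply one conversion after another.
thenC :: Conv a -> Conv a -> Conv a
thenC (Conv c) (Conv c') = Conv (\t -> c t >>= c')

allC :: Conv a          -- identity for thenC
allC = Conv (\t -> [t])

failC :: Conv a         -- identity for orC
failC = Conv (const [])

double, dec :: Conv Int
double = Conv (\n -> [2 * n])
dec    = Conv (\n -> [n - 1 | n > 0])  -- fails at 0

main :: IO ()
main = do
  print (applyC (double `thenC` dec) 3)  -- [5]
  print (applyC (dec `orC` double) 0)    -- [0]: dec fails, double fires
```

The real module layers term traversal (antC, conclC, notC, depthConv1) on top of exactly this algebra.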

Hey, I'm trying to give this one a go, but it fails to compile with several missing symbols:
Code: Select all
Conversions.hs:20:26: Not in scope: `matchMP'
Conversions.hs:20:34: Not in scope: `eqMP'
Conversions.hs:26:32: Not in scope: `matchMP'
Conversions.hs:27:69: Not in scope: `substLeft'
Conversions.hs:33:32: Not in scope: `matchMP'
Conversions.hs:34:70: Not in scope: `substRight'
Conversions.hs:40:29: Not in scope: `matchMP'
Conversions.hs:40:37: Not in scope: `substNot'
Conversions.hs:62:27: Not in scope: `matchMP'
Conversions.hs:62:35: Not in scope: `trans'
Conversions.hs:63:18: Not in scope: `matchMP'
Conversions.hs:69:39: Not in scope: `reflEq'

Re: Programming

#140  Postby VazScep » Apr 14, 2012 12:45 pm

mizvekov wrote:Hey, trying to give this one a go, but it fails to compile, several missing symbols:
Code: Select all
Conversions.hs:20:26: Not in scope: `matchMP'
Conversions.hs:20:34: Not in scope: `eqMP'
Conversions.hs:26:32: Not in scope: `matchMP'
Conversions.hs:27:69: Not in scope: `substLeft'
Conversions.hs:33:32: Not in scope: `matchMP'
Conversions.hs:34:70: Not in scope: `substRight'
Conversions.hs:40:29: Not in scope: `matchMP'
Conversions.hs:40:37: Not in scope: `substNot'
Conversions.hs:62:27: Not in scope: `matchMP'
Conversions.hs:62:35: Not in scope: `trans'
Conversions.hs:63:18: Not in scope: `matchMP'
Conversions.hs:69:39: Not in scope: `reflEq'
Ah, yeah. There's loads of other stuff in my Bootstrap.hs file. I provided matchMP a few posts back:

Code: Select all
-- matchMP (P → Q) (Γ ⊢ P') attempts to match P with P', and then applies MP.
matchMP :: Theorem String -> Proof -> Proof
matchMP imp ant =
  let antT = concl ant in
    case theoremTerm imp of
      p :=>: q ->
        case match p antT of
          [insts] -> MP (UseTheorem $ instM Var insts imp) ant
          _       -> error "MATCH MP"
      _ -> error "MATCH MP"     
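For intuition about what `match` does there: it's first-order matching of a pattern term against a concrete term. A self-contained toy version follows (the Term type is reconstructed from the thread; the Data.Map-based matcher is my guess at the spirit of the kernel, not VazScep's actual Bootstrap.hs code):

```haskell
import qualified Data.Map as M

-- Term type as used in the thread; the matcher below is illustrative.
data Term a = Var a | Term a :=>: Term a | Not (Term a)
  deriving (Show, Eq)
infixr 5 :=>:

-- match pat t: find a substitution for pat's variables that turns
-- pat into t, failing if a variable would need two different values.
match :: Ord a => Term a -> Term a -> Maybe (M.Map a (Term a))
match pat t0 = go pat t0 M.empty
  where
    go (Var v) t m = case M.lookup v m of
      Nothing -> Just (M.insert v t m)
      Just t' -> if t' == t then Just m else Nothing
    go (p :=>: q) (p' :=>: q') m = go p p' m >>= go q q'
    go (Not p) (Not p') m = go p p' m
    go _ _ _ = Nothing

main :: IO ()
main = do
  -- p matches ~a, q matches a
  print (fmap M.toList (match (Var "p" :=>: Var "q")
                              (Not (Var "a") :=>: Var "a")))
  -- p cannot be both a and b
  print (match (Var "p" :=>: Var "p") (Var "a" :=>: Var "b"))
```

matchMP then instantiates the implication with the substitution found and applies MP, which is why it fails with an error when no match exists.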


The rest are all theorems:

Code: Select all
-- ⊢ (P → Q → R) ↔ (P ∧ Q → R)
uncurry :: Theorem String

-- ⊢ ¬¬P → P
dblNegElim :: Theorem String

-- ⊢ P → ¬¬P               
dblNegIntro :: Theorem String

-- ⊢ P → Q → P ∧ Q
conjI :: Theorem String

-- ⊢ (X ↔ Y) → X → Y
eqMP :: Theorem String

-- ⊢ (X ↔ X)
reflEq :: Theorem String

-- ⊢ (X ↔ Y) ↔ (Y ↔ X)
symEq :: Theorem String

-- ⊢ (X ↔ Y) → (Y ↔ Z) → (X ↔ Z)
trans :: Theorem String

-- ⊢ (X ↔ Y) → (X → P ↔ Y → P)
substLeft :: Theorem String

-- ⊢ (X ↔ Y) → (P → X ↔ P → Y)
substRight :: Theorem String

-- ⊢ (X ↔ Y) → (¬X ↔ ¬Y)
substNot :: Theorem String
I can just send you the whole file if you'd like, though I had some fun getting these proofs.
