HN Books @HNBooksMonth

The best books of Hacker News.

Hacker News Comments on
Constraint Programming in Music

Charlotte Truchet, Gerard Assayag · 2 HN comments
HN Books has aggregated all Hacker News stories and comments that mention "Constraint Programming in Music" by Charlotte Truchet, Gerard Assayag.
HN Books may receive an affiliate commission when you make purchases on sites after clicking through links on this page.
Amazon Summary
Constraint programming (CP) is a declarative programming paradigm with many academic and industrial applications (from n-queens to planning, vehicle routing, and optimization, among other fields). Music composition has been one of these applications since the earliest works on automatic harmonization, and it remains a very special and challenging one due to its artistic (and highly subjective) nature. The early works on CP in music were limited to classical music composition, as the harmonization and counterpoint rules naturally translate into constraints. However, when contemporary composers began to be interested in constraints, CP became an essential tool in computer-assisted composition systems. As several contemporary musical pieces have now been composed "with constraints", it is reasonable to ask why CP applies so naturally to music, and what the particular features of musical problems are. This book presents information about recently developed musical CP systems from both the scientist's and composer's point of view. It will therefore be of interest to students and researchers of music technology, composers in the computer music scene, and music software companies, especially those trying to model high-level musical behaviors (e.g., intelligent arpeggiation/arrangement on synthesizers, "Band in a Box" software, etc.), perform music data mining, and execute music taste engineering for online music delivery.
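The summary's mention of n-queens is worth unpacking: it is the canonical CP toy problem because the whole program is just constraints plus search. A minimal backtracking sketch in Python (mine, not from the book) shows that declarative flavor:

```python
def n_queens(n):
    """Place n queens on an n x n board so that none attack each other.
    Each solution is a tuple: solution[row] = column of that row's queen."""
    solutions = []

    def consistent(placed, col):
        # The constraints: no shared column, no shared diagonal.
        row = len(placed)
        return all(col != c and abs(col - c) != row - r
                   for r, c in enumerate(placed))

    def extend(placed):
        if len(placed) == n:
            solutions.append(tuple(placed))
            return
        for col in range(n):
            if consistent(placed, col):  # prune branches that violate a constraint
                placed.append(col)
                extend(placed)
                placed.pop()

    extend([])
    return solutions

print(len(n_queens(6)))  # the 6x6 board has 4 solutions
```

A real CP solver adds constraint propagation and search heuristics on top of this, but the shape is the same; harmonization and counterpoint systems swap the attack constraints for musical rules.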
HN Books Rankings

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this book.
If you'd prefer to take this discussion to email we can.

> Thanks for the post, the pull requests, and the questions!

You are very welcome, lol. There might be more coming as time permits. I'll keep an eye out on the progress.

> I'm not promising, though, that the answers aren't ridiculous/stupid! :-)

I've always felt like people are too afraid to share their ridiculous ideas anyway :-).

> I suppose the standard answer is: rule-based systems that have to perform some type of logical inference.

I think that my question was a bit too ambiguous. What I meant was an area where logic programming isn't currently commonly used (to me, the network configuration tool seems very out of place, for example) but where it would be a match made in heaven. I'm aware of expert systems :-).

> In the 1980s logic programming was mainstream, at least in Japan!

I'm actually aware, but for some reason it's not mainstream now.

> What hasn't taken off again is a full, general purpose logic programming language, like Prolog.

That's what I was asking about. E.g. would it make sense to write an OS in a logic/relational language? What sort of CPU support would be required to make this a serious effort? I think what I'm getting at is that I feel like Prolog isn't truly general purpose. I get that it's Turing complete, but a large portion of programs are much slower in Prolog (as you say yourself). Is this something that's "inherent" to logic languages? Even if HW support were better, would they always (I know, I know, no absolutes, but you get the gist) be MUCH slower, or could they be just slightly slower?

Like I realize that they are slower because they are fundamentally doing a search, but I'm wondering if (and I'm really just throwing ideas at the wall here) it's possible to leverage type systems and HW support to considerably reduce the runtime search space and do most of it at compile time. Are you familiar with liquid types (you mention dependent types, so I'm guessing yes)?

In your work, you seem to be gravitating towards dynamically typed languages. Wouldn't a lot of the work be easier in, say, Haskell or some other typed language? I'm sure you have good reasons for not having switched, and I'm interested in hearing those. Why are Lisps better for your work?

> "Never" is a very serious word! :-) I don't think we know enough about any form of programming to use words like "always" and "never."

I was actually thinking about wording it slightly less absolutely, but I was interested in whether there are any problems where logic programming would be a bad idea no matter what, so that the answer would basically be "never".

> Unless some language/killer-application combination appears out of the blue (think Ruby/Rails),

I think that's part of what I was asking. What would be a "killer app" for logic programming? Are there any problems out there that aren't currently being solved with logic programming but would really benefit from it?

Btw can you recommend some literature on the intersection of logic/constraint/relational programming and music? I've ordered this book https://amazon.com/dp/1848212887/

I'll check out the links you posted. I'm actually working full-time on this music composition app (logic/relational programming isn't the main contribution of the application, but I'm just trying to see if there's some low-hanging fruit I could pick easily and blow minds). If you have any sort of input on that, I'd be interested. I saw on your CV that you supervised some undergrad research in this area, so I'm guessing that you might not be unfamiliar with that intersection.

I did see the top Google results when I searched for the combination of those keywords (e.g. https://vimeo.com/55677313), so I think I'm interested in some sort of meta-analysis.

will_byrd
> If you'd prefer to take this discussion to email we can.

Sure. I'll take a crack at these followup questions first, though!

> I think that my question was a bit too ambiguous. What I meant was an area where logic programming isn't currently commonly used (to me, the network configuration tool seems very out of place, for example) but where it would be a match made in heaven. I'm aware of expert systems :-).

It doesn't surprise me to see Prolog in a configuration system, since configuration is a standard use of expert systems--for example, Xcon (https://en.wikipedia.org/wiki/Xcon). Getting a manager at Microsoft to allow the use of Prolog is the surprising thing to me! :)

> That's what I was asking about. E.g. would it make sense to write an OS in a logic/relational language? What sort of CPU support would be required to make this a serious effort? I think what I'm getting at is that I feel like Prolog isn't truly general purpose. I get that it's Turing complete, but a large portion of programs are much slower in Prolog (as you say yourself). Is this something that's "inherent" to logic languages? Even if HW support were better, would they always (I know, I know, no absolutes, but you get the gist) be MUCH slower, or could they be just slightly slower?

You can play all sorts of "extra-logical" tricks in Prolog to improve performance, at the cost of giving up declarativeness. Still, Prolog is not the language I'd reach for first when writing system software, as you point out.

Have you looked at Mercury? Mercury is a logic programming language that borrows heavily from Haskell. One cool thing about Mercury is that you define high-level relations, specifying types and other annotations for each relation. The Mercury compiler then compiles the high-level declarative relations into lower level, more efficient functional definitions. One high-level relation might be compiled to many functions, which are dispatched to at run-time based on the groundness of the arguments passed to the relation. Mercury is specifically designed for system programming, and also has support for parallelism.
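The mode/groundness idea is easier to see in code. Below is a hypothetical Python sketch of the dispatch described above; Mercury resolves the mode statically at compile time, and the names (`append_rel`, `UNBOUND`) are illustrative, not Mercury syntax:

```python
# Sketch of Mercury-style mode dispatch: one declared relation, multiple
# compiled "modes" chosen by which arguments are ground (known). Mercury
# picks the mode at compile time; here the choice is dynamic, for clarity.

UNBOUND = object()  # stand-in for a not-yet-known argument

def append_rel(xs, ys, zs):
    """The relation append(xs, ys, zs): zs is xs followed by ys."""
    if xs is not UNBOUND and ys is not UNBOUND:
        # Mode (in, in, out): deterministic, exactly one answer.
        yield (xs, ys, xs + ys)
    elif zs is not UNBOUND:
        # Mode (out, out, in): nondeterministic, enumerate every split of zs.
        for i in range(len(zs) + 1):
            yield (zs[:i], zs[i:], zs)
    else:
        raise ValueError("no mode matches this instantiation")

# Forward: compute a concatenation.
print(next(append_rel([1, 2], [3], UNBOUND)))       # ([1, 2], [3], [1, 2, 3])
# Backward: enumerate all four ways to split [1, 2, 3].
print(list(append_rel(UNBOUND, UNBOUND, [1, 2, 3])))
```

The payoff is that each mode compiles to ordinary, efficient functional code, which is exactly the trade-off noted below: unifications get compiled away, so fully general logic variables are lost.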

I should point out that Mercury relations aren't really full relations, like they are in miniKanren. Mercury compiles away unifications for efficiency, which means that Mercury can't handle more sophisticated uses of logic variables. For example, the quine-generating query would be impossible to specify in Mercury. But the resulting code is much more efficient.

Greg Rosenblatt has applied a few of the Mercury compiler techniques to the relational interpreter used by Barliman, which has sped up some synthesis queries by orders of magnitude. He suspects there are more gains to be made from borrowing Mercury optimizations.

> Like I realize that they are slower because they are fundamentally doing a search, but I'm wondering if (and I'm really just throwing ideas at the wall here) it's possible to leverage type systems and HW support to considerably reduce the runtime search space and do most of it at compile time. Are you familiar with liquid types (you mention dependent types, so I'm guessing yes)?

I'm not sure what you mean by hardware support, exactly. Are you talking about using accelerators like GPUs or FPGAs? If so, as far as I can tell miniKanren doesn't map well onto these accelerators, although the Intel Phi might be a decent match.

I suspect that running synthesis tasks on a beefy server, or even on a cluster, would result in a significant improvement to performance. Static analysis and compiler optimizations would probably make a much bigger difference, though.

I'm slightly familiar with liquid types. There may be ways for dependent types to prune parts of the search space statically, although I'm not sure. I'd like to explore combining miniKanren and Idris, to see where that leads.

> In your work, you seem to be gravitating towards dynamically typed languages. Wouldn't a lot of the work be easier in, say, Haskell or some other typed language? I'm sure you have good reasons for not having switched, and I'm interested in hearing those. Why are Lisps better for your work?

I like Scheme because it has very powerful meta-programming capabilities, it is easy to write meta-circular interpreters, it is trivial to write DSLs (without having to write parsers or whatever), and the abstract syntax and concrete syntax are essentially the same. Being able to just call 'read' to parse an expression, and being able to redefine built-in binding forms, for example, can be extremely handy.
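To illustrate the point about 'read': an s-expression reader fits in a dozen lines, which is why DSLs are so cheap in Scheme. Here is a rough Python sketch of the idea (not Scheme's actual reader), showing how concrete syntax and abstract syntax end up being the same shape:

```python
def read_sexp(src):
    """Minimal s-expression reader: parse one expression into nested lists.
    A sketch of why Lisp parsing is trivial; real readers handle strings,
    quoting, comments, and more."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()

    def parse(pos):
        if tokens[pos] == "(":
            lst, pos = [], pos + 1
            while tokens[pos] != ")":
                item, pos = parse(pos)
                lst.append(item)
            return lst, pos + 1            # skip the closing paren
        tok = tokens[pos]
        return (int(tok) if tok.lstrip("-").isdigit() else tok), pos + 1

    expr, _ = parse(0)
    return expr

# Concrete syntax in, abstract syntax out -- they have the same shape.
print(read_sexp("(define (square x) (* x x))"))
```

In Scheme the equivalent is just `(read)`, and the resulting list is directly the AST you interpret or transform.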

Haskell is a language chock full of interesting ideas. I don't love it, though. The syntax, lack of macros, all the extensions in GHC, etc., make for a language that is powerful but complex. I like languages that are powerful but small, containing only a few concepts. miniKanren itself only has three core operators; the relational interpreter uses an extended version of miniKanren with six operators, but we hope to get that down to five.
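For a sense of how small that core is, here is a microKanren-style sketch in Python of the machinery behind those operators: unification, fresh logic variables, and goal combinators. The names and details are illustrative, not Byrd's actual API, and real microKanren uses lazy streams so infinite answer sets work; this sketch uses plain lists:

```python
# A microKanren-style sketch: unification (eq), fresh logic variables (Var),
# and goal combination (disj/conj). Illustrative only; finite lists stand in
# for the lazy streams a real implementation needs.

class Var:
    """A logic variable; only its identity matters."""
    pass

def walk(t, subst):
    """Follow variable bindings in the substitution."""
    while isinstance(t, Var) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    """Return an extended substitution, or None on failure."""
    a, b = walk(a, subst), walk(b, subst)
    if a is b or (a == b and not isinstance(a, Var)):
        return subst
    if isinstance(a, Var):
        return {**subst, a: b}
    if isinstance(b, Var):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

def eq(a, b):          # the == goal: succeed when a and b unify
    def goal(subst):
        s = unify(a, b, subst)
        return [s] if s is not None else []
    return goal

def disj(g1, g2):      # logical or: answers from either goal
    return lambda subst: list(g1(subst)) + list(g2(subst))

def conj(g1, g2):      # logical and: thread answers through both goals
    return lambda subst: [s2 for s1 in g1(subst) for s2 in g2(s1)]

# "fresh" is just allocating a new Var and running a goal over it:
q = Var()
answers = disj(eq(q, 1), conj(eq(q, 2), eq(q, 2)))({})
print([walk(q, s) for s in answers])  # [1, 2]
```

Running the composed goal against the empty substitution enumerates every value of `q` that satisfies it, which is the whole interface: everything else in miniKanren is sugar over these pieces.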

In short, Scheme fits my programming style better. I'm not averse to using other languages, though. And miniKanren has been ported to dozens of host languages, including multiple Haskell implementations.

As far as the types making things easier, I'm not sure this is true. Many ML and Haskell programmers have looked at miniKanren and said, "this would be so much better with types!" After playing around with the implementation for a while, they usually say, "taking advantage of the types without losing the flavor of miniKanren is much harder than I thought." As far as I can tell, the only interesting embedding of miniKanren in a typed language is Dmitry Boulytchev's OCanren (https://github.com/dboulytchev/OCanren). Dmitry had to hack the OCaml type system at a low (and potentially unsafe) level to make the embedding possible. I would be interested in exploring OCanren for writing relational interpreters. I'm not exactly sure what it would buy us, though.

There are lots of interesting connections between dependent types and logic programming. I'd like to explore combining Idris and ideas from miniKanren, for example. To me, combining Idris and miniKanren seems more interesting and promising than combining Haskell and miniKanren, which has been done many times without apparent benefit.

Now, in contrast to the host language used for the embedding of miniKanren, implementing a typed language in the relational interpreter definitely makes synthesis easier. In other words, implementing a relational interpreter for an ML-like language, using miniKanren embedded in Scheme, is probably far better for synthesis than implementing a relational interpreter for Scheme, using miniKanren embedded in Haskell.

I definitely plan to investigate relational interpreters for typed languages (Michael Ballantyne has already done some work in this area, and showed great improvements in synthesis). I'm also open to exploring miniKanren embeddings in typed languages. This seems like a much trickier problem than you might think, though. OCanren looks like a promising start, and I look forward to playing with it more.

> I was actually thinking about wording it slightly less absolutely, but I was interested in whether there are any problems where logic programming would be a bad idea no matter what, so that the answer would basically be "never".

I think this largely depends on what you consider "logic programming". Mercury seems applicable to systems programming problems that aren't an ideal match for traditional Prolog, for example. And Curry allows functional programming and logic programming to be blended in interesting ways that blur the lines between the paradigms.

There are also approaches like Answer Set Programming, and even Answer Set Programming Modulo Theories, that use powerful solvers to solve problems that would be hopelessly inefficient with naive Prolog (or miniKanren) search.

So I'll stick with my original answer, and say it's far too soon to tell. :) We do need new languages and approaches, though!

> I think that's part of what I was asking. What would be a "killer app" for logic programming? Are there any problems out there that aren't currently being solved with logic programming but would really benefit from it?

Such a tiny percentage of programmers have even heard of Answer Set Programming; it's possible there are killer apps for ASP lying around, just waiting for someone to connect the ideas.

I suspect killer apps will be found for probabilistic logic programming, as the languages and implementations improve, and as more people become comfortable with probabilistic programming.

And, of course, I'm hoping program synthesis using logic programming will be a killer app! :)

> Btw can you recommend some literature on the intersection of logic/constraint/relational programming and music? I've ordered this book https://amazon.com/dp/1848212887/

> I'll check out the links you posted. I'm actually working full-time on this music composition app (logic/relational programming isn't the main contribution of the application, but I'm just trying to see if there's some low-hanging fruit I could pick easily and blow minds). If you have any sort of input on that, I'd be interested. I saw on your CV that you supervised some undergrad research in this area, so I'm guessing that you might not be unfamiliar with that intersection.

> I did see the top Google results when I searched for the combination of those keywords (e.g. https://vimeo.com/55677313), so I think I'm interested in some sort of meta-analysis.

I'd be happy to talk to you about this in more depth, maybe via hangouts. I think it depends very much on what you want to do. For example, synthesis of music that pleases humans is still an open problem. But other problems seem more tractable.

HN Books is an independent project and is not operated by Y Combinator or Amazon.com.