
New Ratings System


A number of folks have asked for a more comprehensive ratings system. A few weeks ago I asked everyone for their ideas on some criteria they would like to see and I have tried to present them in a meaningful way. It’s worth mentioning that the criteria cannot be changed after the fact. If I’ve missed anything glaringly obvious we can always start over.

Because larger libraries are, for the most part, not accessible to the average composer, the ratings here will reflect the most accessible, smaller libraries. Also, while any system is subjective and far from perfect, it is at least something!

I will keep the older rating system results available for the time being.

[Update 11-05-2010]: The new ratings system turned out to be more problematic than it was worth so I went back to the old one. I will reiterate what I said above: “Because larger libraries are, for the most part, not accessible to the average composer, the ratings here will reflect the most accessible, smaller libraries. Also, while any system is subjective and far from perfect, it is at least something!”

14 thoughts on “New Ratings System”

  1. [Comments removed by moderator. Same old ranting, no useful suggestions.]

    I won’t comment on this again as I am getting a tad repetitive, and I’m sure annoying a lot of ratings lovers.

    [Let’s hope!]

    • It’s not meant to be a precise system, but what I think it can do is help us all understand what kind of business any given library is. Are they bad at communication, but still worth the wait?

      Who knows, maybe it’ll encourage libraries to improve in those areas, now that they’re being compared and rated on them?

      Currently, libraries get a great rating because they are great with communication. I want to know if they’re slow at getting back, but also how much money I’m likely to make. The ratings won’t ever be perfect, but you can rate the ‘actual income generated’ by giving the best library you’ve worked with a 5 and the worst a 1. I work with about a dozen libraries, and have tried even more. I’m sure every other composer can rate the ‘income generated’ based on their own experience, comparing the performance of each library they’ve dealt with.

  2. You can’t have a useful ratings system based on ad hoc generalities and subjective opinions – in the main seemingly based upon:

    a) ‘they didn’t get back to me so they’re crap’
    b) ‘they got back to me very quickly, they’re great’
    c) ‘the uploader doesn’t work very well’
    d) ‘the uploader was great!’
    e) ‘they rejected my tracks without even giving a reason!’
    f) ‘a brilliant library!!!! – they accepted all 627 of my Peruvian nose flute drones!’

    Now, if you had the time and manpower you could construct a decent system. For example, several years ago when approaching music libraries with showreels, I had no idea which libraries were decent. I knew nada. So I went onto the PRS database in the UK, where it is possible to see what level of usage each publisher gets. It is immediately apparent, after a few hours spent browsing there, which are the good libraries and which will earn you placements and money. Over the subsequent years, this has proved to be accurate info.

    So if you want ratings to actually mean something, in my humble opinion, they need to be built on something solid. And that can only be achieved through a substantial amount of hard work based on data collection.

    IMO

    • I totally agree with you, Jello. The ratings, even with these new ideas, IMHO are ridiculous. Done the way they are done here, they will always be based on “feel good” factors.

      One gaping flaw in rating libraries is that it leaves out anything about the music itself. Whether or not tracks get placed is a function of many things: yes, the attributes of the library (contacts, marketing, organization, etc.) are a factor, but so is the marketability of the music, how well it fits into a defined category, what requests come up, what type of clients, and a whole lot of ‘right place at the right time’ luck. Should a library that’s been given a low rating here publish a list of all their composers and how they rate their music?

      And right now I’m totally pissed… Only 626 of my 627 Peruvian nose flute drones were accepted by Some Library Out There.com… 🙂

      • Hmm… I hope the one they didn’t accept wasn’t the last one you composed. Maybe you’re losing the knack for flute drones. 😀

        Ratings for anything will always be questionable at best. There’s no way to guarantee accurate information. It’s all about opinions. Just accept it for what it is.

  3. I think “fairness of contract” could be one of the categories. I also think limiting it to five categories is correct; as you said, we need to be sure of them since they can’t be changed. Just thinking aloud, will mull over this further and get back to you. Thanks Art.

  4. You’re doing a great job, it’s all very helpful. I would like to know about these companies:

    a. fair contract

    b . success rate, placements

    c. type of licenses (TV, commercials, or personal use, etc.)

    d. communication, punctuality, etc.

