Music similarity checkers like this one compare two melodies or rhythmic patterns. The tool returns a score or percentage.
Does a high score mean infringement? Maybe. Does a low score mean no infringement? Maybe.
Frequently Asked Questions
Does this tool tell me whether my song infringes on another song?
No! And we’re sorry, but no tool can do that! Copyright infringement is not so simple; it is not merely a math problem; it is not just a percentage or a similarity score. Copyright infringement is a legal conclusion that depends on whether protectable expression was copied. A similarity percentage that comes out of a tool like this one measures only what I’d call “gross similarity”: a mechanical look at notes that can’t distinguish meaningful similarities from meaningless ones, or original expression from scènes à faire. I could go on forever. If there were a shortcut, I’d have found it. The tool produces smoke. Confirming or dismissing an actual fire is the domain of a forensic musicologist, one with training, judgment, and repertoire knowledge.
What does the similarity percentage actually mean?
Think of it as calculating gross similarity through a series of mechanical filters. The melody comparison transposes both inputs to a common key (that’s actually something I do too), then it effectively writes them on vellum paper, holds them up to the light (that’s what I do sometimes too), and makes some comparisons. It may measure the edit distance, which is a fancy way of asking how many notes you’d have to add, remove, or swap to turn one melody into the other. Fewer edits, closer match. The interval comparison looks at melody differently, measuring the distances between consecutive notes. The contour comparison simplifies the melodic arc to a zoomed-out view of whether the pitches go up, down, or stay the same. And the rhythm comparison quantizes tapped-out notes (snaps them to a fixed grid) and compares the rhythmic values by standard subdivisions. These are all isolated measurements of a single property. It’s data. Data is meaningful, but it needs to be properly applied. Music is rarely one property at a time. As Albert Einstein might’ve put it, forensic musicology should be as simple as possible, but no simpler. Notes get their meaning from context: how they relate to each other, to the harmony, and to the flow of time. You cannot find the musical truth with a tool like this. It would be scammy to suggest otherwise.
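For the curious, here is roughly what those mechanical filters look like as code. This is a hypothetical Python sketch of the general techniques described above (edit distance, intervals, contour, quantization), not the tool’s actual implementation; the function names, melodies, and the simple Levenshtein routine are all illustrative assumptions.

```python
# Hypothetical sketches of the mechanical comparisons described above.
# NOT the tool's actual code; names and melodies are illustrative.

def edit_distance(a, b):
    """Levenshtein distance: how many additions, removals, or swaps
    turn one note sequence into the other."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # remove a note
                          d[i][j - 1] + 1,        # add a note
                          d[i - 1][j - 1] + cost) # swap a note
    return d[m][n]

def intervals(pitches):
    """Distances (in semitones) between consecutive notes."""
    return [b - a for a, b in zip(pitches, pitches[1:])]

def contour(pitches):
    """Zoomed-out arc: up (+1), down (-1), or repeated (0)."""
    return [(b > a) - (b < a) for a, b in zip(pitches, pitches[1:])]

def quantize(onsets, grid=0.25):
    """Snap tapped onset times (in beats) to a fixed grid."""
    return [round(t / grid) * grid for t in onsets]

# Two hypothetical melodies as MIDI pitch numbers (C4 = 60).
melody_a = [60, 60, 67, 67, 69, 69, 67]
melody_b = [60, 60, 67, 67, 69, 71, 67]  # one note differs

print(edit_distance(melody_a, melody_b))  # → 1 (one swap apart)
print(intervals(melody_a))                # → [0, 7, 0, 2, 0, -2]
print(contour(melody_a))                  # → [0, 1, 0, 1, 0, -1]
print(quantize([0.0, 0.27, 0.51]))        # → [0.0, 0.25, 0.5]
```

Notice that each function measures exactly one property in isolation, which is the point: nothing in that code knows anything about harmony, structure, or originality.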
If the percentage is high, should I be worried?
Not necessarily. Not from this alone, certainly. I called this kind of thing “gross similarity,” but what you really want is something closer to net similarity, and a forensic musicology analysis introduces the metaphorical net that filters away what’s less relevant. A high score from a tool that doesn’t consider the originality of the underlying material is a nothing-burger. I’ll try to come up with an illustration on the spot. (Thinks a minute.) I asked myself, “Where else can I find the seven-note beginning of ‘Twinkle, Twinkle, Little Star’?” The first song that came to mind (I had to look up whose it was) is the one Tom Hanks sings in Bachelor Party, “Why Do Good Girls Like Bad Boys?” If you ran those seven notes through this little “Music Similarity Checker” you’d get a 100% match. Should we conclude that a 100% match across those seven consecutive notes means the songwriters and Tom Hanks stole “Twinkle, Twinkle”? Well, maybe, until we remember that “Twinkle” is really, really old and long since in the public domain. I’m getting off track here. The. Point. Is. people, that the 100% similarity there is pretty meaningless, whereas, conversely, a moderate-sounding score on highly original material could still indicate copying. A percentage like this one tells you that the sequences of pitches line up. (It might ideally also tell you that the rhythms and the harmonic context do too, which starts to sound like a lot if you’re a “Music Similarity Checker.”) But it does not tell you whether those identical notes are diamonds of forensic musicology or lumps of coal. Determining whether a similarity is significant requires expertise and rigor, forensic analysis that no mechanical tool provides.
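As a thumbnail of why that 100% is meaningless in isolation, here is a tiny hypothetical Python sketch. The naive_similarity scorer is my invention for illustration, not this checker’s (or any real checker’s) formula; it simply counts matching positions, which is all a gross-similarity score can ever do.

```python
# Hypothetical illustration: a naive pitch-sequence match reads "100%"
# whenever two melodies quote the same seven notes, no matter whether
# the material is original or centuries-old public domain.

TWINKLE_OPENING = [60, 60, 67, 67, 69, 69, 67]  # C C G G A A G

def naive_similarity(a, b):
    """Percent of positions where the pitches agree; a deliberately
    crude stand-in for a mechanical checker's score."""
    matches = sum(x == y for x, y in zip(a, b))
    return 100 * matches / max(len(a), len(b))

other_song_quote = [60, 60, 67, 67, 69, 69, 67]  # the same seven notes
print(naive_similarity(TWINKLE_OPENING, other_song_quote))  # → 100.0
```

The score is real; the conclusion it tempts you toward is not. Nothing in the arithmetic knows that the shared material belongs to no one.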
How is this different from other music copyright checkers?
Fundamentally, it isn’t. When someone builds something that works, I’ll want to use it. In the meantime, any tool that uses the aforementioned edit distance or similar algorithms to compare pitch sequences or rhythmic patterns is performing the same basic gross operation; whether it accepts audio input or uses a virtual piano makes, for now, no difference. The underlying math and its limitations don’t change: a mechanical comparison cannot determine whether copying occurred, or whether the shared elements are protectable, so it can’t know whether the similarity is legally meaningful. The Musicologize Music Similarity Checker is transparent about those limitations. That’s the point of it.
Why can’t a tool do what a forensic musicologist does?
Why can’t a radar gun on a pitcher do what an MLB baseball scout does? It’s amazing enough that it knows a pitcher throws mid-90s, but it certainly doesn’t know if they can get hitters out. A forensic musicologist evaluates music the way it actually functions, not merely as isolated sequences of pitch and rhythm. Musical elements derive their meaning from harmonic context, structural placement, interaction with other elements, and even relationship to the broader musical repertoire. I’m gradually running out of ways to illustrate it. A tool treats a three-note figure in a vacuum. A musicologist knows those notes can mean something entirely different at the beginning of a verse over a stable tonic chord than they do introducing a bridge over a pivot chord that modulates to a new key. An algorithm cannot weigh a melodic phrase against other works within and beyond the repertoire to determine whether it is original expression or a common building block that belongs to no one. These are judgments that require musical training, analytical experience, and knowledge of the case law. They are not reducible to a formula.
Can I use these results in a legal proceeding?
No, and doing so would likely be counterproductive. A similarity percentage from an online tool has no evidentiary value. Courts rely on a forensic musicology expert report, a courtroom-calibrated document that examines the works through exhibits built from forensic notation, parallel transcriptions, and prior art research. It helps to have the heart of a teacher, to build the explanations and illustrations that actually land with a judge or jury who don’t read music. Submitting a mechanical percentage without musicological context will not adequately support your position. It is more likely to undermine it by focusing on what I’ve termed “gross similarity” when the law knows to ask about net similarity. If you need analysis that holds up under legal scrutiny, you need an expert, not a calculator.
Why offer this tool if it can’t answer the real question?
Because the question people bring to tools like this is one worth taking seriously, even if no algorithm can answer it. If your ears are telling you two songs sound alike and you want to know whether there is fire behind that smoke, a comparison like this can confirm that the similarity has a measurable basis in the melodic or rhythmic content. That is a useful first step. The problem arises when a percentage is treated as a conclusion rather than a starting point. This tool is a starting point. The conclusion requires a conversation with someone who does this work for a living.
If you’re hearing something that concerns you, whether you’re worried someone copied you, worried you might have copied someone, or trying to clear a track before you release it, the next step is a conversation, not another tool. (212) 217-9512.
(212) 217-9512 · Schedule a call · The initial call is free.