Field Report: LinkedIn - A Case Study.
Social Media. As a musician who wants to share his work in public, I am subject to the business of music. The music business used to be rife with human gatekeepers; now there are few. The biggest issue is that, with the floodgates open, you need to stand out from the noise. The common refrain is “use social media.” Sure, but to play the game well, you need to know the rules.
Rule #1 - Social Media companies (via the algorithm) are the new gatekeepers. Although not human, they are still opaque and serve their masters: engagement and virality (a.k.a. do NOT leave this platform).
Next week we’ll go broader and talk about the whole playing field. This week, let’s focus on LinkedIn. Why? ’Cause it’s my primary social media platform, it’s the one we’re on right now, and I have real data to look at. C’mon, it’ll be fun!
—
My Top Level (Naive?) Expectations: LinkedIn is a professional network with a publishing layer. At a minimum, LinkedIn should support and preserve your existing connections with the possibility of creating new ones. It should:
Show my posts to my 1st-degree connections (those that ‘opted in’ to be my connection)
Deliver my articles to newsletter subscribers (those that ‘opted in’ to receive my newsletter)
Surface my “high-performing” articles to new audiences for discovery
Provide me with useful performance metrics
Give me (and you) visibility into what is happening and why
LinkedIn “Caveat” (as published by LI): LinkedIn pre-filters content, even to your 1st-level connections, before it goes ANYWHERE. It is first screened for spam, then evaluated for “meaningful” engagement potential. Only after passing those gates is it sorted by relevance based on audience type, content format, and recent member activity.
User visibility into the process and its execution is nil. You get no actionable information, and I get no actionable information. Further, this filtering happens irrespective of whether we are first-level connections. Some might say, “friends.” LI says . . . “Meh.”
What is LinkedIn Doing and Why? First, I don’t know and neither do you. We see fingerprints everywhere, but we don’t actually know, because neither LinkedIn nor any of the other social media platforms shares this information. That said, the Meta antitrust trial has produced some interesting revelations. Most specifically, it revealed that Meta had significant concerns that prioritizing friend content might cost it valuable ‘boost’ revenue. So yeah: friends are fine, unless there’s money on the table.
Quick Question for Reflection Throughout - Are we comfortable with an algorithmic intermediary sorting our mail without our guidance or control, and with minimal to no feedback on what it’s doing?
Ever Checked Your Spam Filter? There’s some weird stuff in there, but every now and again you’ll find a gem. What if you checked your spam filter after it sorted all of your mail for virality or so-called engagement potential? I may want all of my dad’s emails to go to spam, but even I couldn’t do that to him.
—
The Data. So, the best we can do is a little forensic work. We’ll use the last four articles from my newsletter, The Second Act with RiF. It’s riveting, it’s a nail-biter, and it’s also my life.
The Data (4 Articles Published)
Article Views: ~130 per post; except ~310 on Article 2
Member Reach: 2.3k (article 1), 8.5k (article 2), 360 (article 3) and 360 (article 4)
CTR (Views / Reach): ~2-4% (Articles 1&2); ~35–40% (Articles 3&4)
Public Engagement: High on 1 & 2, lower on 3 & 4
Total Connections: ~700; Second Act with RiF Total Subscribers: ~115
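If you want to check the arithmetic behind those CTR figures, here’s a quick sketch. All inputs are the rounded estimates from this post, not exact platform numbers, so the outputs are approximate too:

```python
# Rounded figures from the four newsletter articles above (estimates, not platform exports).
articles = {
    1: {"views": 130, "reach": 2300},
    2: {"views": 310, "reach": 8500},
    3: {"views": 130, "reach": 360},
    4: {"views": 130, "reach": 360},
}

for n, a in articles.items():
    # CTR as used in this post: article views divided by member reach.
    ctr = a["views"] / a["reach"] * 100
    print(f"Article {n}: {a['views']} views / {a['reach']:,} reach = {ctr:.1f}% CTR")
```

Note that with these rounded inputs, article 1 computes a bit above the ~2-4% range quoted above; the exact platform figures presumably differ from my rounded ones, which is rather the point about opaque metrics.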
So, What Happened?
First, I don’t know and neither do you. You may have theories and I do too, BUT for some reason LI sent article 1 to 2,300+ members, article 2 to 8,500+ members, article 3 to 360 members and article 4 to 360 members.
Here’s my best guess. I had a honeymoon (sizing-up) period where the algorithm went all out to see how far my content could travel. 8,500+ people on article 2?! I can assure you that I do not know who they all are, but I do wish them well. LI was so diligent, your dog might have even seen my second article. That said, the CTR went all the way down to ~2%. Now even performative spammers were beating me. But my public engagement was really high. So . . . LI kept rolling? Engagement wins over CTR? By how much, and when? Actual usable data for real humans? Nah.
On articles 3 and 4, maybe a pattern emerges: high CTR (35-40%) with the same member reach for both articles (your dog did NOT get these articles). However, much lower engagement. C’mon guys, you’ve all got thumbs! Does lower engagement mean lower reach? Once again, actual usable data for real humans? Nah.
Maybe send my posts to all of my connections? My newsletter to all of my subscribers? Crazy talk.
Where does this leave us? This may not be a Good Will Hunting situation, but it’s a pile of numbers with no sense of mechanism. There’s no visibility into what was tried, what wasn’t, and what’s driving performance. I don’t even know if the information is accurate, or how key terms are defined. I get information that I can mix and match to create very different narratives.
Possible Narratives -
Engagement Rules. Engagement is the thing. Articles 1 and 2 were great; 3 and 4 were fine. Why? For 3 and 4, not enough people showed up in the comments and jumped up and down. On 1 and 2, the engagement was so good that LI showed my stuff to anyone and everyone, including an audience that was ~98% not interested.
No, Content Is King. LI likes quick, actionable content, preferably with a tight CTA and an infographic. It’s no TikTok, but it ain’t Substack or Medium either. Perhaps a 5+ minute read is just too much? Notice the length of this piece?
The tools don’t explain the system. And that’s the problem.
The data above is what you get. All you get. On the receiving end you get even less: no knowledge of why you are getting what you are getting, and minimal tools to sculpt it the way you want. On the creator side, this is a bit more dire, as this may be someone’s profession.
Sure, we could all “get good” and beat the system. Maybe. But why do we have to be forensic scientists to know whether an article was sent to a subscriber or to one’s first-level connections?
So What Should Happen?
Forget reach. Let’s just focus on basic platform trust:
If I send a newsletter to a subscriber, tell me it was delivered.
If I have connections, tell me you showed my posts to them.
If for some reason you decide it’s not appropriate to deliver something to a subscriber or a connection (opt-outs, spam), tell me why.
If a post “underperforms” or “overperforms,” tell me what that means and which metric lets me see it.
If you want to give me reach, great, but let me know how and why you decided to extend it.
Don’t make either recipients or creators guess WHETHER or WHY something was delivered or received.
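To make the ask concrete, here’s a sketch of the minimal delivery receipt a platform could expose per post. This is entirely hypothetical, my invention and not any real LinkedIn API; the field names and reason codes are illustrative:

```python
from dataclasses import dataclass

# Hypothetical delivery receipt, NOT a real LinkedIn API.
# One record per intended recipient (a connection or a newsletter subscriber).
@dataclass
class DeliveryReceipt:
    recipient_type: str  # "connection" or "subscriber"
    delivered: bool      # did it actually reach their feed or inbox?
    reason: str          # if not delivered: e.g. "opt_out", "spam_filter", "ranked_out"

def summarize(receipts: list[DeliveryReceipt]) -> dict:
    """Roll receipts up into the creator-facing answer:
    how many got it, and why the rest didn't."""
    summary = {"delivered": 0, "suppressed": {}}
    for r in receipts:
        if r.delivered:
            summary["delivered"] += 1
        else:
            summary["suppressed"][r.reason] = summary["suppressed"].get(r.reason, 0) + 1
    return summary
```

With something like this, “did my newsletter reach my ~115 subscribers?” becomes a one-line answer instead of forensic work.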
Your Answers?
I’ll admit that I am a social media Luddite and late to the game (yes, game) of using social media. I don’t know if I would have jumped back in if it weren’t for my music project. Even then I am dubious. Also, I am NOT the expert and I may be VERY naive, but I’d love to hear your thoughts. We have the best of the best on here, and I want to know if I am missing anything, because LinkedIn is actually better than anything else I’ve encountered so far. Even so, it is FAR from what I would deem an acceptable filter of information in a social network, especially one geared for professionals.
Are we cool with this?
Note - let’s calibrate clarity first and then decide what level of outrage is on the table. The Playing Field is up next week. More social media firms will be on the hot seat.