Tuesday, January 31, 2017

Played a @pgatour @phoenixopen Tonight!

So excited to be joining @thetimmcgraw @faithhill #soul2soul!

🇬🇧🇬🇧🇬🇧🇬🇧🇬🇧🇬🇧

Sunshine On The Line

From last night @theryman with @charles_esten @cmt #nashville

Google Search Console Reliability: Webmaster Tools on Trial

Posted by rjonesx.

There are a handful of data sources relied upon by nearly every search engine optimizer. Google Search Console (formerly Google Webmaster Tools) has perhaps become the most ubiquitous. There are simply some things you can do with GSC, like disavowing links, that cannot be accomplished anywhere else, so we are in some ways forced to rely upon it. But, like all sources of knowledge, we must put it to the test to determine its trustworthiness — can we stake our craft on its recommendations? Let's see if we can pull back the curtain on GSC data and determine, once and for all, how skeptical we should be of the data it provides.

Testing data sources

Before we dive in, I think it is worth having a quick discussion about how we might address this problem. There are basically two concepts that I want to introduce for the sake of this analysis: internal validity and external validity.

Internal validity refers to whether the data accurately represents what Google knows about your site.

External validity refers to whether the data accurately represents the web.

These two concepts are extremely important for our discussion. Depending upon the problem we are addressing as SEOs, we may care more about one or the other. For example, let's assume that page speed is an incredibly important ranking factor and we want to help a customer. We would likely be concerned with the internal validity of GSC's "time spent downloading a page" metric because, regardless of what happens to a real user, if Google thinks the page is slow, we will lose rankings. We would rely on this metric insofar as we were confident it represented what Google believes about the customer's site. On the other hand, if we are trying to prevent Google from finding bad links, we would be concerned about the external validity of the "links to your site" section because, while Google might already know about some bad links, we want to make sure there aren't any others that Google could stumble upon. Thus, depending on how comprehensively GSC's sample links describe the links across the web, we might reject that metric and use a combination of other sources (like Open Site Explorer, Majestic, and Ahrefs) that give us greater coverage.

The point of this exercise is simply to say that we can judge GSC's data from multiple perspectives, and it is important to tease these out so we know when it is reasonable to rely upon GSC.

GSC Section 1: HTML Improvements

Of the many useful features in GSC, Google provides a list of common HTML errors it discovered in the course of crawling your site. This section, located at Search Appearance > HTML Improvements, lists several potential issues, including duplicate titles, duplicate descriptions, and other actionable recommendations. Fortunately, this first example gives us an opportunity to outline methods for testing both the internal and external validity of the data. As you can see in the screenshot below, GSC has found duplicate meta descriptions because the website has case-insensitive URLs and no canonical tag or redirect to fix it. Essentially, you can reach the page from either /Page.aspx or /page.aspx, and Googlebot had found the URL both with and without capitalization. Let's test Google's recommendation to see if it is externally and internally valid.

External Validity: In this case, external validity is simply whether the data accurately reflects pages as they appear on the Internet. As one can imagine, the list of HTML improvements can be woefully out of date, depending upon the crawl rate of your site. In this case, the site had previously repaired the issue with a 301 redirect.

This really isn't terribly surprising. Google shouldn't be expected to update this section of GSC every time you apply a correction to your website. However, it does illustrate a common problem with GSC. Many of the issues GSC alerts you to may have already been fixed by you or your web developer. I don't think this is a fault with GSC by any stretch of the imagination, just a limitation that can only be addressed by more frequent, deliberate crawls like Moz Pro's Crawl Audit or a standalone tool like Screaming Frog.
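
If you'd rather script that check yourself, it's a small exercise. Below is a minimal Python sketch that flags case-only duplicate URLs in a crawl export; the URLs shown are hypothetical, and a real audit would also want to normalize trailing slashes and query strings:

```python
from collections import defaultdict

def find_case_duplicates(urls):
    """Group crawled URLs that differ only by letter casing -- the same
    situation the HTML Improvements report flags as duplicate titles or
    descriptions when no canonical tag or redirect is in place."""
    groups = defaultdict(set)
    for url in urls:
        groups[url.lower()].add(url)
    # Keep only the groups where more than one distinct casing was crawled
    return {key: variants for key, variants in groups.items() if len(variants) > 1}

# Hypothetical crawl export for illustration:
crawled = [
    "https://example.com/Page.aspx",
    "https://example.com/page.aspx",
    "https://example.com/about",
]
print(find_case_duplicates(crawled))
# -> {'https://example.com/page.aspx': {'https://example.com/Page.aspx', 'https://example.com/page.aspx'}}
```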

Internal Validity: This is where things start to get interesting. While it is unsurprising that Google doesn't crawl your site so frequently as to capture updates to your site in real-time, it is reasonable to expect that what Google has crawled would be reflected accurately in GSC. This doesn't appear to be the case.

By executing an info: query in Google for the upper-case version of the URL, we can determine what Google knows about it. Google returns results for the lower-case version of the URL! This indicates that Google both knows about the 301 redirect correcting the problem and has applied it in their search index. As you can imagine, this presents us with quite a problem. HTML Improvement recommendations in GSC not only may fail to reflect changes you have made to your site; they may not even reflect corrections Google is already aware of. Given this discrepancy, it almost always makes sense to crawl your site for these types of issues in addition to using GSC.

GSC Section 2: Index Status

The next metric we are going to tackle is Google's Index Status, which is supposed to provide you with an accurate number of pages Google has indexed from your site. This section is located at Google Index > Index Status. This particular metric can only be tested for internal validity since it is specifically providing us with information about Google itself. There are a couple of ways we could address this...

  1. We could compare the number provided in GSC to site: commands
  2. We could compare the number provided in GSC to the number of internal links to the homepage in the internal links section (assuming 1 link to homepage from every page on the site)

We opted for both. The biggest problem with this particular metric is being certain what it measures. Because GSC allows you to authorize the http, https, www, and non-www versions of your site independently, it can be unclear exactly what is included in the Index Status metric.

We found that when carefully applied to ensure no crossover of varying types (https vs http, www vs non-www), the Index Status metric seemed to be quite well correlated with the site:site.com query in Google, especially on smaller sites. The larger the site, the more fluctuation we saw in these numbers, but this could be accounted for by approximations performed by the site: command.

We found the link count method to be difficult to use, though. Consider the graphic above. The site in question has 1,587 pages indexed according to GSC, but the home page of that site has 7,080 internal links pointing to it. That implies roughly 4.5 links back to the home page from every page (7,080 ÷ 1,587 ≈ 4.46), which seems highly unrealistic: we were unable to find a single page, much less the majority of pages, with 4 or more links back to the home page. However, given the consistency between the site: command and GSC's Index Status, I believe this is more of a problem with the way internal links are represented than with the Index Status metric.
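
For anyone who wants to repeat this sanity check, the arithmetic is simple enough to script. In the sketch below, the Index Status and homepage link counts come from the example above, while the site: count is a hypothetical number you would read off a Google results page yourself:

```python
def index_status_sanity_check(gsc_indexed, site_query_count, links_to_home):
    """Compare GSC's Index Status against the two signals described above.

    All three inputs are read off manually (GSC's Index Status count, a
    site: query in Google, and GSC's internal-links count for the home
    page); this function is just the arithmetic of the comparison."""
    return {
        # Close to 1.0 means the site: estimate and Index Status roughly agree
        "site_vs_gsc_ratio": round(site_query_count / gsc_indexed, 2),
        # Close to 1.0 if every indexed page links to the home page exactly once
        "implied_home_links_per_page": round(links_to_home / gsc_indexed, 2),
    }

print(index_status_sanity_check(gsc_indexed=1587, site_query_count=1540,
                                links_to_home=7080))
# -> {'site_vs_gsc_ratio': 0.97, 'implied_home_links_per_page': 4.46}
```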

I think it is safe to conclude that the Index Status metric is probably the most reliable one available to us in regards to the number of pages actually included in Google's index.

GSC Section 3: Internal Links

The Internal Links section found under Search Traffic > Internal Links seems to be rarely used, but can be quite insightful. If external links tell Google what others think is important on your site, then internal links tell Google what you think is important on your site. This section once again serves as a useful example of knowing the difference between what Google believes about your site and what is actually true of your site.

Testing this metric was fairly straightforward. We took the internal links numbers provided by GSC and compared them to full site crawls. We could then determine whether Google's crawl was fairly representative of the actual site.

Generally speaking, the two were modestly correlated with some fairly significant deviation. As an SEO, I find this incredibly important. Google does not start at your home page and crawl your site in the same way that your standard site crawlers do (like the one included in Moz Pro). Googlebot approaches your site via a combination of external links, internal links, sitemaps, redirects, etc. that can give a very different picture. In fact, we found several examples where a full site crawl unearthed hundreds of internal links that Googlebot had missed. Navigational pages, like category pages in the blog, were crawled less frequently, so certain pages didn't accumulate nearly as many links in GSC as one would have expected having looked only at a traditional crawl.

As search marketers, in this case we must be concerned with internal validity, or what Google believes about our site. I highly recommend comparing Google's numbers to your own site crawl to determine whether there is important content that Google believes you have neglected in your internal linking.
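
If you export per-URL internal link counts from GSC and from your own crawler, the comparison takes only a few lines. A minimal sketch with no dependencies beyond the standard library; the exact export formats will vary by crawler:

```python
from math import sqrt

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def compare_internal_links(gsc_counts, crawl_counts):
    """Line up per-URL internal link counts from GSC and from your own
    crawl. URLs missing from one source count as zero there; the pages
    your crawler found but Google never credited are often the story."""
    urls = sorted(set(gsc_counts) | set(crawl_counts))
    gsc = [gsc_counts.get(u, 0) for u in urls]
    crawl = [crawl_counts.get(u, 0) for u in urls]
    missed_by_google = [u for u in urls
                        if crawl_counts.get(u, 0) > 0 and gsc_counts.get(u, 0) == 0]
    return pearson(gsc, crawl), missed_by_google
```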

GSC Section 4: Links to Your Site

Link data is always one of the most sought-after metrics in our industry, and rightly so. External links continue to be the strongest predictive factor for rankings and Google has admitted as much time and time again. So how does GSC's link data measure up?

In this analysis, we compared the links reported by GSC with those reported by Ahrefs, Majestic, and Moz, checking whether each link was still live. To be fair to GSC, which provides only a sampling of links, we used only sites that had fewer than 1,000 total backlinks, increasing the likelihood that we would get a full picture (or at least close to it) from GSC. The results are startling. GSC's lists, both "sample links" and "latest links," were the lowest-performing in terms of "live links" for every site we tested, never once beating out Moz, Majestic, or Ahrefs.
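
The liveness check itself is easy to reproduce. Here's a minimal sketch, assuming you've collected each index's reported source URLs into a list; requests and BeautifulSoup are the only dependencies, and a production version would add politeness delays and retries:

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

def link_is_live(source_url, target_domain, timeout=10):
    """Fetch a reported linking page and check whether it still contains
    an anchor pointing at the target domain."""
    try:
        resp = requests.get(source_url, timeout=timeout,
                            headers={"User-Agent": "link-audit-sketch/0.1"})
        resp.raise_for_status()
    except requests.RequestException:
        return False  # a dead page counts as a dead link
    soup = BeautifulSoup(resp.text, "html.parser")
    return any(target_domain in urlparse(a["href"]).netloc
               for a in soup.find_all("a", href=True))

def live_rate(source_urls, target_domain):
    """Share of reported backlinks that are still live -- the metric used
    to compare GSC's samples against the third-party indexes."""
    return sum(link_is_live(u, target_domain) for u in source_urls) / len(source_urls)
```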

I do want to be clear and upfront about Moz's performance in this particular test. Because Moz has a smaller total index, it is likely we only surface higher-quality, long-lasting links. Our out-performing Majestic and Ahrefs by just a couple of percentage points is likely a side effect of index size and not reflective of a substantial difference. However, the several percentage points which separate GSC from all 3 link indexes cannot be ignored. In terms of external validity — that is to say, how well this data reflects what is actually happening on the web — GSC is out-performed by third-party indexes.

But what about internal validity? Does GSC give us a fresh look at Google's actual backlink index? The two do appear to be consistent: GSC rarely reports links that Google already knows are out of its index. We randomly selected hundreds of URLs whose links were "no longer found" according to our liveness test, and in every case Googlebot still had the old versions cached. While we can't be certain that GSC shows a complete picture of Google's link index relative to your site, we can be confident that it tends to show only results that are in accord with Google's latest data.

GSC Section 5: Search Analytics

Search Analytics is probably the most important and heavily utilized feature within Google Search Console, as it gives us some insight into the data lost with Google's "Not Provided" updates to Google Analytics. Many have rightfully questioned the accuracy of the data, so we decided to take a closer look.

Experimental analysis

The Search Analytics section gave us a unique opportunity to utilize an experimental design to determine the reliability of the data. Unlike some of the other metrics we tested, we could control reality by delivering clicks under certain circumstances to individual pages on a site. We developed a study that worked something like this (a sketch of the page-generation step follows the list):

  1. Create a series of nonsensical text pages.
  2. Link to them from internal sources to encourage indexation.
  3. Use volunteers to perform searches for the nonsensical terms, which inevitably reveal the exact-match nonsensical content we created.
  4. Vary the circumstances under which those volunteers search to determine if GSC tracks clicks and impressions only in certain environments.
  5. Use volunteers to click on those results.
  6. Record their actions.
  7. Compare to the data provided by GSC.
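
The original test pages aren't published, but step 1 is straightforward to replicate. Here's a sketch of one way such pages could be generated; the consonant-vowel scheme is just a convenient way to produce terms that are pronounceable enough to search but absent from the index:

```python
import random

def nonsense_term(length=10):
    """A random alternating consonant-vowel token, e.g. 'kodatufemi',
    chosen to be searchable but vanishingly unlikely to already rank."""
    consonants, vowels = "bcdfghjklmnpqrstvwz", "aeiou"
    return "".join(random.choice(consonants if i % 2 == 0 else vowels)
                   for i in range(length))

def make_test_page(term, n_words=200):
    """A minimal HTML page built around one exact-match nonsense term."""
    body = " ".join(term if i % 20 == 0 else nonsense_term(6)
                    for i in range(n_words))
    return (f"<html><head><title>{term}</title></head>"
            f"<body><p>{body}</p></body></html>")

term = nonsense_term()
print(term)  # the phrase volunteers will later search for
with open(f"{term}.html", "w") as f:
    f.write(make_test_page(term))
```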

We decided to check 5 different environments for their reliability:

  1. User performs search logged into Google in Chrome
  2. User performs search logged out, incognito in Chrome
  3. User performs search from mobile
  4. User performs search logged out in Firefox
  5. User performs the same search 5 times over the course of a day

We hoped these variants would answer specific questions about the methods Google used to collect data for GSC. We were sorely and uniformly disappointed.

Experimental results

Method               Delivered   GSC Impressions   GSC Clicks
Logged In Chrome     11          0                 0
Incognito            11          0                 0
Mobile               11          0                 0
Logged Out Firefox   11          0                 0
5 Searches Each      40          2                 0

GSC recorded only 2 impressions out of 84, and absolutely 0 clicks. Given these results, I was immediately concerned about the experimental design. Perhaps Google wasn't recording data for these pages? Perhaps we didn't hit a minimum number necessary for recording data, only barely eclipsing that in the last study of 5 searches per person?

Unfortunately, neither of those explanations made much sense. In fact, several of the test pages picked up impressions by the hundreds for bizarre, low-ranking keywords that just happened to occur at random in the nonsensical tests. Moreover, many pages on the site recorded very low impressions and clicks, and when compared with Google Analytics data, did indeed have very few clicks. It is quite evident that GSC cannot be relied upon, regardless of user circumstance, for lightly searched terms. It is, by this account, not externally valid — that is to say, impressions and clicks in GSC do not reliably reflect impressions and clicks performed on Google.

As you can imagine, I was not satisfied with this result. Perhaps the experimental design had some unforeseen limitations which a standard comparative analysis would uncover.

Comparative analysis

The next step I undertook was comparing GSC data to other sources to see if we could find some relationship between the data presented and secondary measurements which might shed light on why the initial GSC experiment had reflected so poorly on the quality of data. The most straightforward comparison was that of GSC to Google Analytics. In theory, GSC's reporting of clicks should mirror Google Analytics's recording of organic clicks from Google, if not identically, at least proportionally. Because of concerns related to the scale of the experimental project, I decided to first try a set of larger sites.

Unfortunately, the results were wildly different. The first example site received around 6,000 clicks per day from Google Organic Search according to GA. Dozens of pages with hundreds of organic clicks per month, according to GA, received 0 clicks according to GSC. But, in this case, I was able to uncover a culprit, and it has to do with the way clicks are tracked.

GSC tracks a click based on the URL in the search results (let's say you click on /pageA.html). However, let's assume that /pageA.html redirects to /pagea.html because you were smart and decided to fix the casing issue discussed at the top of the page. If Googlebot hasn't picked up that fix, then Google Search will still have the old URL, but the click will be recorded in Google Analytics on the corrected URL, since that is the page where GA's code fires. It just so happened that enough cleanup had taken place recently on the first site I tested that GA and GSC had a correlation coefficient of just .52!

So, I went in search of other properties that might provide a clearer picture. After analyzing several properties that didn't share the first site's problem, we identified correlations between GSC and Google Analytics reporting on organic landing pages in the range of approximately .94 to .99. This seems pretty strong.
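
If you want to run this comparison on your own property, normalize URLs before correlating, or the redirect effect described above will artificially drag the numbers down. A sketch of that normalization, assuming you've exported per-URL clicks from GSC and organic landing-page sessions from GA into dictionaries:

```python
def normalize(url, redirect_map):
    """Collapse known redirects and case-only variants so GSC's
    pre-redirect URLs line up with GA's post-redirect landing pages.
    redirect_map should be keyed by lower-cased source URL."""
    url = url.lower()
    seen = set()
    while url in redirect_map and url not in seen:  # follow chains, guard against loops
        seen.add(url)
        url = redirect_map[url].lower()
    return url

def paired_clicks(gsc_clicks, ga_sessions, redirect_map):
    """Re-key both datasets by normalized URL (summing any collisions)
    and return paired (gsc, ga) counts, ready for a correlation test."""
    def rekey(counts):
        out = {}
        for url, n in counts.items():
            key = normalize(url, redirect_map)
            out[key] = out.get(key, 0) + n
        return out
    g, a = rekey(gsc_clicks), rekey(ga_sessions)
    return [(g.get(u, 0), a.get(u, 0)) for u in sorted(set(g) | set(a))]
```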

Finally, we did one more type of comparative analytics to determine the trustworthiness of GSC's ranking data. In general, the number of clicks received by a site should be a function of the number of impressions it received and at what position in the SERP. While this is obviously an incomplete view of all the factors, it seems fair to say that we could compare the quality of two ranking sets if we know the number of impressions and the number of clicks. In theory, the rank tracking method which better predicts the clicks given the impressions is the better of the two.

Call me unsurprised, but this wasn't even close. Standard rank tracking methods performed far better at predicting the actual number of clicks than the rank as presented in Google Search Console. We know that GSC's rank data is an average position, which almost certainly paints a false picture. There are many scenarios where this is true, but let me explain just one. Imagine you add new content and your keyword starts at position 80, then climbs steadily (70, 60, and so on) until it finally reaches #1. Now, imagine you create a different piece of content and it sits at position 40, never wavering. GSC will report both as having an average position of roughly 40. The first, though, will receive considerable traffic for the time that it is in position 1, and the latter will never receive any. GSC's averaging method, based on impression data, obscures the underlying features too much to provide relevant projections. Until something changes explicitly in Google's method for collecting rank data for GSC, it will not be sufficient for getting at the truth of your site's current position.
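
For the curious, the scoring mechanism looks something like the sketch below. The CTR curve and the keyword numbers are hypothetical; the point is only that, given the same impressions and actual clicks, a rank source reporting the true position produces a much smaller prediction error than one reporting a long-run average:

```python
# Illustrative CTR-by-position values -- not a published standard curve.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def predicted_clicks(impressions, position):
    return impressions * CTR_BY_POSITION.get(round(position), 0.01)

def prediction_error(rows):
    """rows: (impressions, position, actual_clicks) per keyword.
    The rank source with the lower total error better explains the clicks."""
    return sum(abs(predicted_clicks(imp, pos) - clicks)
               for imp, pos, clicks in rows)

# A keyword that actually sat at #1 but is averaged to 40 by GSC,
# plus one that genuinely sat at 40 the whole time:
gsc_rows     = [(1000, 40, 280), (500, 40, 0)]
tracker_rows = [(1000, 1, 280), (500, 40, 0)]
print(prediction_error(gsc_rows), prediction_error(tracker_rows))  # 275.0 vs 25.0
```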

Reconciliation

So, how do we reconcile the experimental results with the comparative results, both the positives and negatives of GSC Search Analytics? Well, I think there are a couple of clear takeaways.

  1. Impression data is misleading at best, and simply false at worst: We can be certain that not all impressions are captured or accurately reflected in the GSC data.
  2. Click data is proportionally accurate: Clicks can be trusted as a proportional metric (i.e., it correlates with reality) but not as an exact count.
  3. Click data tells you which URLs ranked, but not which pages users actually landed on.

Understanding this reconciliation can be quite valuable. For example, if you find your click data in GSC is not proportional to your Google Analytics data, there is a high probability that your site is utilizing redirects in a way that Googlebot has not yet discovered or applied. This could be indicative of an underlying problem which needs to be addressed.

Final thoughts

Google Search Console provides a great deal of invaluable data which smart webmasters rely upon to make data-driven marketing decisions. However, we should remain skeptical of this data, like any data source, and continue to test it for both internal and external validity. We should also pay careful attention to the appropriate ways in which we use the data, so as not to draw conclusions that are unsafe or unreliable where the data is weak. Perhaps most importantly: verify, verify, verify. If you have the means, use different tools and services to verify the data you find in Google Search Console, ensuring you and your team are working with reliable data. Also, there are lots of folks to thank here: Michael Cottam, Everett Sizemore, Marshall Simmonds, David Sottimano, Britney Muller, Rand Fishkin, Dr. Pete, and so many more. If I forgot you, let me know!



Monday, January 30, 2017

Love this guy. Doing an event @theryman tonight. @charles_esten #nashville

In case you needed something really weird in your day! @scottyonair

Little piano in the show @surfballroom

Excited to take over as guest host with @gocountry105 starting today at 10am, right after #GrahamInTheMornings

Why You Should Steal My Daughter's Playbook for Effective Email Outreach

Posted by ronell-smith

During the holidays, my youngest daughter apparently had cabin fever after being in the house for a couple of days. As she exited the bedroom, my wife found the note below on the floor; my daughter had slyly slid it under the door.

Though tired and not really feeling like leaving the house, we had to reward the youngster for her industriousness. And her charm.

Her effective "outreach" earned plaudits from my wife.

"At least she did it the right way," she remarked. "She cleaned her room, washed dishes, and read books all day, obviously part of an attempt to make it hard for us to say no. After all she did, though, she earned it."

Hmm...

She earned it.

Can you say as much about your outreach?

We're missing out on a great opportunity

Over the last few months, I've been thinking a lot about outreach, specifically email outreach.

It initially got my attention because I see it so badly handled, even by savvy marketers.

But I didn't fully appreciate the significance of the problem until I started thinking about the resulting impact of bad outreach, particularly since it remains one of the best, most viable means of garnering attention, traffic, and links to our websites.

What I see most commonly is a disregard of the needs of the person on the other end of the email.

Too often, it's all about the "heavy ask" as opposed to the warm touch.

  • Heavy ask: "Hi Ronell ... We haven't met. ... Could you share my article?"
  • Warm touch: "Hi Ronell ... I enjoyed your Moz post. ... We're employing similar tactics at my brand."

That's it.

You're likely saying to yourself, "But Ronell, the second person didn't get anything in return."

I beg to differ. The first person likely lost me, or whoever else they reach out to using similar tactics; the second person will remain on my radar.

Outreach is too important to be left to chance or poor etiquette. A few tweaks here and there can help our teams perform optimally.

#1: Build rapport: Be there in a personal way before you need them

The first no-no of effective outreach comes right out of PR 101: Don't let the first time I learn of you or your brand be when you need me. If the brand you hope to attain a link from is worthwhile, you should be on their radar well in advance of the ask.

Do your research to find out who the relevant parties are at the brand, then make it your business to learn about them, via social media and any shared contacts you might have.

Then reach out to them... to say hello. Nothing more.

This isn't the time to ask for anything. You're simply making yourself known, relevant, and potentially valuable down the road.

Notice how, in the example below, the person emailing me NEVER asks for anything?

The author did three things that played big. She...

  • Mentioned my work, which means she'd done her homework
  • Highlighted work she'd done to create a post
  • Didn't assume I would be interested in her content (we'll discuss in greater detail below)

Hiring managers like to say, "A person should never be surprised at getting fired," meaning they should have some prior warning.

Similarly, for outreach to be most effective, the person you're asking for a link from should know of you/your brand in advance.

Bonus: Always, always, always use "thank you" instead of "thanks." The former is far more thoughtful and sincere, while the latter can seem too casual and unfeeling.

#2: Be brief, be bold, be gone

One of my favorite lines from the Greek tragedy Antigone, by Sophocles, is "Tell me briefly — not in some lengthy speech."

If your pitch is more than three paragraphs, go back to the drawing board.

You're trying to pique their interest, to give them enough to comfortably go on, not bore them with every detail.

The best outreach messages steal a page from the PR playbook:

  • They respect the person's time
  • They show a knowledge of the person's brand, content, and interests with regard to coverage
  • They make the person's job easier (i.e., something the person would deem useful but not necessarily easily accessible)

We must do the same.

  • Be brief in highlighting the usefulness of what you offer and how it helps them in some meaningful way
  • Be bold in declaring your willingness to help their brand as much as your own
  • Be gone by exiting without spilling every single needless detail

Bonus: Be personal by using the person's name at least once in the text, since it fosters a greater level of personalization and thoughtfulness (most people enjoy hearing their names):

"I read your blog frequently, Jennifer."

#3: Understand that it's not about you

During my time as a newspaper business reporter and book reviewer, nothing irked me more than having people assume that because they valued what their brand offered, I must feel the same way.

They were wrong 99 percent of the time.

Outreach in our industry is rife with this if-it's-good-for-me-it's-good-for-you logic.

Instead of approaching a potential link opportunity from the perspective of "How do I influence this party to grant me a link?", a better approach is to consider "What's obviously in it for them?"

(I emphasize "obviously" because we often pretend the benefit is obvious when it's typically anything but.)

Step back and consider all the things that'll be in play as they consider a link from you:

  • Relationship - Do they know you/know of you?
  • Brand - Is your brand reputable?
  • Content - Does your company create and share quality content?
  • This content - Is the content you're hoping for a link for high-quality and relevant?

In the best-case scenario, you should pass this test with flying colors. But at the very least, you should be able to successfully counter any of these potential objections.

#4: Don't assume anything

Things never go well when an outreach email begins "I knew you'd be interested in this."

Odds suggest you aren't prescient, which can only mean you're wrong.

What's more, if you did know I was interested in it, I should not be learning about the content after it was created. You should have involved me from the beginning.

Therefore, instead of assuming they'll find your content valuable, ensure that you're correct by enlisting their help during the content creation process:

  • Topic - Find out what they're working on or see as the biggest issues that deserve attention
  • Contribution - Ask if they'd like to be part of the content you create
  • Ask about others - Enlist their help to find other potential influencers for your content. Doing so gives your content and your outreach legs (we discuss in greater detail below)

#5: Build a network

Michael Michalowicz, via his 2012 book The Pumpkin Plan, shared an outreach tactic I've been using for years in my own work. Instead of asking customers to recommend other customers for a computer service company he formerly owned, he asked his customers to recommend other non-competing vendors.


Genius!

Whereas a customer is likely to recommend another customer or two, a vendor is likely able to recommend many dozens of customers who could use his service.

This is instructive for outreach.

Rather than asking the person you're outreaching to for recommendations of other marketers who could be involved in the project, a better idea might be to ask them "What are some of the other publications or blogs you've worked with?"

You could then conduct a site search, peruse the content they've been a part of, and use this information as a future guide for the types and quality of content you should be producing to get on the radar of influencers and brands.

After all, for outreach to be sustainable and most effective, it must be scalable in an easy-to-replicate (by the internal team, at least) fashion.

Bonus: Optimally, your outreach should not be scalable by anyone but you and your team. That is, it's best to have a unique-to-your-brand process that's tough to achieve or acquire, which makes it far less likely that others will know about it, copy it, or use one like it.

Awaken your inner child, er, PR person

Elements of the five tips shared above have been, singularly, on my mind for the better part of two years. However, they only coalesced after I read the note my daughter shared, primarily because her message delivered on each point so effectively.

She didn't wait until she needed something to get on our radar; never over-sold the message; was selfless in realizing we all likely needed to get out of the house; didn't assume we were on the same page; and activated her network by sharing the note first with her sister and then, through her mom, with me.

Now, the question we must all ask ourselves is whether the methods we employ are as effective.



Sunday, January 29, 2017

My wife's twin sister just moved to town. She has a boyfriend. So if you see my 'wife' kissing another dude, don't freak out 🤣 #twinning

A few new shiners.

Idk

Organization! #gearybusey #guitarnerd

Saturday, January 28, 2017

Wow Wisconsin you made me dance! Thanks for bringing it Tonight

New whip. #nebraska

About to play the Rave @chalmerscroftw @njgehri

He loves the dressing room lava lamp at venue tonight. #therave

Looking good @surfballroom

Country Music Definition & Overview

Willie Nelson (photo). Introduction to Country Music: Country music has been one of the most popular genres for ages,

#highwayvagabond

You sound so good iowa. #holdinher

Thursday, January 26, 2017

Last night I sat on my dads lap and sang #freefalling! #fullgrownkid

My reaction to hearing the way my new song turned out!!!! Holy smokes 🎉 Can't wait til you guys hear this! Who's ready for new music??