I am pleased to introduce Justin Rondeau, Conversion Optimization Manager at Digital Marketer, to the show today. Justin specializes in hands-on testing and conversion optimization in both B2B and B2C environments.
Justin came to Digital Marketer from WhichTestWon, a site that trains its audience in conversion optimization by sharing real test results. Over the course of his career, he has analyzed over 2,500 tests across virtually every industry and has a wealth of knowledge to share when it comes to conversion.
I invited Justin to talk about some of the not-so-obvious ways of increasing conversion rates that he has developed over the course of his career. He also shares some surprising test results that might shake your current beliefs about conversion rate optimization techniques. Listen in to learn some game-changing tips, dos and don’ts of CRO, how long you should run tests before you can be confident in your results and much more! Grab a notebook or an iPad and get ready to take some notes…
If you enjoy the show, please drop by iTunes and leave a review while you are still feeling the love! Reviews help others discover this podcast, and I greatly appreciate them!
Listen in:
Podcast: Play in new window | Download
On Today’s Episode We’ll Learn:
- The importance of A/B testing your offerings.
- Justin’s example of a social proof test that did not work out as planned.
- Why social proof might not be the “silver bullet” for conversions.
- How improperly used trust seals and assurance messages may actually hurt conversions.
- Justin’s tips for using images of people and faces on websites.
- The updates to industry standards regarding how long one should run conversion tests.
- Great tools website operators can start using today to grow their testing skills.
Key Resources:
- Connect with Justin: Twitter | LinkedIn
- DigitalMarketer.com
- WhichTestWon
- Learn more about Cialdini’s Six Principles of Influence
- DIY Themes
- Contentverve.com
- Visual Website Optimizer – vwo.com
- Optimizely
Share the Love:
If you like The Brainfluence Podcast…
Never miss an episode by subscribing via iTunes, Stitcher or by RSS
Help improve the show by Leaving a Rating & Review in iTunes (Here’s How)
Join the discussion for this episode in the comments section below
Full Episode Transcript:
Welcome to the Brainfluence Podcast with Roger Dooley, author, speaker and educator on neuromarketing and the psychology of persuasion. Every week, we talk with thought leaders that will help you improve your influence with factual evidence and concrete research. Introducing your host, Roger Dooley.
Roger Dooley: Welcome to the Brainfluence podcast, this is Roger Dooley. My guest this week is conversion expert Justin Rondeau. I first met Justin in Frankfurt, where we were both speaking at André Morys’ excellent Conversion Summit. At the time, Justin was chief evangelist at WhichTestWon, a site most Neuromarketing readers will be familiar with, because I often use their published test results as illustrations of some of the points I make. Now, Justin is conversion optimization manager at Digital Marketer, a firm founded and run by Ryan Deiss. Justin has personally analyzed over 2,500 tests, and today he’s going to share some of the not-so-obvious ways to increase conversion that he’s found. Welcome to the show, Justin.
Justin Rondeau: Hi Roger, thanks for having me today.
Roger Dooley: Yeah, and by the way, I might add, Justin, and also listeners, I still sound a bit funny. I’m recovering from a bout of laryngitis from a couple of days ago, and hopefully we will not subject either you, Justin, or the audience to this again.
First, let me say it’s great to have you on the show, Justin. The speaker group that André Morys put together was really an all-star cast. I have had a few others from that group on the show already; Nathalie Nahai, Peep Laja, and Michael Aagaard come to mind, and I’m probably missing some too. It was really a great event.
In any case, now you are at Digital Marketer, and I saw a statistic that they’ve spent $15 million on testing and have run thousands of tests. I’ve read some of Ryan Deiss’s content too, and it’s clear he’s a real believer in the value of tests. Right?
Justin Rondeau: Yeah, definitely. I think what it really comes down to is that we take a more holistic approach to our testing methodology, where we’re evaluating our funnel builds. If you look at any kind of product we launch at Digital Marketer, they’re these longer-scale funnels that are trying to take people from cold traffic and turn them into buyers, which is actually a pretty difficult thing to do. So we’re running tests on the acquisition side of things for top-of-funnel lead generation, all the way down to our higher-ticket products, which can be upwards of ten to twenty-five thousand dollars. We’re running the gamut there.
Roger Dooley: Well, I think that once you start testing and see the difference that it can make and also the surprises that come out of it, it makes you a believer. I think probably most people that aren’t fans of testing really haven’t done it.
Justin Rondeau: Yeah, I think that would be the case. There is a lot of opposition to testing. I think this was more the case about five years ago, before testing got its big surge, because people had been seeing results from different case studies, from testing sites like WhichTestWon, as well as things that we publish on the Digital Marketer blog. We’re starting to make a real case for why you should be investing in this because, quite frankly, people really don’t know what they’re doing, and it’s a big ego shot. I think that hurts some of the C-level people, who think they know everything about their product and their market, when actually we know very little, because we’re not our own customer.
Roger Dooley: Right, and for all the people who think they have the golden gut and can intuitively tell what customers are going to react to, chances are a few weeks of test results would show them the error of their ways. Maybe some people don’t like that. In my speeches, I always offer a lot of practical advice to the audience on ways they can work with their visitors’ brains to improve their website results, but I don’t tell them “do this,” I tell them “test this,” because even though something is valid psychology, even though it’s worked in other situations, it doesn’t always work in your particular situation. Testing is just so essential.
Justin Rondeau: Yeah, you’re absolutely right. One of my main missions for this year is to get people to stop running kind of frivolous tests, and to get them to run tests properly. I think that you made a great … And also to be testing as well. Just to go by your gut and think that, “Oh, this worked in such and such industry.” Or, “My competitor is running this so it must work for me.” Even within your own markets, your customers are going to act differently than they do on a competitor’s site. Based on maybe different knowledge of the brand, their affinity with it.
You really do have to test some things out if you’re not sure how they’re going to impact the bottom line. I mean, there are things you can test, and there are things you can’t test as well. There are things that are just broken on a website, where I think you’d agree with me, Roger, that if something is broken or clearly a conversion hurdle, say for instance you have a thank-you page that has no offer on it, it’s just “thanks, we’ll email you,” no follow-up information, you’re clearly leaving money on the table.
It’s not a matter of testing whether you should have something on that thank-you page; it’s about testing and honing in on the offer that you’re putting on that page. There’s definitely a way to be thinking about this when you’re considering when you should be A/B testing and what to A/B test.
Roger Dooley: Right. I think testing the right stuff as you say, is important too. It’s not testing the orange button versus the red button versus the green button. It may be some fundamental usability feature of the site that’s going to have a much bigger impact on conversion. Even though somehow it’s appealing to say, “Hey, orange converts at 10% better than green does. So let’s go with that.”
Justin Rondeau: Yeah.
Roger Dooley: I love reading your posts, because they’re all so well informed with data, Justin. One you wrote a while ago was about A/B tests that surprised experts. I looked at the several case studies in there, and I think I would have gotten every one of them wrong, just as the majority of experts would, because they fly in the face of what people expect will work. The first one was about social proof. We all know that social proof is one of Cialdini’s six principles, and we know it’s been proven time and time again to be effective psychology when it comes to persuading people to do things by showing them what other people, or lots of other people, are doing. Tell us about that example of social proof that did not work out as planned.
Justin Rondeau: Yeah, this is actually a fairly classic test that I think people have seen. It was done on DIY Themes and run by Derek Halpern. As the test went, some social proof was added to a secondary call to action; the primary call to action for DIY Themes is buying their theme, and in the right-hand rail there was a “get newsletter” sign-up. Get email updates, it’s free, a blank slate right there: put your email in and join.
Then they had another variation that had “Join 14,752”, which was set dynamically, so that it would reload the new number as people signed up. So it wasn’t just a made-up number. They put that in there thinking, “Yeah, of course this is going to increase opt-ins.” Social proof is one of the best ways to get people involved. But after they ran this test, and thankfully they ran this test, they found that the version without the social proof saw 122% more newsletter opt-ins.
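For readers curious about the mechanics of a counter like that, here is a minimal TypeScript sketch of how a dynamically updated social-proof label might work. The endpoint URL and element ID are hypothetical; the actual DIY Themes implementation isn’t public.

```typescript
// Hypothetical sketch of a live social-proof counter. The /api/subscriber-count
// endpoint and the "subscriber-count" element ID are invented for illustration.
async function renderSubscriberCount(): Promise<void> {
  const label = document.getElementById("subscriber-count");
  if (!label) return;
  try {
    // Assumed to return JSON like { "count": 14752 }
    const res = await fetch("/api/subscriber-count");
    const { count } = (await res.json()) as { count: number };
    label.textContent = `Join ${count.toLocaleString()} others`;
  } catch {
    // If the count can't be fetched, fall back to copy with no social proof.
    label.textContent = "Get email updates. It's free.";
  }
}

// Refresh once a minute so the number stays honest as people sign up.
renderSubscriberCount();
setInterval(renderSubscriberCount, 60_000);
```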
I’ve presented this test at several different events I’ve spoken at. I always do the “raise your hand if you think it’s this version or that one.” Every single time, unless someone’s seen me speak before where I’ve shown this example, the entire audience gets it wrong, because they just assume social proof is a silver bullet for conversions, and it’s not. With this test, people might say, “Well, this could be an outlier example.”
But I’ve seen this on several other sites. I think one of them was for a company called LKR Social Media, which is Laura Roeder, who’s since created MeetEdgar. She ran one on her homepage with a newsletter count that said, “Join X amount of people.” That tanked conversions there as well.
Roger Dooley: Hmm. Yeah, really interesting stuff, because if you look at most sites run by sophisticated digital marketers, the vast majority do have some kind of social proof of how many subscribers they have as part of their pitch to subscribe. I’ll point out, too, that this test result doesn’t mean social proof doesn’t work. It just means that you need to test it for your audience, and not assume that because other smart people are doing it, you should too.
Justin Rondeau: You made an extremely good point right there. We’re showing that this is the reason you should be testing; this is an example. I like to call it a gadfly example. It’s that pinprick you get to remind yourself that there are things outside of your own understanding and outside of the social norms. But really, we’re not saying, “Don’t use social proof at all,” just like if social proof had won here, we wouldn’t be saying, “Use social proof all the time.”
When you’re testing, don’t just take someone else’s work and their examples and think it’s going to work for you. This isn’t a universal truth. Anything you see when you’re split testing and reading case studies is not universal. You really have to remember that, because that’s where it gets dangerous for your business, if you start thinking these are universals.
Roger Dooley: Mm-hmm (affirmative). Another example in that same post was the addition of a trust symbol to a form. In this case it was a TRUSTe symbol, but there are other trust seals, from the McAfee and VeriSign folks, for instance, all designed to assure people that their information is safe, their payments are safe, that there’s a third party looking out for their safety. Indeed, the people who sell these services and seals have lots of case studies showing that their customers convert better. But you have an example where a relatively clean test showed adding a trust symbol did not work. Tell us about that one.
Justin Rondeau: Yeah. This is actually one of many tests I’ve seen where the security seal hasn’t done exactly what was promised. In this case, a TRUSTe seal was added at the top of the funnel, where you’re just giving your information. It was a lead generation form for first name, last name, email, zip, and primary telephone. The brand was obscured, because they didn’t want to share that information at that point in time.
What they saw was that adding the TRUSTe logo really did hurt conversions. The version without it saw 12.6% more completed forms. I think this really comes down to context. If you’re throwing in trust seals and privacy policies and all these other things to combat user friction, you want to make sure that friction is actually apparent. Because if you’re throwing a seal in too early, people might start asking, “Well, should I trust these people? I was okay with giving them my information, but now they’re kind of overdoing it.”
It’s about the right seal for the right type of content. TRUSTe, I also feel, is much more something I’d expect in an e-commerce scenario, versus just a lead generation piece. I would have been completely content with a simple privacy policy: “Your information will not be shared. Here’s a link to our privacy policy.” At that level, having that type of information there makes more sense than some hardcore seal, which, one, is going to consume a lot of space and, two, makes people start thinking, “Oh, should I be trusting them?”
It’s the classic example of seeing text that says, “This is risk-free.” Then you start thinking, “I had no idea there was actually a risk involved.” It implies risk. That’s why I think this one had that surprising result.
Roger Dooley: Right. It reminds me of some other tests I’ve seen about adding the assurance on an opt-in form that “we won’t spam you.” Including the word “spam,” in some cases, cuts conversion on the opt-in form. Probably because people hadn’t been thinking that they might get spammed, but now that you’ve suggested it to them, assuring them that you’re not going to do it doesn’t overcome the negative association, and the form converts more poorly.
Justin Rondeau: Exactly. Yeah, that was a test run by Michael Aagaard of ContentVerve. He put that test together for a site called BettingExpert, thinking, “Oh yeah, that’s a hip site, we’ll throw on ‘we won’t spam you,’” and they were trying to be cutesy with their privacy policies. What I really learned from that example, and from some others as well, is that people take their privacy very, very seriously. Even with millennials, who give up information so readily, they do care about their privacy. Just look at the pitchforks that come out whenever there’s a Facebook change.
There was also an example of a test from a college that was trying to get information from their undergrads. They had a variation with no privacy policy versus a variation that did have one, and the one with the privacy policy increased conversions significantly. Privacy is something people do care about and take seriously. Don’t use words like “spam” or cutesy stuff in the privacy policy, because then people won’t take you seriously, and they won’t trust you with their information.
Roger Dooley: Yeah, that’s really interesting, because I would guess maybe a quarter or a third of the opt-in forms I see actually use the word “spam” somehow in their assurance that you’ve got privacy. Of course, the ironic thing is that very few people actually read the privacy policy, which is often paragraphs and paragraphs of rather dense legalese. But if people are assured that there is a privacy policy and they see the link, that may be sufficient.
Let’s talk a little bit about faces and humans. Common wisdom says that adding people and faces to your website is usually a good thing, just because it makes you seem a little bit more approachable, more human. What are some test results you’ve seen on the inclusion of faces and humans that might surprise people?
Justin Rondeau: I think this first one won’t surprise people: stock photos are absolutely terrible, and you shouldn’t be using them. If you’re using any sort of stock photography on your site, the likelihood is that people will see right through it and won’t convert. Another thing is, you have to add faces where it makes sense. I once saw an example from a printing website where they threw on a corporate stock photo and actually shrank the product shot in the left-hand rail to make room. When they added it, it tanked conversions. They were thinking, “Oh, we need to show there are people here and we have a personality.” What it showed was that their personality was quite bland, and pretty much insincere.
So adding a photo for a photo’s sake doesn’t really cut it. HubSpot tried that out too, in a series of landing page tests they were running, around eight different variations, I believe, off the top of my head. At that point in time at HubSpot, they were getting to a place where every single landing page needed to have a person’s face on it. That was just what was coming down the pipeline, because they’d seen results that worked in its favor. They’d read a lot about it; they’d probably published a lot about it. They’re one of the biggest publishers out there currently.
What they actually found was that adding the face to the page, and it was a genuine face, the face of an actual HubSpotter, really did not help conversions either. The version without the image saw a 24% lift in form completions. And these are pretty large forms that people are filling out.
Roger Dooley: Maybe they needed better looking employees.
Justin Rondeau: Maybe. The poor woman. I went off on kind of a tirade.
Roger Dooley: That would really get your ego, if you found out that putting your image on the landing page caused a 30% drop in conversions. You know?
Justin Rondeau: Oh yeah.
Roger Dooley: That’s a result I would not want to see. Although-
Justin Rondeau: I’m sure that would happen with me.
Roger Dooley: You know, actually, on one of my opt in forms I do have my picture. Now I’m afraid to test that against the no picture version.
Justin Rondeau: If you look at HubSpot’s pages now, go to their squeeze pages or their landing pages, there are no images of people there. They’re showing the product, headline, benefits. Very similar to the squeeze pages we use at Digital Marketer. We keep it very to the point.
I guess the whole face thing is that faces grab a lot of attention. Yes, it’s great, it gives you a personality, I suppose. But if the face isn’t doing anything to make the page better, it actually becomes a distraction. If you do any sort of eye tracking or heat map work, you’ll notice the face takes up most of the view. People look there first, and that distracts from your overall message. Add people and faces where it makes sense; that’s something that applies to any type of content. Don’t just add them willy-nilly.
Roger Dooley: Right. And test because even if it makes sense, it may still not improve conversion.
Justin Rondeau: Exactly. There was another example in that post, from Autodesk. It was for videos showing features, and they had two variations: a more product-centric image icon versus a person talking about the features. With the people version, these were the actual people who would be talking in the videos, so they thought, “Oh, let’s show them.” But it actually worked better to show the more CAD-like style of imagery in those icons.
Roger Dooley: I guess that reflects what the customers were interested in, too. Another product might have customers who are more interested in being helped through the process by humans, as opposed to being thrust into a bunch of technical stuff they don’t understand. Autodesk users are probably pretty far out there on the technology scale, I would guess.
Justin Rondeau: Yeah. Yeah, I think so.
Roger Dooley: Justin, do you think that stopping a test too soon is a common mistake? The reason I ask is because I almost fell victim to that myself. I was testing a variation on a subscription opt-in, and after about three days... This is a relatively modest-traffic site, not ridiculously low, but it’s not like Amazon, where they could run a test for 15 minutes and have a massive amount of data. After a few days, I was seeing a lift of 70 or 80% and saying, “Wow, this is really good.” I almost said, why leave results on the table, I should just ditch the lower-performing version. But I didn’t. I figured, okay, I’ll let it run. As time passed, the spread continued to drop; it was sort of a regression to the mean happening, I guess. Even after a couple of weeks, there was still an advantage to the new version, but it was probably more in the 20 to 25% range.
That begs the question: if you let it run for a year, will it drop to 5%? Without getting into the detailed mathematics, how long should somebody run a test before saying, “Okay, I’m pretty certain I’ve got an answer here, and I can use the better version with a good degree of confidence”?
Justin Rondeau: Yeah, I have a few rules of thumb. The industry rule of thumb, which has actually changed lately, there’s been an increase, is that you need at least 100 converting activities per variation before you can be confident in your results. In the last year, I’ve seen conversion experts out there saying it needs to be as much as 150 to 200. I still think 100 is a good baseline. When you’re doing this, you’re not looking at visits; you want to look at actual converting activities. Obviously, the more visits you have, hopefully the more conversions you’ll have. That’s one of the baselines people look at.
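To put a number on that rule of thumb, here is a short TypeScript sketch that converts a conversions-per-variation threshold into the visitors each variation needs, given a baseline conversion rate. The 100 and 200 thresholds are the figures Justin mentions; this is a back-of-the-envelope check, not a substitute for a proper power calculation.

```typescript
// Visitors per variation needed to reach a converting-activities threshold,
// given the baseline conversion rate (a fraction, e.g. 0.03 = 3%).
function visitorsNeeded(baselineRate: number, conversionsPerVariation = 100): number {
  if (baselineRate <= 0 || baselineRate >= 1) {
    throw new Error("baselineRate must be a fraction strictly between 0 and 1");
  }
  return Math.ceil(conversionsPerVariation / baselineRate);
}

console.log(visitorsNeeded(0.03));      // 3334 visitors for 100 conversions
console.log(visitorsNeeded(0.03, 200)); // 6667 visitors for the stricter bar
```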
Another thing I like to look at is the normalization of the data. In any testing tool, they’ll show you that little graph where you’ll see major spikes in the first two or three days, because you’ve really upset the status quo. If you’re not excluding returning visitors, or you’re using all of your traffic, and if you’re a modest-to-moderate-traffic site, you’re probably going to be using all your traffic, the first few days you’re going to see some very spiky behavior, and then it will begin to level out.
How I like to look at these things, from a scheduling perspective, is to get a basic understanding of how much traffic I’d need based on my current conversion rate and estimated conversion lift. I like to run tests that coincide with buying cycles if possible, and I don’t like to run tests for more than six weeks. Normally I just pop the numbers into one of the testing calculators out there; VWO has one that’s pretty much what I use. I used to use a different stats calculator, but it only handled two-variation tests and it doesn’t work on the new Mac OS, so I can’t use it anymore. You plug in your current traffic, your current conversion rate, and your estimated lift. The estimated lift is where people have a hard time, anytime they have some sort of estimation to make. You want to be modest here, because you want to give your test enough time to work.
For your estimated lift, I’d recommend looking at other people’s case studies to see what they got. Don’t expect a 250% lift when you’re putting that number in.
Roger Dooley: I was going to say that chances are whatever you expect the lift to be, the actuality is not going to be that good.
Justin Rondeau: Yeah, cut it significantly. With all those inputs, the output will give you a certain number of days you need to run. Then, just to deal with time, because time’s a jerk: say it tells you that you can run this test in 18 days. I always round up to the nearest week, so I’d run it for 21 days.
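To make that workflow concrete, here is a hedged TypeScript sketch of the kind of arithmetic a testing calculator such as VWO’s performs: estimate the sample size per variation with a standard two-proportion power approximation (95% confidence, 80% power), convert it to days at your traffic level, and round up to whole weeks as Justin describes. The exact formula any given calculator uses may differ; this is a common textbook approximation.

```typescript
// Sample size per variation for detecting a relative lift over a baseline
// conversion rate, using a standard two-proportion power approximation.
function sampleSizePerVariation(
  baselineRate: number,  // e.g. 0.03 for a 3% conversion rate
  relativeLift: number,  // e.g. 0.2 for an estimated +20% lift
  zAlpha = 1.96,         // 95% confidence, two-sided
  zBeta = 0.8416         // 80% power
): number {
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const variance = p1 * (1 - p1) + p2 * (1 - p2);
  return Math.ceil(((zAlpha + zBeta) ** 2 * variance) / (p1 - p2) ** 2);
}

// Total test length in days, rounded up to whole weeks ("18 days becomes 21")
// so every weekday is sampled the same number of times.
function testDurationDays(
  dailyVisitors: number, variations: number,
  baselineRate: number, relativeLift: number
): number {
  const perVariation = sampleSizePerVariation(baselineRate, relativeLift);
  const rawDays = Math.ceil((perVariation * variations) / dailyVisitors);
  return Math.ceil(rawDays / 7) * 7;
}

// 2,000 visitors/day, two variations, 3% baseline, modest +20% estimated lift:
console.log(testDurationDays(2000, 2, 0.03, 0.2)); // 14 days
```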
Roger Dooley: Mm-hmm (affirmative). So, you bring up the subject of tools. For quite a while I was recommending Optimizely, and then a while back they changed to focus more on the enterprise market and not so much on the small and medium business market. What tools do you recommend for website operators, that they can at least get started with without making a massive commitment, and then grow their testing skills and maybe move up the food chain after a while? What are some good testing starting points, do you think?
Justin Rondeau: Yeah, well, Optimizely is still an option. You’re going to get no hand-holding; I think they give you up to 50,000 uniques per month, which is ample to do some things. They strip out some of the features, and you’re not going to be able to talk to anyone if you have a question. But it’s an option, and it’s free.
I would never recommend Google Content Experiments in its current form. I spoke to Justin Cutroni back in October, and he said one of their main initiatives this year is to really beef that thing up. So I’m excited, but it’s April now and I’m still waiting. We use VWO at Digital Marketer; it’s just what we’ve been using, and we’re familiar with the tool. Most of these tools are pretty much the same, except for a few nuances; essentially, they’re doing the same thing. If you’re making minor cosmetic changes, anything’s going to work, and you’ll stay out of IT’s hair.
Roger Dooley: Justin, I’m going to interrupt you there for a second, because I think that’s a good point for those folks who are not currently using any of these tools. I’m sure many of our listeners are familiar with them, but for those who aren’t: what these tools allow you to do is create a different version of the content on a page. It could be a photo, it could be the opt-in form, it could be anything, without actually changing the code on the page. I’ve worked with some large organizations where getting a code change on a webpage can take weeks and weeks and has to go through quality control and all these various steps, and that makes any kind of testing that way impractical. But with these tools, by inserting one line of JavaScript code one time, you can swap out the picture of the human with a picture of a screen, or eliminate the picture altogether, or change the headline or body copy, and basically just do it from your desktop, without physically changing the website.
They’re very powerful tools, and fairly easy to use in my experience. Anyway, sorry to interrupt your flow there.
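For listeners who want to picture what that one line of JavaScript actually does, here is a simplified TypeScript sketch of the client-side mechanics: assign a sticky variant, then rewrite page content before the visitor notices. The element IDs and variant copy are invented; real tools like Optimizely and VWO wrap this in a visual editor and add analytics and targeting on top.

```typescript
type Variant = "control" | "treatment";

// Assign the visitor to a variant and keep it sticky across visits,
// so returning visitors always see the same version.
function getVariant(): Variant {
  const saved = localStorage.getItem("ab-variant") as Variant | null;
  if (saved) return saved;
  const assigned: Variant = Math.random() < 0.5 ? "control" : "treatment";
  localStorage.setItem("ab-variant", assigned);
  return assigned;
}

// Rewrite page content client-side; no server-side code change required.
function applyVariant(variant: Variant): void {
  if (variant === "control") return; // control sees the page untouched
  const headline = document.getElementById("hero-headline"); // hypothetical ID
  if (headline) headline.textContent = "Get the Free Conversion Checklist";
  const photo = document.getElementById("hero-photo"); // hypothetical ID
  if (photo) photo.style.display = "none"; // e.g. test removing the face photo
}

applyVariant(getVariant());
```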
Justin Rondeau: No worries, that’s a good point. People who didn’t know what these tools were would have had no idea what I was talking about right there, so thank you for that.
As I was saying, they’re pretty much the same. If you’re just making cosmetic changes, like Roger was talking about, all these tools are able to do that; it’s really just a matter of personal preference at that point. It’s when you get into larger-scale stuff requiring development work, or more CSS work or anything like that, that you’re going to be ringing the buzzer at IT to get in the queue with them, or with your developers.
So, cosmetic-change-wise, there are no major differences. I haven’t used Optimizely’s stuff in terms of connecting with their API or anything like that, but in terms of tools, if you’re using them for just those types of tests, I really think you should go with what your budget says, because they’re all pretty much the same.
Roger Dooley: Mm-hmm (affirmative). What does the future hold for conversion optimization and testing, do you think? Do you see anything interesting and different coming down the pike?
Justin Rondeau: I’ve been saying for a while that I think there are going to be more suites coming out. Instead of just being a testing tech, like an A/B testing tool... You’ll notice this with what’s being developed at VWO and Optimizely; they’re offering other things like surveys, personalization, click tracking. More qualitative things. Qualitative data is probably the most underutilized, because in the last few years people have been saying, “Oh, quantitative data is so much better than qualitative data.” Really, you need both.
I think anybody in the CRO space and the testing space understands that, too. There are going to be more one-stop shops. About two years ago, I was giving a talk at the MarketingProfs B2B Forum in Boston, and Monetate was there. I ran into Rob Yoegel, who was working at Monetate at the time, and I asked him, “Hey, I said you guys were an enterprise solution in my talk. Is that cool?” He said, “Actually, we try not to pigeonhole ourselves as a testing technology anymore. We’re more of a web optimization suite.”
When he said that, I started looking at the development of products as they were coming out, and that’s really where things are going. They’re going to try to connect all these tools at once. People are using several different tools day to day, whether it’s programs on your computer like Excel, Word, and PowerPoint, or anything else you’re using, like your email, your project management, your file sharing. Everything.
I think that there’s stats out there that say people really can’t use more than five tools effectively.
Roger Dooley: Sort of a Dunbar number for tools, I guess.
Justin Rondeau: Yep.
Roger Dooley: Interesting.
Justin Rondeau: I’ve also been talking to some people at Digital Marketer about how A/B testing is a cool facet of optimization, but it isn’t optimization proper. I think more people are going to understand that. Instead of just talking about cool split tests they’ve seen, people are really starting to ask, “Why did I do that?” and, “What does this data mean?”
It’s like there’s a data science renaissance happening, I think. Testing is going to become more of a back-end, “of course we’re doing that” scenario. I think there’s going to be more discussion about data management, data analysis, and the different tools out there, outside of testing, that you can use as a conversion rate optimizer.
Roger Dooley: Right. I really see the integration of individual behavioral data coming along. We’re at a point where not a lot of companies can do that, but more and more companies are collecting this individual data, behavioral data, or even, in some cases, psychographic data about their customers. If you combine that with some of the other optimization tools, theoretically you could do a great job both of converting and of serving your visitor, your customer, better.
I mean, people want stuff that is relevant to them, and if you can do a better job of that, then it’s a win-win. Let me remind our audience, we’re speaking with Justin Rondeau, Conversion Optimization Manager at Digital Marketer. You’ll be able to find links to all the resources we’ve talked about, as well as a text version of this conversation, at rogerdooley.com/podcast.
So, Justin, how can people find your stuff online and connect with you?
Justin Rondeau: Oh, yeah. People can find me on LinkedIn, now that they have those fun vanity URLs; I believe mine is JTRondeau. That’s also my Twitter handle. I’m a contributor on the Digital Marketer blog, where I write about the split tests we run. I’ll also be producing a lot of interesting content for what will hopefully be a book we release at the end of the year. We have a lot of cool stuff going on, but yeah, check me out on Twitter @JTRondeau, digitalmarketer.com/blog, and linkedin.com/in/JTRondeau.
Roger Dooley: Great, we’ll have all that on the show notes page. Justin, thanks for being on the show.
Justin Rondeau: Thank you so much Roger.
Thank you for joining me for this episode of the Brainfluence Podcast. To continue the discussion and to find your own path to brainy success, please visit us at RogerDooley.com.