Performance Marketing Blog
A blog on performance marketing maintained by marketing professionals.

The (In) Complete Start-Up Internet Marketing Guide
By Gideon Rubin (http://www.simplyideas.com), July 16, 2009

For my fellow world inhabitants at this time of global economic turmoil, we must help each other, now more than ever.

A friend of mine recently lost his job. He was smart enough to see the writing on the wall and make some plans before his employer went under, but it has still had a major impact on his economic well-being. As we watch news stories of companies getting multibillion-dollar bailouts so that their CEOs can fuel up their private jets, others are suffering.

So how can we make a difference? Well, we can use whatever expertise, services, or products we have to help others. We are going on the "teach a man to fish" principle, so let's go fishing, everyone!

This (In) Complete Startup Internet Marketing Guide is my way of doing just that. The guide is not meant to be a comprehensive internet marketing manual; instead, it is meant to teach real people the hands-on how-tos of internet marketing. For example, you will learn how to pick a valuable domain name, how to find targeted keywords for a website, and how to get free search engine traffic.

Over the coming months I invite you to contribute, share, and help build this resource, and your own internet business, through this blog. I will provide the framework, and your comments, tips, articles, and Twitter posts will provide the life of this project. Let's jump right in and start the step-by-step notes on building your business online. We are going to assume that you have no current site and start with the basics.

Pick A Topic Or Niche

In the early days of the Internet you could just pick some major industry or topic and start your website. Those days are over. Now you have to think niche to be successful. I recommend you pick something you are interested in and will enjoy researching and writing about. There are also some market-based numbers to consider: mainly, you need something with enough market appeal that you can make money, yet small enough that you won't be competing with tons of highly successful businesses already on the web. You can target the top markets, but make it your own; find your own niche.

Here we go. Pick a topic that is general and that you are interested in, or at least willing to spend a lot of time researching and writing about. I suggest something such as a hobby, maybe extreme sports or deep sea fishing if you are so inclined.

Keyword Research For The Search Engines

Now you need to do some research to figure out the exact area within the niche you should focus on. There are many ways to learn how to pick good keywords for your Internet marketing.

Rules For Target Search Engine Keywords

1. The longer the keyword phrase, the fewer searches and the less competition.

2. Pick a topic that has a total of at least 10,000 searches per month across the main keywords you target.

3. Find 1-3 top keywords, 5-8 medium targets and up to 20 total in your target basket.

4. You want to pick terms for your basket of keywords that have high search volume and low competition (see the sketch below).
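To make these rules concrete, here is a minimal sketch in Python that scores and buckets a candidate list. The thresholds come straight from the rules above; the keyword data and competition scores are invented for illustration.

# Score and bucket keyword candidates per the four rules above.
# Each entry: (keyword, estimated monthly searches, competition score 0.0-1.0).
candidates = [
    ("deep sea fishing", 8100, 0.70),
    ("deep sea fishing charters florida", 1900, 0.35),
    ("deep sea fishing gear reviews", 850, 0.20),
    # ... the rest of your research ...
]

def score(entry):
    keyword, searches, competition = entry
    # Rule 4: favor high search volume and low competition.
    return searches * (1.0 - competition)

basket = sorted(candidates, key=score, reverse=True)[:20]  # Rule 3: up to 20 total
top_terms = basket[:3]                                     # Rule 3: 1-3 top keywords
medium_terms = basket[3:11]                                # Rule 3: 5-8 medium targets

# Rule 2: the basket should total at least 10,000 searches per month.
total = sum(searches for _, searches, _ in basket)
print("Basket volume: %d/month (%s)" % (total, "OK" if total >= 10000 else "too thin"))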

You may want to take a minute to review this list of keyword tools. Let's keep it easy and go with one tool that we will use throughout our marketing efforts. Download and install the SEO Toolbar for Firefox from SEO Book. This keyword tool will save you a ton of time in the long run. In general I am not a big fan of plug-ins, but I have to hand it to www.SEOBook.com; they have done a great job with this one.

Once you have the toolbar set up, go to the keyword entry box on the right and input the most basic keyword phrase for your topic or niche. For example, let's put in "deep sea fishing". You can customize which windows and tools will open, but let's leave it on the defaults.

Three pages should open:

1. SEO Book Keyword Suggestion Tool

Take a look at the Overall Daily Est result for "deep sea fishing": it shows 270 (as of July 12, 2009). This means there are about 270 searches per day across the search engines for that exact phrase, which is not bad for a three-word phrase. Keep in mind that although you may be targeting and writing about deep sea fishing, you will also have a basket of keywords that you target in addition to this one term.

2. Google AdWords: Traffic Estimator

On this page you want to look at the PPC value for your main target phrase. Keep in mind this is just an estimate, and it reflects what advertisers pay Google for the keyword, not what Google will pay its publishers who get clicks for the term. (A publisher is what you will be called when you run other companies' advertising on your web site.) You want to find keywords that cost a minimum of about $1.00 per click. In this case our phrase is estimated to cost advertisers from $0.64 to $1.05.

3. Google AdWords: Keyword Tool

This is where you compare SEO Book's keyword estimator against Google's. Local Monthly Search Volume is estimated at 368,000 and Global Monthly Search Volume at 201,000 searches. As you can see, there are plenty of searches on this term. You should also check out the Advertiser Competition bars; in our case the main term ("deep sea fishing") is competitive.

As you can see, there are significant differences between the tools and their estimates: SEO Book's 270 searches per day works out to roughly 8,000 per month, while Google's local estimate is 368,000, a gap of more than 40x. Don't worry too much about this right now; this is exactly why you need to use more than one tool.

If you think you have found your main target keyword, it is a good idea to download all the information you just researched; if not, rinse and repeat. In addition to a single topic or niche phrase, you need to pick 5-8 top related keywords to use as categories and a total of about 20 keywords to use in articles and long-term comprehensive marketing efforts. The top 5-8 keywords should follow the guidelines above, while the additional keywords that fill out your 20 can be ones you think are important, relevant, or of interest to you.

Pick Your Domain

This will become your brand. Most search marketing experts estimate that the domain name has about a 20% impact on your ability to garner relevant search engine traffic. For sites in less competitive niches, a well-chosen domain name can win you a significant share of search traffic once you begin your marketing efforts.

Rules for Selecting a Domain name:

1-9. These rules are covered well on SEOmoz. You can pretty much ignore SEOmoz's rules 9-11. Rule 9 is generally correct, but for some strong keyword domains, up to two hyphens may work well. In other words, if the keywords are very high volume and the hyphenated version is available, you can consider it along with other options.

10. Keep domain names as short as possible. This is #6 in the SEOmoz article, but it is worth repeating.

11. Put the most important words first. This usually means putting the keyword with the highest volume of searches at the beginning of the domain.

12. If you can't find what you are looking for, try adding location words or popular keywords. For example, if "deepseafishing.com" is not available, it is probably better to buy something like "floridadeepseafishing.com" than something with a filler word, like "deepseafishingtoday.com".

It may be easiest to register your domain at your hosting company. If you plan on purchasing a lot of domains you can get a bulk registration account at a place like GoDaddy.com.

Hosting

There are many options for hosting. For the purpose of this Internet Marketing Guide we will provide two suggestions that we have experience with.

Proper Hosting

This green web hosting company provides one-click WordPress blog installs. I also have a deal set up with them to provide their clients free microsites for use with this guide. Maybe most important, this company can put each domain on a separate IP address by request; we are going to recommend you start a few sites in different niches later so you can see which ones work best. They also provide cPanel. I have secured a special deal because of their mention here; use promo code: InternetMarketing.

HostGator

They provide good hosting for a good price and have all the basic tools you would expect, including cPanel administration so you can manage your account on a very detailed level. When you are ready to build multiple sites, they have a specialized SEO Hosting service that some may find useful. I have had issues where the shared server was overloaded and slow, but overall they are generally dependable.

(In) Complete Start-Up Internet Marketing Guide – Post 1 of ? (many)

Gideon Rubin is the CEO at Simply Ideas LLC and also serves as an advisor for several start-up websites including branded URL shortener 9mp.com and Twitter introduction engine, Twitroduce.com. Follow him on Twitter or connect with him on LinkedIn.

How to SEO on Twitter: Keywords Keywords Keywords
By Joshua Odmark (http://simplyideas.com), May 18, 2009

There Are Two Types of Twitter Searches

There are two types of keywords on Twitter: the actual keyword as it exists, such as "college", and the hashtag version of the keyword, "#college".

Searching for either of these will display two different sets of results.

It goes without saying that the natural method of search is to type in the keyword without a hash. If someone is interested in tweets about Star Trek, they are more likely to search for “Star Trek” than “#startrek”.

Because there are two ways to perform this search, there are two ways to optimize tweets for these searches: you can optimize for the hashtag version and for the actual version of the keyword.
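As a side note, the two query forms are also distinct at the URL level, since "#" must be percent-encoded as "%23". Here is a quick illustration in Python; the search URL shown is Twitter's circa-2009 search page, used purely as an example.

from urllib.parse import quote

# Print the URL-encoded search query for each form of the keyword.
for query in ("college", "#college"):
    print("http://search.twitter.com/search?q=" + quote(query, safe=""))

# http://search.twitter.com/search?q=college
# http://search.twitter.com/search?q=%23college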

For this article, I am going to focus on how to optimize for the actual version of the keyword, the one without the hashtag.

Keyword Search Frequency

In my research, I found that there is more competition for keywords without hashtags. I performed a search with the hashtag and without it, and noticed that the time between tweets is much smaller without the hashtag, which shows that the plain keyword appears at a higher frequency.

This means that the competition to rank for this keyword will be much higher, which means that as a performance marketer, you have to keyword-optimize every opportunity you are given.
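If you want to quantify that comparison rather than eyeball it, one rough approach is to take the timestamps of the most recent results for each query and compare the average gap between tweets. A minimal sketch, with timestamps invented for illustration:

from datetime import datetime

def avg_gap_seconds(timestamps):
    # Average number of seconds between consecutive tweets.
    ts = sorted(datetime.strptime(t, "%H:%M:%S") for t in timestamps)
    gaps = [(b - a).total_seconds() for a, b in zip(ts, ts[1:])]
    return sum(gaps) / len(gaps)

plain = ["12:00:01", "12:00:09", "12:00:14", "12:00:22"]   # results for "college"
hashed = ["11:42:10", "11:49:55", "11:58:30", "12:05:02"]  # results for "#college"

print(avg_gap_seconds(plain))   # ~7 seconds between tweets
print(avg_gap_seconds(hashed))  # ~457 seconds between tweets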

Keyword Optimization

Keyword optimization in general is a well-covered topic, but I am going to introduce a method that is not currently being utilized on Twitter. You can rest assured that as soon as people become aware of this technique, it will spread like wildfire.

A huge missed opportunity is in the shortened links people use on Twitter. Why do we post these links? We post them to drive traffic. What if we could create keyword-rich shortened links as easily as we create plain ones now?

Of these two links, which has more SEO value?

http://9mp.com/t7x

http://college.9mp.com/t7x

From a purely cosmetic point of view, the link with "college" is generally preferred because it gives insight into what the link is about.

There is an even greater value in including keywords in your shortened URLs: Twitter takes the keywords in the URL into consideration in its search results.

Want proof? Here is an update I posted today from my personal Twitter account @JoshuaOdmark as a test:

Ashford University ranks high on the list for an online school. Checkout their listing here: http://college.9mp.com/t7x

Twitter Search Results

As you can see, the keyword "college" does not appear anywhere in the tweet except in the URL. Here is a search result after this message was posted:

The second result shows that my tweet surfaced for a keyword that existed only in the shortened URL.

This shows that the keywords in a shortened URL are a huge opportunity for SEO on Twitter.

How To Get Branded Shortened URLs

I recommend using a URL shortener such as http://9mp.com: not only does it offer detailed statistics, free usage, vanity URLs, and all the features you are used to from the hundreds of URL shorteners out there, but you can also brand your URLs for free.

http://9mp.com gives you the option of branding your URLs with any subdomain you wish. It also allows you to tag keywords onto the end of your URL for additional SEO on Twitter.

http://college.9mp.com/t7x/education

This URL gives you two extra keywords to maximize your SEO on Twitter.
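To make the pattern concrete, here is a small helper that assembles short URLs in the layout shown above (keyword subdomain, hash, optional trailing keyword). This is plain string assembly based on the examples in this post, not a real 9mp.com API.

def branded_url(hash_code, subdomain_keyword, trailing_keyword=None):
    # Hypothetical helper: builds the branded URL layout used above.
    url = "http://%s.9mp.com/%s" % (subdomain_keyword, hash_code)
    if trailing_keyword:
        url += "/" + trailing_keyword
    return url

print(branded_url("t7x", "college"))               # http://college.9mp.com/t7x
print(branded_url("t7x", "college", "education"))  # http://college.9mp.com/t7x/education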

If you are into any form of performance marketing, from selling purses online to music album promotion, and you are not using branded URLs, you are missing the biggest opportunity available on Twitter right now.

Contact Information

Contact a representative at branding@9mp.com to receive information on how to secure your own custom branded URL. It took less than 10 minutes for my custom branded URL to be set up and available for use inside my 9mp.com account.

Fighting Twitter Spam – Twitter Will Need Help
By Joshua Odmark (http://simplyideas.com), May 11, 2009

I recently had someone follow me on Twitter who had 39,000+ friends (in other words, they were following that many people).

How can someone possibly follow that many people? Clearly they are not on Twitter to follow 39,000 people. Their theory is that if they follow the maximum number of people Twitter will allow, a percentage of them will follow back, increasing the number of people following their account and giving them an audience to broadcast their "tweets" to.

This is spam. This is devaluation.

An associate recently sent me a link to tweepme.com.  If you join this site, they promise to increase your followers into the thousands in a very short amount of time.

This is spam.

How can they do this? The idea behind the site is that every time a member joins, all of the previous members follow that person. Thus, as the site grows, the "benefit" it can provide each new registrant increases; the 1,000th member, for example, is followed by the 999 members who joined before them. This is a good model, considering the only reason people are on the site is to increase their followers.

This is devaluation.

A great benefit of Twitter is the ability to connect with like-minded individuals and share short bursts of communication that benefit the followers of the person sending the transmission.

Services such as TweepMe are popping up all over the Internet with the purpose of gaming the Twitter system for some sort of measurable gain.

Twitter’s largest problem is going to be combating spam. It is the same problem Google is facing and will continue to face.

One of the problems is how easy it is to game the system with the Twitter API. In just two hours, I was able to replicate Twollow.com's service and set up a system of cron jobs that effectively maxed out the Twitter API's daily limits. I can autofollow and then unfollow people over a certain period of time, and my script will autofollow people based on search results. It is a very handy script that allows me to grow a valuable Twitter account automatically. I am not using it to game the system by gaining as many followers as I can; I am using it to find people who WILL be interested in my tweets.
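For the curious, here is a rough sketch of the kind of script described: search for a term, follow the accounts behind the matching tweets, and unfollow non-reciprocators after a grace period, all run from cron. The client object and its search/follow methods are stand-ins for whatever Twitter library you use (the 2009-era API this post refers to is long gone), and the limits are illustrative numbers, not Twitter's actual caps.

import time

DAILY_FOLLOW_LIMIT = 100        # illustrative cap, not Twitter's real limit
UNFOLLOW_AFTER = 7 * 24 * 3600  # drop non-reciprocators after a week

def run_once(client, query, followed_log):
    # Pass 1: follow authors of tweets matching the search query.
    followed_today = 0
    for tweet in client.search(query):
        if followed_today >= DAILY_FOLLOW_LIMIT:
            break
        if tweet.author not in followed_log:
            client.follow(tweet.author)
            followed_log[tweet.author] = time.time()
            followed_today += 1

    # Pass 2: unfollow anyone who never followed back in time.
    for user, when in list(followed_log.items()):
        if time.time() - when > UNFOLLOW_AFTER and not client.follows_me(user):
            client.unfollow(user)
            del followed_log[user]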

Part of the problem is that Twitter gives anyone the ability to access search results on a massive scale on a daily basis. By logging these "tweets" from people who display them publicly (anyone on Twitter), I can create a massive number of relationships between the tweets and myself.

This opens the door to many people who would consider black hat tactics.

Personally, I think the Twitter API is fantastic. I love spending time playing around with all of the data that can be retrieved through their API. It truly is inspiring to pull the data locally and manipulate it into something that can be useful. For example, I can pull tweets relating to backpacking in Washington State to compile a backpacking in Washington State resource. With innovative search techniques, I could build a valuable resource for anyone interested in that niche. Now plug in any topic you can imagine into the above example.

When all the cards are on the table, I believe Twitter’s biggest problem will be combating spam. Facebook is going to run into the same problem as Twitter. And Google has already been fighting this problem for years.

I remember a distinct point in time when I started to notice Google results becoming less accurate due to people spamming their way to the top.

It is only a matter of time before Twitter is taken over by spam.

When there were reports of Twitter being purchased, I instantly thought of Google. If spam is the one thing that could topple Twitter, it would make sense to strike a deal with the company who has the best chance at beating it.

Twitter spam has been minimal up to this point, but you WILL see it become an issue before the year is out.

How Twitter handles it will make or break them.

Here are a couple of resources aimed at stopping Twitter spam:

Stop Twitter Spam

Twitter Blog About Spam

TechCrunch on Twitter Spam

Mashable on Twitter Spam

The Twitterlution!
By Joshua Odmark (http://simplyideas.com), April 27, 2009

rev⋅o⋅lu⋅tion
–noun
a sudden, complete or marked change in something: the present revolution in church architecture.

Don't even bother googling "Twitterlution"; it doesn't exist. I just made it up.

After spending a little time watching the evolution of Twitter, which quickly followed the revolution in media, I find myself blown away by the incredible rate at which innovation is occurring. Each week looks different from the last: a new product, service, or feature pops up almost daily.

The actual idea of Twitter seems somewhat useless to me as it stands now, or at least it is a fad. But something that grows as quickly as Twitter, and gains a following the size Twitter has, clearly has something to it.

Facebook should also be mentioned here, because it, too, has grown rapidly.

What have both Twitter and Facebook given us? They have inspired innovation in those who thought they could not be innovative. Every day I spend searching Facebook or Twitter yields another creative individual who has reached out in some way, shape, or form, which is truly incredible if you think about it.

As social media uses the Internet as its medium, it reminds me of what the printing press did for the distribution of literature. By directly connecting an author to a reader, it creates an intimate connection between the two that could not have had the same effect through an intermediary. Nothing is lost in translation.

Social media gives the "little guy" the means to overcome the various barriers to entry that exist in society as a whole. Whether it be lack of opportunity, lack of wealth, or the absence of a voice, social media reestablishes the author/reader relationship as a real-time dialogue.

The rapid rate of innovation in the social media market seems to be accelerating, due in part to services like Heroku and Heroku Garden, which let an application be built quickly on top of the Twitter API.

Sometimes when life is coming at you so fast, you can miss quite obvious things. Today I realized that the innovation happening online right now is unprecedented. All across the world, people are connecting through all sorts of innovative ideas sprung from the social media platform. The most exciting part is that this is just the beginning.

I predict that a company's social media standing will have an undeniable effect on the rapport it holds with its customers. That is why AT&T will be the last company to give in to social media, the top dogs in the auto industry won't even get a chance to take a swing, and @oprah will turn into a phenomenon.

URL Shorteners: What is all of the fuss about?
By Joshua Odmark (http://simplyideas.com), April 21, 2009

Last week there was a plethora of coverage of URL shorteners, just after bit.ly landed $2 million in VC funding.

When it was announced that bit.ly landed some investment money, the peanut gallery came alive.

As is my custom, I began to research URL shorteners (because I enjoy improving upon new ideas). I quickly realized how easy it is to create one. I may have even created one myself, who knows. But more to the point, taking a long URL and shortening it is a rather straightforward concept: create a short hash that is tied to the original long URL in a database row with an auto-incrementing id, and boom, you have a URL shortener. If you're running Apache, a simple rewrite rule will achieve the redirection.
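Here is a minimal sketch of that idea in Python with SQLite. One common variant, shown below, derives the short code from the auto-incrementing id (base 36) instead of generating a random string; the domain is a placeholder. On the Apache side, the redirection would be a rewrite along the lines of RewriteRule ^([0-9a-z]+)$ /redirect.php?h=$1, handing the hash to a lookup script.

import sqlite3
import string

ALPHABET = string.digits + string.ascii_lowercase  # base-36 digits

db = sqlite3.connect("shortener.db")
db.execute("CREATE TABLE IF NOT EXISTS urls "
           "(id INTEGER PRIMARY KEY AUTOINCREMENT, long_url TEXT)")

def encode(n):
    # Turn an integer id into a short base-36 code like "t7x".
    chars = []
    while True:
        n, r = divmod(n, 36)
        chars.append(ALPHABET[r])
        if n == 0:
            break
    return "".join(reversed(chars))

def shorten(long_url):
    cur = db.execute("INSERT INTO urls (long_url) VALUES (?)", (long_url,))
    db.commit()
    return "http://example-shortener.com/" + encode(cur.lastrowid)

def expand(code):
    # Decode the base-36 code back to the id and look up the long URL.
    n = 0
    for c in code:
        n = n * 36 + ALPHABET.index(c)
    row = db.execute("SELECT long_url FROM urls WHERE id = ?", (n,)).fetchone()
    return row[0] if row else None

print(shorten("http://www.example.com/some/very/long/article-url"))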

As you can see, in the space of a single day you can create a rather robust service that is now in high demand due to the popularity of Twitter, which is now mainstream thanks to @Oprah.

When there is no barrier to entry, you can expect competition to grow like wildfire, which is exactly what has happened. I counted over 90 active URL shorteners in my research.

It was only a matter of time before a few of them stepped out from the crowd with additional features, which is exactly what cli.gs, bit.ly, tr.im, and my favorite, digg.com, have done. I like all of the URL shorteners listed, but Digg was the first to truly release an innovative feature, the Digg Bar. Everything else lacks differentiation.

A majority of SEO gurus and webmasters have expressed concern over URL shorteners because they essentially add a step to the peer-to-peer relationship of a user posting a link and a visitor clicking it. Furthermore, because the link has been shortened, the visitor can no longer mouse over it to see where it points. I used to live by a strict code of ethics on the Internet: if I couldn't tell where a link was pointing, I wouldn't click on it. I have been forced to adapt, because the only links you find on Twitter are the shortened ones. The risk is clicking a link that has some sort of malicious intent; the last thing I need is a hijacked computer due to some new super-virus.
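If you want the mouse-over information back, you can ask for it yourself: issue the request and read the final destination before opening the link in a browser. A minimal standard-library sketch (the short URL is taken from an earlier post's examples and may no longer resolve):

import urllib.request

def resolve(short_url):
    # urllib follows redirects automatically, so geturl()
    # returns the final destination of the shortened link.
    req = urllib.request.Request(short_url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.geturl()

print(resolve("http://9mp.com/t7x"))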

Once you get past the fear of not knowing where the link will take you and accept that the relationship you have with the creator of the shortened URL is reliable, you can begin to enjoy all of the resources Twitter can bring to you (with the right followers).

Speaking of which, I saw @aplusk on @Oprah, and I was thoroughly impressed with how he explained his relationship with Twitter. I am probably one of the few people who would ever quote a celebrity, but if I feel it is warranted, I will. When Ashton spoke of how he has a direct relationship with his followers that allows him to communicate with them directly, I saw an instant connection. If we analyze that connection, it is a relationship of sharing pieces of information, and limiting yourself to 140 characters forces you to get to the point of what you would like to say.

I am following a number of people on my Twitter account @JoshuaOdmark, many of whom post things I enjoy reading. @mattcutts generally posts extremely valuable information in his tweets, including outbound links; the majority of the time I am bookmarking or adding things to my Google account based on these links. Social media seems to be an extension of the same power blogs bestowed upon the blogger. It creates a direct relationship, which completely solves the age-old problem expressed best by the "Telephone Game": create a chain of 10 people, whisper something from one end of the line to the other, and it will not come out the same, guaranteed.

The value of a direct relationship holds even if it is one-sided: I can see @mattcutts's Twitter updates while he cannot see mine, yet the combination of his authority on webmaster topics and my desire for that knowledge creates an invaluable exchange.

Imagine, if you will, that the medium in which we communicate, in this example Twitter, could manage these relationships in such a way as to bring this authoritative information to the masses. It is involved from start to finish; all it takes is a little imagination and innovation to see the bigger picture.

URL shorteners are no different.

A grand innovation will occur in this market, you can bet on it.

Landing Page Optimization Breakthrough
By admin, March 26, 2009

I came across this video discussing what appears to be a real breakthrough in landing page optimization.

It sounds like this new tool from OnDialog will move the development of optimized landing pages from the IT department to the marketing department. With tools in the marketplace like Omniture's Test&Target (formerly Offermatica) and Google's Website Optimizer, there are some heavy hitters that OnDialog will have to compete with. The beta launched this week, so let's keep an eye on this and see whether the competition comes out with similar tools that let the marketer get new landing pages up and running without much technical skill.

Why your website needs a sitemap and how you should build it.
By Joshua Odmark (http://simplyideas.com), February 26, 2009

First and foremost, there is a website that describes in thorough detail how to build a sitemap. It is located at http://sitemaps.org/.

The purpose of this post is to boil their "protocol" (that is, their instructions) down to a short and sweet version with commentary.

These instructions are intended for sites with fewer than 50,000 web pages.

You will be creating and/or editing three files.

  1. sitemap_index.xml
  2. sitemap.xml
  3. robots.txt

All of these files will be located in your website’s root directory. This is generally the same place that all of your public Internet files are located (commonly referred to as ‘public_html’).

First, we will create our sitemap_index.xml file.

The proper formatting for this file is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<sitemap>
<loc>http://www.your-website-domain.com/sitemap.xml</loc>
<lastmod>2009-02-23</lastmod>
</sitemap>
</sitemapindex>

The above example is referred to as a Sitemap Index. Its purpose is to let you include multiple sitemaps for your website, which is useful for sites that want to use several sitemaps, such as a video sitemap or any other type.

Open up your favorite HTML editor and paste the above into it (I prefer Dreamweaver on PC and Mac, and TextMate on Mac). Make sure to change "www.your-website-domain.com" to your own website domain.

Once you have done this, save the file as “sitemap_index.xml“.

Now you are ready to create your sitemap.xml file.

The formatting for this file is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>http://www.your-website-domain.com/</loc>
<lastmod>2009-01-01</lastmod>
<changefreq>daily</changefreq>
<priority>0.8</priority>
</url>
<url>
<loc>http://www.your-website-domain.com/contact.php</loc>
<lastmod>2009-01-01</lastmod>
<changefreq>daily</changefreq>
<priority>0.6</priority>
</url>
</urlset>

You will notice that there are two <url></url> entries in the above code, indicating two URLs in this sitemap. One has a priority of 0.8 and the other a priority of 0.6. The higher the priority, the more importance the search engine is supposed to put on the URL (though it is rumored, and common belief in the industry, that this has no effect on SERPs).

Duplicate the values between the <url></url> tags (changing the appropriate information) until all of your website's pages have been included in this file. This can also be auto-generated with a programming script, as sketched below.
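Here is a minimal sketch of such a script in Python. The page list, dates, and priorities are placeholders; in practice you would pull them from a crawl of your site or a query against your CMS.

from xml.sax.saxutils import escape

# (URL, last modified date, change frequency, priority)
pages = [
    ("http://www.your-website-domain.com/", "2009-01-01", "daily", "0.8"),
    ("http://www.your-website-domain.com/contact.php", "2009-01-01", "daily", "0.6"),
]

entries = []
for loc, lastmod, changefreq, priority in pages:
    entries.append(
        "<url><loc>%s</loc><lastmod>%s</lastmod>"
        "<changefreq>%s</changefreq><priority>%s</priority></url>"
        % (escape(loc), lastmod, changefreq, priority)
    )

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)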

Save this file as “sitemap.xml“.

Now create a "robots.txt" file, which is a simple text file.

You can use Notepad (PC) or TextEdit (Mac) for this.

If your website already has a robots.txt, then you will simply need to add the following line to your file.

Sitemap: http://www.your-website-domain.com/sitemap_index.xml

Not sure if you already have a robots.txt file? Type this into your web browser: http://www.your-website-domain.com/robots.txt

If you do not receive a 404 error from your website, then you already have a robots.txt file.

The above code tells the search engines where your sitemap index file is located. From there, they will pull all of your sitemaps in one shot.

The last step is optional, for those who are familiar with file compression.

Search engines allow you to compress your large sitemap files with GZIP. The command is simple: in Terminal (Mac), type "gzip sitemap.xml" from the directory the sitemap is in. For Windows users, here is GZIP.
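If you would rather stay platform-independent, the same compression can be done in a few lines of Python:

import gzip
import shutil

# Compress sitemap.xml into sitemap.xml.gz.
with open("sitemap.xml", "rb") as src, gzip.open("sitemap.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)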

This will create sitemap.xml.gz, a compressed version of your sitemap. You must then update the <loc> entry in sitemap_index.xml to point at the new file name, changing "sitemap.xml" to "sitemap.xml.gz".

The obvious benefit to this is less bandwidth usage when search engine spiders download your sitemap. For one of my websites, I compressed a 7MB file down to 400KB using GZIP.

Once you have done this, upload your files to the “public_html” folder, and you are all done!

To verify that they are there, you may check via a web browser for these files:

http://www.your-website-domain.com/robots.txt
http://www.your-website-domain.com/sitemap_index.xml
http://www.your-website-domain.com/sitemap.xml

Class C IP Explained – What you need to know!
By Joshua Odmark (http://simplyideas.com), February 18, 2009

It is important to understand exactly what a Class C IP is and why it matters.

One of the things that I always preach when it comes to SEO is to step back and look at it from a common-sense standpoint.

If three of your sites are using the exact same shared IP and they are linking to each other, it is conceivable that "someone" or "something" could program an algorithm to decrease the value of links between sites on the same IP. Take that a step further: if the sites are merely in the same IP range, sharing the first three octets with only the last octet (0-255) differing, the same thing could potentially apply.

This is where the Class C IP Range comes into play.

If you use the same host for multiple sites, I suggest using this simple tool to check whether your sites are on the same Class C IP. You could potentially be losing link authority due to this.

http://www.webrankinfo.com/english/tools/class-c-checker.php

It is generally cheap to purchase a static IP. Our host, http://www.properhosting.com, offers us IPs for $1. However, a "reason" must be given as to why the IP is needed; generally they are only distributed for SSL connections. Contact your host for more information about obtaining a static IP for your website.

Here is the Class C IP explained:

An IPv4 address has four sets of numbers (octets), each ranging from 0 to 255.

Example: 74.54.139.178

The CLASS C IP is: 74.54.139

It is the first three octets.

So if the following sites have these IP addresses:

SITE A 74.54.139.178

SITE B 74.54.139.43

They have the same CLASS C IP.
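A two-line check makes the definition concrete; this minimal sketch performs the same comparison as the checker tool linked above.

def class_c(ip):
    # Two IPs share a Class C range when their first three octets match.
    return ".".join(ip.split(".")[:3])

site_a = "74.54.139.178"
site_b = "74.54.139.43"

print(class_c(site_a))                     # 74.54.139
print(class_c(site_a) == class_c(site_b))  # True: same Class C range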

If I owned both SITE A and SITE B, I would request a different Class C IP from my web hosting company. They will know exactly what you mean when you make this request; chances are this isn't the first time they have heard it.

If you have any further questions, please feel free to leave a comment and I will respond to them.

DRTV and SEO
By Gideon Rubin (http://www.simplyideas.com), February 12, 2009

I was reading some recent articles about DRTV and how to set up and manage a successful campaign. It is important to have both your SEM and SEO set up before you launch your DRTV campaign. If you do this correctly, your unique URLs and branded search will generate the actions you desire instead of sending visitors to your competition.

After experiencing the pain of manually managing over 800 unique DRTV URLs for an education campaign, I came to the conclusion that there must be a better way. There wasn't, so we built one. We now have the ability to manage, track, and SEO-optimize hundreds of unique URLs with our specialized software. It allows the marketer not only to build these sites easily but also to push updates and manage all the pieces end to end without much technical knowledge. Best of all, it helps drive highly targeted search visitors to your offer so they convert.

Has anyone else found helpful tools for the convergence of DRTV with online media?

Analyze Your Competitors
By Joshua Odmark (http://simplyideas.com), February 12, 2009

If you're wondering what your competitors are doing that you are not, spend some time analyzing their practices.

A great place to start is analyzing their traffic at http://www.compete.com. The nice thing about this website is that it lets you compare your website with multiple competitors, side by side.

The resulting graph displays a plethora of information: visitor trends, top search keywords, subdomains, trusted domains, visitor engagement, and velocity of growth.

This report is a great way to quickly view trends of your site as compared to your top competitors.

[Screenshot: Compete.com comparison report]

Our own organization uses this chart to determine whether a competitor has launched a major campaign; if they have, we may choose to compete against that particular campaign or to reduce our spend in a related one.

Quantcast.com is another gem that can help you analyze your website.

I won't get into HOW quantcast.com has the data that it has, but there are definitely some potential privacy issues with the information it obtains through ISPs. Since that is not the topic of this particular post, I digress!

Here is an example of a single report generated by quantcast.com for one of our websites:

[Screenshot: Quantcast report]

What surprises me about this report is that we have not placed the Quantcast tracking software on our website; it calculated that information through its own sources. "Its own sources" is apparently hundreds of ISPs throughout the nation.

They do offer a piece of tracking software that can be placed on your website that will most likely improve the accuracy of these statistics.

It should be noted that we did not opt into Quantcast.com so that it would track our website.
