
Thread: My new website

  1. #21

    Join Date
    Jul 2005
    Posts
    953

    Re: My new website

    Quote Originally Posted by paul stimac View Post
    Thanks for all the advice.
    It's not made with frames but definitely by someone who knows nothing about building websites (me).
    You may not know it, but it is made using HTML framesets. The problem with framesets is that the page is made up of several different files which get individually indexed by search engines. A link from a search engine will then load one part of the page rather than the containing frameset, so the page won't display properly. However, there is JavaScript in your code which traps this and always redirects to the home page when only a partial page is loaded. That means that if a specific image is indexed and linked to from a search engine, that image is not what the user will see: they will get the home page. This is why frames are not recommended.
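    The frame-trapping script described here usually amounts to a few lines in each framed page. A hypothetical sketch (the file name and logic are illustrative, not taken from Paul's actual site):

    ```html
    <script type="text/javascript">
    // If this page was loaded on its own (e.g. straight from a search
    // result) rather than inside its parent frameset, send the visitor
    // to the home page, which rebuilds the full frameset.
    if (top === self) {
        top.location.href = "index.html";
    }
    </script>
    ```

    The side effect robc describes follows directly: whatever specific page the search engine linked to, the visitor ends up at the home page instead.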

    BUT, since your site contains no text, there is nothing for a search engine to index. And since the image file names are not meaningful, even Google Images won't show them in a search.

    In short, this means you will get roughly zero visits from search engines, so visits must be generated from somewhere else.

    The advice to have it rewritten is fine and dandy from a purist's point of view, BUT if you don't address the issue of indexing at the same time by building it with meaningful text on each page, including the image pages, then you will get zero return on investment. If the hits are to come from targeted traffic, standards and anally correct code won't make the slightest difference.

    So, before you get sold on the idea of dropping some money on having it redone, get some guarantees on return on investment. As far as I can see, you have already done the work, and a rewrite won't achieve anything unless it takes the site up a level or two in terms of generating hits.

    Another thing to consider is that having it rewritten in standards-compliant code probably means you won't be able to edit it in your current WYSIWYG editor without trashing the code. That means you would have to edit the raw files in a text editor. Are you happy to do that? If you make a mistake, you will need to understand the code to be able to spot the error.

  2. #22

    Join Date
    Dec 2005
    Location
    Southern California
    Posts
    2,736

    Re: My new website

    Quote Originally Posted by robc View Post
    The advice to have it rewritten is fine and dandy from a purist's point of view, BUT if you don't address the issue of indexing at the same time by building it with meaningful text on each page, including the image pages, then you will get zero return on investment. If the hits are to come from targeted traffic, standards and anally correct code won't make the slightest difference.

    So, before you get sold on the idea of dropping some money on having it redone, get some guarantees on return on investment. As far as I can see, you have already done the work, and a rewrite won't achieve anything unless it takes the site up a level or two in terms of generating hits.

    Another thing to consider is that having it rewritten in standards-compliant code probably means you won't be able to edit it in your current WYSIWYG editor without trashing the code. That means you would have to edit the raw files in a text editor. Are you happy to do that? If you make a mistake, you will need to understand the code to be able to spot the error.
    Standards-compliant CSS-driven code will make a world of difference in at least the following ways:

    - Make the site searchable

    - Make the site accessible to people with disabilities

    - Make the site bookmark-able

    - Make the site printable

    - Make every page 30-60% smaller byte-wise by separating all the styling and positioning instructions into a separate, central CSS file that gets loaded only once for the entire site. As a result, the entire site loads much faster, even for visitors on dial-up, and hence feels much more responsive.

    - Make the site easy to maintain: the individual pages contain only the structural (X)HTML, with no tables, font tags, spacer GIFs and whatnot. Even the latest Dreamweaver could handle such pages, if carefully optimized. Forget about FrontPage, of course.

    - Make site easy to update or refresh style-wise.

    - Last but not least, it will make the site look professional. There is really no more sense in trying to present top-notch photography on a sloppily written, buggy website than there would be in showcasing it in a newspaper.
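    As a rough illustration of the size and maintenance points above (the class name, colors and file name are invented): presentational markup gets repeated on every page, while the CSS version carries the styling once, in a central file.

    ```html
    <!-- Old way: presentation baked into every page -->
    <font face="Verdana" size="2" color="#333333">Gallery</font>

    <!-- Standards way: structural markup only in the page... -->
    <span class="gallery-label">Gallery</span>

    <!-- ...and one rule in a central stylesheet (site.css), loaded once:
         .gallery-label { font-family: Verdana, sans-serif; color: #333; } -->
    ```

    Change that one rule and every page on the site follows, which is where the maintenance saving comes from.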

    As for search engine optimization, anybody claiming they can guarantee search engine placement is basically selling you some prime beachfront property in Arizona.

    Search engines keep their methods secret and change their criteria all the time. The only thing that can get you on their good side is to make your code friendly to them.

  3. #23

    Re: My new website

    Robc,

    I'll try to add invisible text to the page so it can be indexed. Will that help? I'd do that for the images too, but they are not on separate pages. I do have an option to use multiple frames, which I didn't use. The images are all layered on one page; triggering a link shows the referenced graphic and hides the others. Maybe when the code is generated, frames are made and separate files for each image are created? I'll take a look at what files are being uploaded to see if that's happening.

    I'm also going to add information about the images and a short bio (as others have suggested). That was my original intent; I just didn't get that far yet. I'm also going to fix the picture quality. I thought the program I was using took care of that... I just found an option to do it manually.

  4. #24

    Join Date
    Jul 2005
    Posts
    953

    Re: My new website

    Quote Originally Posted by paul stimac View Post
    Robc,

    I'll try to add invisible text to the page so it can be indexed. Will that help? I'd do that for the images too, but they are not on separate pages. I do have an option to use multiple frames, which I didn't use. The images are all layered on one page; triggering a link shows the referenced graphic and hides the others. Maybe when the code is generated, frames are made and separate files for each image are created? I'll take a look at what files are being uploaded to see if that's happening.

    I'm also going to add information about the images and a short bio (as others have suggested). That was my original intent; I just didn't get that far yet. I'm also going to fix the picture quality. I thought the program I was using took care of that... I just found an option to do it manually.
    Hidden text is problematic because it can get you banned from search engines, although this is over-hyped. I've had one site banned, but it took Google three years to do it, and it was back within a week after removing the offending code. Since you are using frames, you can add a <noframes> section in which you can put some text. There are very, very few browsers out there which don't support frames, so what is in the noframes section won't be seen by more than an occasional visitor. The question is, do search engines index what is in a noframes section? Probably, but I'm not sure. However, it should NOT get you banned, providing you don't stuff it with word lists.
    BUT it is a better policy to make all text visible: if a user does a search and follows the link, they expect to see something about what they searched on. If they don't, they just leave the site because it wastes their time. In other words, show them what they looked for.
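    For reference, the noframes section sits inside the frameset document itself. A sketch of how it might look for a gallery site like this one (file names and wording are made up for illustration):

    ```html
    <frameset cols="200,*">
      <frame src="menu.html" name="menu">
      <frame src="main.html" name="main">
      <noframes>
        <body>
          <p>Paul Stimac - large format landscape photography portfolio.
             <a href="main.html">Enter the gallery</a>.</p>
        </body>
      </noframes>
    </frameset>
    ```

    Whatever goes in that body is what frames-incapable browsers, and possibly search engine spiders, see in place of the frameset.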

  5. #25
    Seattle photographer Photomax's Avatar
    Join Date
    Sep 2006
    Location
    Seattle
    Posts
    135

    Re: My new website

    Great points Marko.

    Search engine "spiders" work in several ways. Knowing how to exploit a few of their characteristics will go a long way towards achieving search engine "success."

    In no particular order:

    *Clean code: All HTML code flows downward, so the "spiders" move downwards as well. Having tons of useless code in the upper regions of the document makes it harder for the "spiders" to wade through it all. Lots of JavaScript and bloated table structure can also be a problem. Lean XHTML and an external CSS file will really help here.

    *Meta tags: proper meta tags and data really help

    *Proper page title: amazing how many sites screw this up

    *Appropriate H1 headlines and first paragraphs: commonly used fluff like "Welcome to my new website! Thanks for stopping by. Please enjoy your visit and contact me with any queries, etc." is totally useless. That first paragraph and headline are more important to the search engine than they are to the reader. Both need to state directly what the site is, what its goals are, and so on. For example: "Best Bug Photography: Chicago's best source for orange insect photography."

    In general though, frames do a lousy job at this. Flash design does a bad job as well: the content is not "searchable." Complicated table design is searchable, but all that extra code makes it more difficult. Tables and cells should be used for displaying tabulated data, NOT for page layout.
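    Pulling these points together, the top of a well-set-up page might look something like this (the site name, description and headline reuse the made-up "Best Bug Photography" example from above):

    ```html
    <head>
      <title>Best Bug Photography - Chicago Orange Insect Photography</title>
      <meta name="description"
            content="Chicago's best source for orange insect photography.">
    </head>
    <body>
      <h1>Best Bug Photography: Chicago's best source for orange insect photography</h1>
      <p>A direct, descriptive first paragraph about the site goes here,
         not "Welcome to my website" fluff.</p>
    </body>
    ```

    The title, meta description, H1 and first paragraph all reinforce the same terms, which is exactly what a spider reads first.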

    Cheers,

    Max

  6. #26
    Seattle photographer Photomax's Avatar
    Join Date
    Sep 2006
    Location
    Seattle
    Posts
    135

    Re: My new website

    Moving to Web Standards is becoming a slow-moving tidal wave. Sometimes it is the largest "computer savvy" companies that are the slowest to adopt.

    A recent book by Charles Wyke-Smith made a great point with an example.

    Just one snippet of code from Microsoft's page from a year ago:

    <table cellpadding="0" cellspacing="0" width="100%"
    height="19" border="0" ID="Table5">

    <tr>

    <td nowrap="true" id="homePageLink"><></td>

    <td><span class="ltsep">|</span></td>

    <td class="lt0" nowrap="true" onmouseenter="mhHover('localToolbar', 0*2+2, 'lt1')" onmouseleave="mhHover('localToolbar', 0*2+2, 'lt0')"><a href="http://go.microsoft.com/?linkID=508110">Home</a></td>

    <td><span class="ltsep">|</span></td>

    <td class="lt0" nowrap="true" onmouseenter="mhHover('localToolbar', 1*2+2, 'lt1')" onmouseleave="mhHover('localToolbar', 1*2+2, 'lt0')"><a href="http://go.microsoft.com/?linkID=317769">Subscribe</a></td>

    <td><span class="ltsep">|</span></td>

    <td class="lt0" nowrap="true" onmouseenter="mhHover('localToolbar', 2*2+2, 'lt1')" onmouseleave="mhHover('localToolbar', 2*2+2, 'lt0')"><a href="http://go.microsoft.com/?linkid=317027">Manage Your Profile</a></td>

    <td width="100%"></td>
    </tr>

    </table>


    All of this code produced just one row of three buttons. 75% of the characters are code bloat, mainly for making rollovers. All of this could have been styled via an external style sheet, making the page load faster, easier to maintain, and more "accessible" to browsers and other devices.

    Even Microsoft has seen the light. They have worked, and continue to work, on making their web content more Web Standards compliant, even though their popular browser, Internet Explorer 6, is a lousy standards-compliant browser (it's a bug-ridden mess!).

    Is all of this discussion straying too far from Paul's pictures and good photography? I don't think so. Photography is visual communication. The internet has become one of the most powerful and effective means of displaying this kind of content. Doing so efficiently is a worthy topic of study...

    Max

  7. #27

    Join Date
    Jul 2005
    Posts
    953

    Re: My new website

    Well, just for fun, I did a Google search on

    fine art photography

    None of the first 10 sites returned validates as valid HTML. The only one that came close was Wikipedia, and Wikipedia was the only one that attempted to use consistent coding.

    The other nine break all the rules we are told it is so important to obey. They have large amounts of embedded JavaScript, mix embedded attribute definitions and styles, and have anywhere from a dozen to 136 coding errors.

    Funny thing is that Google doesn't seem to care, because it puts them in the top 10 positions.

    At the same time, we are told that you won't get any guarantee of a high position. But if you don't have a high position, what point is there in spending a bucketload of money on a website that no one will see?

    Make up your own mind.

    Even this website doesn't validate, but it's number one for large format photography.

    Fact is, it's pure crap being spread as a marketing method for telling you how bad the competition is. Personally, I'd rather have a developer who can get me a page one position than some anal designer who is only capable of quoting standards and won't guarantee anything.

  8. #28

    Join Date
    Dec 2005
    Location
    Southern California
    Posts
    2,736

    Re: My new website

    Quote Originally Posted by robc View Post
    Fact is, it's pure crap being spread as a marketing method for telling you how bad the competition is. Personally, I'd rather have a developer who can get me a page one position than some anal designer who is only capable of quoting standards and won't guarantee anything.
    Well, some of us have witnessed firsthand how good your developer was a little while ago. And you keep reminding us just how good your copywriter is too...


  9. #29

    Join Date
    May 2006
    Location
    Kaneohe, Hawaii
    Posts
    1,390

    Re: My new website

    Quote Originally Posted by Photomax View Post
    Moving to Web Standards is becoming a slow moving tidal wave. Sometimes it is the largest "computer savvy" companies that are the slowest to adopt.

    A recent book by Charles Wyke-Smith made a great point with an example.

    Just one snippet of code from Microsoft's page from a year ago:

    <table cellpadding="0" cellspacing="0" width="100%"
    Keep in mind, Microsoft's website is huge, and it will take a long time to make it standards compliant. I can see this already: some parts of microsoft.com are, and some aren't.

    Older Microsoft tools, like Visual Studio 2003, allowed the user to override a lot of the newer standards. In newer versions, like Visual Studio 2005, XHTML Transitional is the default, although you can set it to "Strict", and it will give you an error if your code isn't compliant. That is one of the things I have struggled with, since I am more used to writing HTML the old-fashioned way. But the new version of my website will be much more compliant and CSS-driven.

    I'm not going to say that IE6 isn't very compliant; I will say that it is very forgiving. People talk a lot about compliance but don't realize that the standards the compliance is built upon are loose by design. Remember, the old HTML standard was defined by committee, and there is enough latitude in the old standards to allow this.

    The new Microsoft tools follow the newer standards by default, simply because they are based upon XML, which is much more strict than HTML. For example, in the old HTML world we put a line break in as <br>, but in the new XHTML standard it must be <br />. This is simply because every XML element must be closed.
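    Side by side, the difference looks like this (the img and p cases are my own additions extending the br example; XHTML requires all of them to be closed):

    ```html
    <!-- Old HTML: unclosed tags are tolerated -->
    <br>
    <img src="photo.jpg" alt="Photo">
    <p>First paragraph
    <p>Second paragraph

    <!-- XHTML: every element must be closed -->
    <br />
    <img src="photo.jpg" alt="Photo" />
    <p>First paragraph</p>
    <p>Second paragraph</p>
    ```

    Empty elements like br and img use the self-closing " />" form, while container elements get an explicit closing tag.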

  10. #30

    Join Date
    May 2006
    Location
    Kaneohe, Hawaii
    Posts
    1,390

    Re: My new website

    Quote Originally Posted by robc View Post
    None of the first 10 sites returned validate as valid html. The only one that came close was wikipedia. And wikipedia was the only one that attempted to use consistent coding.
    It takes time for things to change. A lot of people don't even know there is a new standard, and many still use older tools that generate old HTML.

    I think you are missing the real purpose behind the standards; they are primarily to make the site easier to maintain and to allow the user greater control over what they see. For example, moving table attributes out of the code, into a CSS file means that you can change the "look and feel" of all tables on the site, simply by changing one CSS file, not every instance of the <table> tag. See Marko's excellent post above for clarification.
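    A quick sketch of that point (the attribute values and CSS rules are invented for illustration): instead of repeating presentational attributes on every table, the pages stay plain and one central rule controls them all.

    ```html
    <!-- Before: attributes repeated on every table, on every page -->
    <table border="1" cellpadding="4" bgcolor="#eeeeee"> ... </table>

    <!-- After: plain markup in the pages... -->
    <table> ... </table>

    <!-- ...with the look defined once in the site-wide stylesheet:
         table { border: 1px solid #ccc; background: #eee; }
         td    { padding: 4px; }                              -->
    ```

    Edit those two rules and the "look and feel" of every table on the site changes at once, without touching a single page.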

    JavaScript is a red herring; most search engines ignore it anyway. FWIW, with the rise of technologies like AJAX you are going to see more JavaScript, not less.
