E-Commerce Midterm
Fareza Novandro R.
0910223109
see and feel
Sunday, 15 April 2012
Yahoo's Guidelines for Webmasters
How do I add my site to your search engine database?
Yahoo! Search crawls the web every 2-4 weeks and automatically finds new content for indexing. If pages already in the Yahoo! Search index link to your site, it will be considered for inclusion in the next update of the index. Getting your site listed in major directory services such as the Yahoo! Directory and DMOZ is an excellent way to be sure that there are links to your site.
Pages Yahoo! Wants Included in Its Index
* Original and unique content of genuine value
* Pages designed primarily for humans, with search engine considerations secondary
* Hyperlinks intended to help people find interesting, related content, when applicable
* Metadata (including title and description) that accurately describes the contents of a Web page
* Good Web design in general
Unfortunately, not all Web pages contain information that is valuable to a user. Some pages are created deliberately to trick the search engine into offering inappropriate, redundant or poor-quality search results; this is often called "spam." Yahoo! does not want these pages in the index.
What Yahoo! Considers Unwanted
* Pages that harm accuracy, diversity or relevance of search results
* Pages dedicated to directing the user to another page
* Pages that have substantially the same content as other pages
* Sites with numerous, unnecessary virtual hostnames
* Pages in great quantity, automatically generated or of little value
* Pages using methods to artificially inflate search engine ranking
* The use of text that is hidden from the user
* Pages that give the search engine different content than what the end-user sees
* Excessively cross-linking sites to inflate a site's apparent popularity
* Pages built primarily for the search engines
* Misuse of competitor names
* Multiple sites offering the same content
* Pages that use excessive pop-ups, interfering with user navigation
* Pages that seem deceptive, fraudulent or provide a poor user experience
How do I improve the ranking of my web site in the search results?
In general, make sure you have unique, targeted content for your audience. Here are a few tips that can help your page be found by a focused search on the Internet:
* Think carefully about the key terms that your users will search on to find content like yours. Use those terms to guide the text and construction of your page.
* Users are more likely to click a link if the title matches their search. Choose terms for the title that match the concept of your document.
* Use a "description" meta-tag and write your description accurately and carefully. After the title, the description is the most important draw for users. Make sure the document title and description attract the interest of the user but also fit the content on your site.
* Use a "keyword" meta-tag to list key words for the document. Use a distinct list of keywords that relate to the specific page on your site instead of using one broad set of keywords for every page.
* Keep relevant text and links in HTML. Placing them in graphics or image maps means search engines can't always search for the text, and the crawler can't follow links to your site's other pages. An HTML site map, with a link from your welcome page, can help make sure all your pages are crawled.
* Use ALT text for graphics. It's good page design to accommodate text browsers or visually impaired visitors, and it helps improve the text content of your page for search purposes. (A markup sketch covering these title, description, keyword, and ALT-text tips follows this list.)
* Correspond with webmasters and other content providers and build rich linkages between related pages. Note: "Link farms" create links between unrelated pages for no reason except to increase page link counts. Using link farms violates Yahoo!'s Site Guidelines and will not improve your page ranking.
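To make the title, description, keyword, and ALT-text tips above concrete, here is a minimal HTML sketch. The page title, meta values, and file names are hypothetical examples, not values taken from Yahoo!'s guidelines:

    <html>
    <head>
      <!-- Title terms should match the concepts users search for. -->
      <title>Handmade Ceramic Mugs - Studio Terra</title>
      <!-- After the title, the description is the biggest draw for users. -->
      <meta name="description" content="Handmade ceramic mugs, glazed and fired in small batches.">
      <!-- Keywords specific to this page, not one broad list reused site-wide. -->
      <meta name="keywords" content="ceramic mugs, handmade pottery, stoneware">
    </head>
    <body>
      <!-- ALT text keeps the image's content searchable and accessible. -->
      <img src="mug.jpg" alt="Blue stoneware mug with a curved handle">
      <!-- A plain HTML text link the crawler can follow. -->
      <a href="sitemap.html">Site map</a>
    </body>
    </html>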
References: Google, Nanang Suryadi
Google Webmaster Guidelines
Design and content guidelines
- Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
- Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages. (A sketch of such a site map follows this list.)
- Keep the links on a given page to a reasonable number.
- Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
- Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
- Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.
- Make sure that your <title> elements and ALT attributes are descriptive and accurate.
- Check for broken links and correct HTML.
- If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
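As a concrete illustration of the site-map tip above, here is a minimal sketch of an HTML site map page; the file and page names are hypothetical:

    <!-- sitemap.html: static text links to the important parts of the site.
         Link to this page from the welcome page so every page stays reachable. -->
    <ul>
      <li><a href="products.html">Products</a></li>
      <li><a href="support.html">Support</a></li>
      <li><a href="about.html">About Us</a></li>
      <li><a href="contact.html">Contact</a></li>
    </ul>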
Technical guidelines
- Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
- Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
- Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
- Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools. (A sketch of a robots.txt file follows this list.)
- Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
- If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
- Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
- Test your site to make sure that it appears correctly in different browsers.
- Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve. Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.
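To illustrate the robots.txt advice above, here is a minimal sketch; the directory names are hypothetical and would need to be adapted to your own site:

    # robots.txt - served from the root of the web server.
    # These rules apply to all crawlers, including Googlebot.
    User-agent: *
    # Keep auto-generated, low-value pages out of the index.
    Disallow: /search-results/
    Disallow: /tmp/
    # Everything not disallowed above may be crawled.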
References: Google, Nanang Suryadi
Sunday, 11 March 2012
The Development of the Internet in Indonesia, 2012
The rapid development of technology has been matched by equally rapid growth of the Internet. This can be seen in the range of devices now available for going online, such as laptops and mobile phones that can access the Internet quickly, supported by the data packages offered by service providers to keep connections running smoothly. The Internet itself has grown rapidly and is now part of everyday life, both as a means of seeking knowledge and as a channel through which companies market their products.
The rapid spread of the Internet has, directly or indirectly, changed the foundations of human life. Distance and time are no longer obstacles: to obtain information there is no need to wait, since with a single click whatever we want appears instantly (in real time). Old ways of doing things are changing and adapting. Meetings that once had to take place physically can now happen virtually, without reducing the substance or purpose of the relationship. The Internet is now widely used to support electronic and online activities, among them learning (e-learning), buying and selling (e-commerce), banking transactions (e-banking), and many more.
Indonesia is a country where the Internet is developing quite rapidly. The Internet has become part of the lifestyle of every layer of society, and social media such as Facebook, Twitter, and blogs are part of daily activity, supported by the state's guarantee of freedom of opinion and creativity. The supporting infrastructure is fairly complete and up to date. Access speeds keep rising; networks are now moving toward the fourth generation, with download speeds reaching 20 Mbps, so it is no surprise that video streaming and watching TV broadcasts online work quite well.
Internet service in Indonesia has also become cheap: a plan costing around Rp 20,000 buys fast Internet access, in sharp contrast with the early 2000s, when Internet use in Indonesia was still expensive and Internet cafés were rare.
Source:
http://dumalana.com/2012/02/25/perkembangan-internet-yang-pesat-perlu-dioptimalkan/comment-page-1/
HTML5
From Wikipedia
HTML5 is a language for structuring and presenting content for the World Wide Web, and is a core technology of the Internet originally proposed by Opera Software. It is the fifth revision of the HTML standard (created in 1990 and standardized as HTML4 as of 1997) and as of March 2012 is still under development. Its core aims have been to improve the language with support for the latest multimedia while keeping it easily readable by humans and consistently understood by computers and devices (web browsers, parsers, etc.). HTML5 is intended to subsume not only HTML 4, but XHTML 1 and DOM Level 2 HTML as well.
Following its immediate predecessors HTML 4.01 and XHTML 1.1, HTML5 is a response to the observation that the HTML and XHTML in common use on the World Wide Web are a mixture of features introduced by various specifications, along with those introduced by software products such as web browsers, those established by common practice, and the many syntax errors in existing web documents. It is also an attempt to define a single markup language that can be written in either HTML or XHTML syntax. It includes detailed processing models to encourage more interoperable implementations; it extends, improves and rationalises the markup available for documents, and introduces markup and application programming interfaces (APIs) for complex web applications. For the same reasons, HTML5 is also a potential candidate for cross-platform mobile applications. Many features of HTML5 have been built with the consideration of being able to run on low-powered devices such as smartphones and tablets. In December 2011, research firm Strategy Analytics forecast that sales of HTML5-compatible phones would top 1 billion in 2013.
In particular, HTML5 adds many new syntactical features. These include the new <video>, <audio> and <canvas> elements, as well as the integration of Scalable Vector Graphics (SVG) content that replaces the uses of generic <object> tags. These features are designed to make it easy to include and handle multimedia and graphical content on the web without having to resort to proprietary plugins and APIs. Other new elements, such as <section>, <article>, <header> and <nav>, are designed to enrich the semantic content of documents. New attributes have been introduced for the same purpose, while some elements and attributes have been removed. Some elements, such as <a>, <cite> and <menu>, have been changed, redefined or standardized. The APIs and document object model (DOM) are no longer afterthoughts, but are fundamental parts of the HTML5 specification. HTML5 also defines in some detail the required processing for invalid documents so that syntax errors will be treated uniformly by all conforming browsers and other user agents.
History
The Web Hypertext Application Technology Working Group (WHATWG) began work on the new standard in 2004, when the World Wide Web Consortium (W3C) was focusing future developments on XHTML 2.0, and HTML 4.01 had not been updated since 2000. In 2009, the W3C allowed the XHTML 2.0 Working Group's charter to expire and decided not to renew it. W3C and WHATWG are currently working together on the development of HTML5.
Even though HTML5 has been well known among web developers for years, it became a topic of mainstream media in April 2010, after Apple Inc.'s then-CEO Steve Jobs issued a public letter titled "Thoughts on Flash" in which he concluded that "[Adobe] Flash is no longer necessary to watch video or consume any kind of web content" and that "new open standards created in the mobile era, such as HTML5, will win". This sparked a debate in web development circles, with some suggesting that while HTML5 provides enhanced functionality, developers must consider the varying browser support of the different parts of the standard as well as other functionality differences between HTML5 and Flash. In early November 2011, Adobe announced that it would discontinue development of Flash for mobile devices and reorient its efforts toward developing tools utilizing HTML5.
Standardization process
The Mozilla Foundation and Opera Software presented a position paper at a World Wide Web Consortium (W3C) workshop in June 2004, focusing on developing technologies that are backwards compatible with existing browsers, including an initial draft specification of Web Forms 2.0. The workshop concluded with a vote, 8 for and 14 against, on continuing work on HTML. Later that month, work based upon that position paper moved to the newly formed Web Hypertext Application Technology Working Group (WHATWG), and a second draft, Web Applications 1.0, was also announced. The two specifications were later merged to form HTML5.
The HTML5 specification was adopted as the starting point of the work of the new HTML working group of the W3C in 2007. This working group published the First Public Working Draft of the specification on 22 January 2008. Parts of HTML5 have been implemented in browsers despite the whole specification not yet having reached final Recommendation status.
According to the original W3C timetable, it was estimated that HTML5 would reach W3C Recommendation by late 2010, after a Last Call in 2008. However, the First Public Working Draft estimate was missed by eight months, and Last Call was only reached in 2011.
On 14 February 2011, the W3C extended the charter of its HTML Working Group with clear milestones for HTML5. In May 2011, the working group advanced HTML5 to "Last Call", an invitation to communities inside and outside W3C to confirm the technical soundness of the specification. The W3C is developing a comprehensive test suite to achieve broad interoperability for the full specification by 2014, which is now the target date for Recommendation.
The criterion for the specification becoming a W3C Recommendation is "two 100% complete and fully interoperable implementations". In an interview with TechRepublic, Ian Hickson guessed that this would occur in the year 2022 or later. However, many parts of the specification are stable and may be implemented in products.
The WHATWG made a Last Call for its HTML5 specification in October 2009. Then, in December 2009, the WHATWG switched to an unversioned development model for the HTML specification, effectively abandoning its HTML5 project but keeping the name "HTML5". In January 2011, the WHATWG renamed its "HTML5" living standard to simply "HTML". The W3C nevertheless continues its project to release HTML5.
As of January 2012, the specification is in Working Draft state at the W3C. Ian Hickson of Google is the editor of HTML5.
Markup
HTML5 introduces a number of new elements and attributes that reflect typical usage on modern websites. Some of them are semantic replacements for common uses of generic block (<div>) and inline (<span>) elements, for example <nav> (website navigation block), <footer> (usually referring to bottom of web page or to last lines of HTML code), or <audio> and <video> instead of <object>. Some deprecated elements from HTML 4.01 have been dropped, including purely presentational elements such as <font> and <center>, whose effects are achieved using Cascading Style Sheets. There is also a renewed emphasis on the importance of DOM scripting in Web behavior.
The HTML5 syntax is no longer based on SGML despite the similarity of its markup. It has, however, been designed to be backward compatible with common parsing of older versions of HTML. It comes with a new introductory line that looks like an SGML document type declaration, <!DOCTYPE html>, which triggers the standards-compliant rendering mode. As of 5 January 2009, HTML5 also includes Web Forms 2.0, a previously separate WHATWG specification.
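A minimal sketch of an HTML5 document, using the new doctype and a few of the semantic and media elements described above (the content and file names are hypothetical examples):

    <!DOCTYPE html>
    <html>
    <head>
      <meta charset="utf-8">
      <title>HTML5 example</title>
    </head>
    <body>
      <header><h1>My Site</h1></header>
      <nav><a href="index.html">Home</a></nav>
      <article>
        <p>Article text goes here.</p>
      </article>
      <!-- Native media playback, with no proprietary plugin required. -->
      <video src="clip.webm" controls></video>
      <footer>Last updated March 2012</footer>
    </body>
    </html>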
New APIs
In addition to specifying markup, HTML5 specifies scripting application programming interfaces (APIs). Existing document object model (DOM) interfaces are extended and de facto features documented. There are also new APIs, such as:
- The canvas element for immediate-mode 2D drawing (see the Canvas 2D API Specification 1.0; a JavaScript sketch follows this list)
- Timed media playback
- Offline Web Applications
- Document editing
- Drag-and-drop
- Cross-document messaging
- Browser history management
- MIME type and protocol handler registration
- Microdata
- Web Storage, a key-value pair storage framework that provides behaviour similar to cookies but with larger storage capacity and an improved API
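To illustrate two of the APIs in this list, here is a short JavaScript sketch; it assumes a page containing a hypothetical <canvas id="c"> element:

    // Immediate-mode 2D drawing on a <canvas id="c" width="150" height="100"> element.
    var canvas = document.getElementById('c');
    var ctx = canvas.getContext('2d');
    ctx.fillStyle = 'blue';
    ctx.fillRect(10, 10, 100, 50); // x, y, width, height

    // Web Storage: key-value pairs that persist across page loads,
    // with far more capacity than cookies.
    var visits = Number(localStorage.getItem('visits') || 0) + 1;
    localStorage.setItem('visits', visits);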
Not all of the above technologies are included in the W3C HTML5 specification, though they are in the WHATWG HTML specification. Some related technologies, which are not part of either the W3C HTML5 or the WHATWG HTML specification, are as follows. The W3C publishes specifications for these separately:
- Geolocation
- Web SQL Database, a local SQL database (no longer maintained)
- The Indexed Database API, an indexed hierarchical key-value store (formerly WebSimpleDB)
- File API, for handling file uploads and file manipulation
- Directories and System, an API intended to satisfy client-side-storage use cases not well served by databases
- File Writer, an API for writing to files from web applications
HTML5 alone cannot provide animation within web pages. Either JavaScript or CSS3 is necessary for animating HTML elements. Animation is also possible using JavaScript and HTML 4, and within SVG elements through SMIL, although browser support of the latter remains uneven as of 2011.
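For example, a few lines of CSS3 are enough to animate an HTML element with no JavaScript at all. The class name below is a hypothetical example, and as of 2012 vendor-prefixed forms such as -webkit-animation may also be needed:

    /* Fades the element in and out indefinitely. */
    .pulse {
      animation: fade 2s infinite alternate;
    }
    @keyframes fade {
      from { opacity: 1; }
      to   { opacity: 0.2; }
    }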
XHTML5
XHTML5 is the XML serialization of HTML5. XML documents must be served with an XML Internet media type such as application/xhtml+xml or application/xml. XHTML5 requires XML's strict, well-formed syntax. The choice between HTML5 and XHTML5 boils down to the choice of a MIME/content type: the media type one chooses determines what type of document should be used. In XHTML5 the HTML5 doctype html is optional and may simply be omitted. HTML that has been written to conform to both the HTML and XHTML specifications—and which will therefore produce the same DOM tree whether parsed as HTML or XML—is termed "polyglot markup".
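A minimal sketch of such polyglot markup (the content is a hypothetical example):

    <!DOCTYPE html>
    <html xmlns="http://www.w3.org/1999/xhtml">
    <head>
      <meta charset="UTF-8"/>
      <title>Polyglot page</title>
    </head>
    <body>
      <!-- Explicit end tags and self-closed void elements keep the document
           well-formed XML while remaining valid HTML5. -->
      <p>This parses to the same DOM whether served as text/html or application/xhtml+xml.</p>
    </body>
    </html>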
Error handling
An HTML5 (text/html) browser will be flexible in handling incorrect syntax. HTML5 is designed so that old browsers can safely ignore new HTML5 constructs. In contrast to HTML 4.01, the HTML5 specification gives detailed rules for lexing and parsing, with the intent that different compliant browsers will produce the same result in the case of incorrect syntax. Although HTML5 now defines a consistent behavior for "tag soup" documents, those documents are not regarded as conforming to the HTML5 standard.
Popularity
According to a report released on 30 September 2011, 34 of the world's top 100 Web sites were using HTML5 – the adoption led by search engines and social networks.
Monday, 5 March 2012
Creating a backlink
Now I am learning to create a backlink for an SEO article: informasi kredit terbaik di indonesia
E-Commerce, 5 March 2012, International Class
Today I started learning E-Commerce, taught by Pak Nanang and Mbak Risca. Sitting next to me was Devi. We learned how to create blogs on blogger.com, WordPress, and the UB blog. We also learned about Google, Yahoo!, and Universitas Brawijaya email.