Wednesday, April 30, 2008

Week 8 - In Class Exercise

Studio Session: Server Side Authentication & Includes

Today we'll work on server-side authentication and includes.

On many web servers, you can create your own user and password file to restrict access to a web directory. On Gibson, access control is managed not via a user-created password file but instead using the campus-wide authentication system. If you password protect a directory using this method, your page(s) will have to be referenced using the https protocol, i.e. https://people.rit.edu/~abc1234/protected_directory/

To limit access to any and all RIT users with valid (DCE) login, use the following .htaccess syntax:

AuthType Basic
AuthName "RIT"
AuthBasicProvider ldap
SSLRequireSSL
AuthLDAPUrl ldaps://ldap.rit.edu/ou=people,dc=rit,dc=edu?uid?sub
AuthzLDAPAuthoritative off
require valid-user

To limit access to only specific RIT user(s), use the following .htaccess syntax, placing the usernames of the users who should have access (e.g. ellics or abc1234) in place of the "username1", "username2" examples.

AuthType Basic
AuthName "Your Description Here"
AuthBasicProvider ldap
SSLRequireSSL
AuthLDAPUrl ldaps://ldap.rit.edu/ou=people,dc=rit,dc=edu?uid?sub
require ldap-user username1 username2
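
Since this studio also covers server-side includes, here is a minimal SSI sketch. These are standard Apache directives, but whether Gibson lets you override them from .htaccess depends on the server's configuration, and the file names below are made up for illustration.

First, enable SSI processing for .shtml files in your .htaccess:

Options +Includes
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml

Then, inside a page saved with a .shtml extension, you can pull in a shared header and stamp the page with the server's date:

<!--#include virtual="/~abc1234/header.html" -->
<p>Generated on <!--#echo var="DATE_LOCAL" --></p>

The server processes the include before the page is sent, so the visitor receives ordinary HTML; this is handy for keeping a single copy of shared navigation across a site.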


Readings on SSA and SSI


Error documents

“In order to specify your own ErrorDocuments, you need to be slightly familiar with the server returned error codes. You do not need to specify error pages for all of these; in fact, you shouldn't. An ErrorDocument for code 200 would cause an infinite loop, whenever a page was found... this would not be good.” http://www.javascriptkit.com/howto/htaccess2.shtml

Edit your .htaccess file:

pico .htaccess

ErrorDocument 404 http://people.rit.edu/~jrhicsa/notfound.html

Press Ctrl-X to save your file and exit. I suggest you make a backup copy:

cp .htaccess htaccess
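
As a sketch of how this extends to other common codes, your .htaccess might define a small set of error pages (the file names here are hypothetical):

ErrorDocument 401 /~abc1234/unauthorized.html
ErrorDocument 403 /~abc1234/forbidden.html
ErrorDocument 404 /~abc1234/notfound.html
ErrorDocument 500 /~abc1234/servererror.html

401 means authentication failed, 403 means access is forbidden, 404 means the page was not found, and 500 is an internal server error. Using a local path (rather than a full http:// URL) serves the error page internally, keeping the original address in the visitor's browser instead of redirecting.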

Wednesday, April 9, 2008

Week 5 - April 9

DOM, JavaScript

This is where we switch gears from design to implementation. We'll start with client-side programming using JavaScript, which is used for everything from simple rollovers to complex calculators.

To make JavaScript work consistently, you need to understand the Document Object Model, or DOM, that the browser uses to represent objects to be manipulated (images, form fields, etc).
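
As a tiny sketch of what DOM access looks like (the id and image file names are made up for illustration), here is the building block behind a simple rollover: look an image up by its id, then change one of its properties.

<img id="logo" src="logo.gif" alt="Site logo">
<script type="text/javascript">
// Ask the DOM for the image node whose id is "logo"...
var logo = document.getElementById("logo");
// ...then assign a new src to swap the picture.
logo.src = "logo_over.gif";
</script>

The same look-up-then-modify pattern works for form fields, paragraphs, and anything else the browser has placed in the document tree.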

Readings on DOM and Basic Javascript

* Thau's JavaScript Tutorial, Lessons 1-3

* You know that some tags are nested inside others in an HTML webpage. In terms of a tree diagram, those nested tags would be considered "children" of the enclosing tag. Take a look at this LINK and sample code

Reference Sources:

* DevGuru Javascript Quick Reference Guide
* Visibone Javascript Reference

Images
* DOM
* HTML Tree
* Window Document HTML

DOM Tutorials

DOM stands for the Document Object Model, a way of representing a document using object-oriented programming. It gives you access to any part of the document you want through function calls. Within the DOM, all page elements are placed in a tree-like hierarchy: every HTML tag is a node within this tree, with child nodes and parent nodes, and every portion of text is its own DOM node as well.

Dynamic HTML (DHTML) is an extension of HTML that enables, among other things, the inclusion of small animations and dynamic menus in web pages. DHTML code makes use of style sheets and JavaScript. Learn what DHTML is, how it can be applied to web page elements, and how you can use it to create interactive web pages.
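
To make the "every tag and every text portion is a node" idea concrete, here is a small sketch that lists the immediate children of the body element once the page has loaded:

<script type="text/javascript">
window.onload = function () {
    // childNodes holds every child: element nodes AND text nodes
    var children = document.body.childNodes;
    var summary = "";
    for (var i = 0; i < children.length; i++) {
        // nodeType 1 = element, nodeType 3 = text
        summary += children[i].nodeName +
                   " (nodeType " + children[i].nodeType + ")\n";
    }
    alert(summary);
};
</script>

Even the whitespace between tags shows up as text nodes, which is why you need to check nodeType when you start walking the tree from a script.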

Monday, April 7, 2008

Week 5 & Week 6

Midterm is a project and description can be found in MyCourses

Week 6 - You should be working with your group members to get ready for your class presentation on Monday, April 21. There will be no "official" class during Week 6 (April 14 & 16).

Week 5 lecture goals are Accessibility & Usability, and robots.txt.

Web accessibility refers to the practice of making websites usable by people of all abilities and disabilities.

Using JAWS to Evaluate Web Accessibility
http://www.webaim.org/articles/jaws/#intro


http://www.webaim.org/intro/video.php




The Web Robots Pages

Web Robots (also known as Web Wanderers, Crawlers, or Spiders), are programs that traverse the Web automatically. Search engines such as Google use them to index the web content, spammers use them to scan for email addresses, and they have many other uses.

What /robots.txt is, and how to use it:
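
A minimal robots.txt sketch (the directory and robot names are hypothetical); the file must sit at the root of the site:

# Allow all robots everywhere except one directory
User-agent: *
Disallow: /private/

# Shut out one specific crawler entirely
User-agent: BadBot
Disallow: /

Keep in mind that /robots.txt is purely advisory: well-behaved crawlers such as search engine spiders honor it, but address-harvesting robots typically ignore it.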

Wednesday, April 2, 2008

Group Project Posting - Students please post a comment

The first group assignment will be to create a design document for the site, using the guidelines provided in the Webmonkey Information Architecture Tutorial.



Please post as a comment to this blog entry. Your posting should include the name and email address of each partner. Also include the URL of the site you are going to redesign and the client's contact information. Please post only one comment per group!

Add a short description of the project (i.e., is this a redesign or a new site?), and include how or why you picked this site.

Please provide as much contact information as you can about the client:
Client's name
Client's phone number
Client's email address
Client's current URL


April 21 - Design Document due (Draft 1; required: index page, About page, and navigation). You will present Draft 1 of your design document to the class.