What do you think of pre-compiling your CSS?

Compared to regular CSS

I found it took a while to get used to writing CSS the pre-compiled way compared to the regular way, probably mostly because I spent quite some time writing regular CSS in the HTML/CSS course at the start of the term. I do really see the advantage of it though: being able to write shorter, less repetitive code.


Which techniques did you use?

I used the SCSS files that came with the minima theme as a base and made changes to them to customise my page. I used my knowledge of regular CSS to change things in this new format. Only having to change a color or margin size in one place instead of several was a big change, but I think I got the hang of it eventually.
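As a minimal sketch of what that one-place change looks like, a value can be stored in a variable and reused everywhere (the variable names here are just examples, not necessarily the ones minima defines):

```scss
// Shared values defined once (illustrative names)
$brand-color: #2a7ae2;
$base-margin: 15px;

h1, h2 {
  color: $brand-color;       // change $brand-color once, both headings follow
}

.post-content {
  margin-bottom: $base-margin;
}
```

Editing `$brand-color` in one spot then updates every rule that uses it when the SCSS is compiled to regular CSS.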


Pros and cons?

The main pro of pre-compiling CSS is that it makes it much easier to give the website a consistent theme that runs through the whole site. This applies not just to the headlines, for example, but also to setting margins or text sizes in one place and then including them only for specific elements.
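That "define once, include for specific elements" idea is what Sass mixins are for. A hedged sketch, with illustrative selector and mixin names rather than anything from my actual stylesheet:

```scss
// A reusable block of declarations, defined once (illustrative names)
@mixin body-text($size: 16px) {
  font-size: $size;
  margin: 10px 0;
  line-height: 1.5;
}

.post-content {
  @include body-text;        // use the default size
}

.footnote {
  @include body-text(13px);  // same block, smaller size for this element
}
```

A con worth mentioning is the extra build step: the SCSS has to be compiled before the browser can use it, though a static site generator like Jekyll does this automatically.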


What do you think of static site generators?

I think static site generators are easy and fun to work with. They make a lot of things much easier than writing regular HTML. I also think being able to use markdown files for things like these blog posts results in code that is much cleaner and easier to read and write. I much prefer this to writing text in a p-element.

Personally, I also found this concept easier to understand and get into than pre-compiled CSS. (Not to mention Open Graph.)


What type of projects are they suitable for?

They are definitely suitable for projects like this, creating a fairly simple website with blog posts and some basic information.

I also think they are great for creating simple, clean websites that present a company or a person, where not much interaction or functionality is needed beyond giving information, or perhaps letting visitors comment and discuss through some sort of forum.


What is robots.txt and how have you configured it for your site?

Robots.txt is a simple text file used to tell robots (web crawlers) which parts of your site they may or may not access. In it you can specify whether the rules apply to all robots, as well as which content you want to, for example, disallow.

I chose to have a very simple robots.txt file that applies to all robots and disallows the files they should not access.
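A file along those lines could look something like this (the disallowed path is a made-up example, not necessarily what my file contains):

```
# Rules apply to all robots
User-agent: *
# Example of a path robots should not crawl
Disallow: /drafts/
```

`User-agent: *` targets every crawler, and each `Disallow` line lists a path they are asked to stay away from. An empty `Disallow:` would instead allow everything.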


What is humans.txt and how have you configured it for your site?

Humans.txt is a simple text file that provides information about the site. This is where you can find contact details for the creator(s) of the site, as well as whatever other information you want to give about it, such as when the site was last updated and what languages it is written in.

My humans.txt is quite short since it only contains information about myself and the site. I chose to keep it like this since there was not much more information to give.
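A short humans.txt of that kind could look something like the following. All names and dates here are placeholders for illustration, following the common section style from humanstxt.org:

```
/* TEAM */
Author: Jane Doe
Contact: jane [at] example.com

/* SITE */
Last update: 2024/01/15
Language: English
Standards: HTML5, CSS3
```

Unlike robots.txt, this file has no machine-readable effect; it is purely for the humans who are curious about who built the site.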