Figure 1. Wireframe of 3D model.

In our second Shader Adventure we will use a plane's normal to create a basic holographic effect. We start with a very simple shader that paints the planes of a 3D model according to the value of the dot product between a plane's normal and the view direction. Planes facing the viewer, whose normals are parallel to the view direction (in our model these are located in the middle part of Fig. 1), will be painted white. Planes whose normals are perpendicular to the view direction, and which in fact cannot be seen, will be marked red. Any orientation in between, from 0 to 90 degrees, will yield a reddish colour: we will make the green and blue channel values depend on the angle between a plane's normal and the view direction. To demonstrate this effect I will use the model presented in Fig. 1; the triangles represent the model's planes.

So go to GitHub and grab the base shader, or generate an Unlit shader in Unity3D (I am currently using Unity3D version 2017.2.1p4). First let's make the changes to the shader code, and after that we will discuss their effects. Start with:

 - in AppData struct add: float3 normal : NORMAL;
 - in v2f struct add: float3 worldNormal : NORMAL;
 - in vertex shader add: o.worldNormal = UnityObjectToWorldNormal(v.normal);

That makes it possible to access the normal vector in our fragment shader. However, we still need the view direction for each fragment. The view direction depends on the vertex position and the position of the viewer. We will calculate it for each vertex in the vertex shader by adding these lines to our shader:

 - in v2f struct add: float3 viewDir : TEXCOORD1;
 - in vertex shader: float4 worldVertex = mul(unity_ObjectToWorld, v.vertex);
 - in vertex shader: o.viewDir = normalize(UnityWorldSpaceViewDir(worldVertex.xyz));
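If you wonder what UnityWorldSpaceViewDir does under the hood: it subtracts the world-space position from the camera position. Here is a minimal plain-Python sketch of that calculation (the function name and the sample positions are mine, purely for illustration):

```python
import math

def view_dir(camera_pos, world_vertex):
    """World-space direction from the vertex towards the camera, normalized;
    roughly what normalize(UnityWorldSpaceViewDir(worldPos)) gives us."""
    v = [c - w for c, w in zip(camera_pos, world_vertex)]
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

# A camera 10 units behind a vertex at the origin looks along -z:
print(view_dir([0.0, 0.0, -10.0], [0.0, 0.0, 0.0]))  # [0.0, 0.0, -1.0]
```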

Let's explain a bit. First, the vertex position is transformed from object space to world space. Then we use the world position of the vertex to calculate the view direction for that vertex. Now we are ready to move to the fragment shader. This is the body of my fragment shader, so copy and paste it:

fixed4 frag(v2f i) : SV_Target
{
   float VNdot = saturate(dot(i.viewDir, i.worldNormal));
   fixed4 col = fixed4(1, VNdot, VNdot, 1);                   
   return col;
}

Figure 2. Dot product of view direction and plane’s normal.

In Fig. 2 you can see the shader in action. How does it work? We start by computing the dot product of the view direction and the plane's normal. The dot product of two vectors yields a scalar which tells us about the angle between them. Moreover, when both vectors are normalized (length equal to 1), the dot product equals the cosine of the angle between them; this is why we normalized viewDir in our vertex shader. From trigonometry we know that the cosine varies between -1 and 1: cos(0 deg) = 1, cos(90 deg) = 0 and cos(180 deg) = -1.
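You can verify the cosine relationship with a few lines of plain Python (a throwaway sketch, nothing Unity-specific):

```python
import math

def dot_normalized(a, b):
    """Dot product of two vectors after normalizing them to unit length,
    i.e. the cosine of the angle between them."""
    la = math.sqrt(sum(x * x for x in a))
    lb = math.sqrt(sum(x * x for x in b))
    return sum(x * y for x, y in zip(a, b)) / (la * lb)

print(dot_normalized([0, 0, 1], [0, 0, 1]))   # 1.0,  cos(0 deg):   parallel
print(dot_normalized([0, 0, 1], [1, 0, 0]))   # 0.0,  cos(90 deg):  perpendicular
print(dot_normalized([0, 0, 1], [0, 0, -1]))  # -1.0, cos(180 deg): opposite
```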

This can be translated as follows:

  • the dot product of two parallel vectors pointing in the same direction is 1; check Fig. 2 for the white planes located in the middle of the image, where the view direction vector and the plane normal are pretty much parallel, pointing right at you,
  • the dot product of perpendicular vectors is 0; in Fig. 2 these are the red planes,
  • the dot product of two parallel vectors pointing in opposite directions is -1. These are the back-side planes of the model. As negative values make no sense for a colour definition, I am using the saturate function, which clamps the value of the dot product between 0 and 1. We will play with these planes later on and explore the culling options in shaders as well.
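For reference, HLSL's saturate is just a clamp to the [0, 1] range; in Python terms (a hypothetical helper, only to show the behaviour):

```python
def saturate(x):
    """HLSL saturate: clamp a value to the [0, 1] range."""
    return min(max(x, 0.0), 1.0)

print(saturate(-1.0))  # 0.0: back-facing planes all collapse to zero
print(saturate(0.42))  # 0.42: front-facing values pass through unchanged
print(saturate(1.0))   # 1.0
```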

Figure 3. Shader with transparency.

Finally, to get the red and white fragment colours, the green and blue channels are set according to the dot product value.

That is already a nice effect, but let's make it transparent. To render a transparent material correctly, make the following changes to your shader code:

 - in Tags change "RenderType" to "Transparent"
 - in Subshader add Blend SrcAlpha OneMinusSrcAlpha

And then add this line to your shader code:

 - in fragment shader add: col.a = 1 - VNdot;

In this case the front planes will be fully transparent and the perpendicular planes opaque. The effect is presented in Fig. 3. It is hard to see the transparency against a solid-colour background, but believe me, it is there. This already looks pretty cool, yet there is one small thing I would like to add. At the moment we are not rendering the back side of the model. This is because in Unity the default culling setting is Back; as a result, planes facing away from us are not rendered. If you want to learn more about it, visit the Unity3D documentation page on ShaderLab: Culling & Depth Testing. We have three options for culling: Back, Front and Off. Now we will switch to front culling to render only the back side of the model.
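To see what Blend SrcAlpha OneMinusSrcAlpha actually computes per channel, here is the blend equation in plain Python (the helper name and sample colours are mine; the maths is the standard alpha-blend formula):

```python
def alpha_blend(src_rgb, src_a, dst_rgb):
    """Blend SrcAlpha OneMinusSrcAlpha: out = src * a + dst * (1 - a)."""
    return [s * src_a + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb)]

# With col.a = 1 - VNdot, a plane facing the viewer (VNdot = 1) has alpha 0,
# so the background colour shows through untouched...
print(alpha_blend([1.0, 1.0, 1.0], 0.0, [0.2, 0.2, 0.2]))  # [0.2, 0.2, 0.2]
# ...while a perpendicular plane (VNdot = 0) has alpha 1 and stays solid red.
print(alpha_blend([1.0, 0.0, 0.0], 1.0, [0.2, 0.2, 0.2]))  # [1.0, 0.0, 0.0]
```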

Figure 4. Inner planes of model back side.

So add to your shader code:

 - in Subshader Pass add Cull Front

As soon as you do that, the whole model turns red, because we saturate (clamp) our dot product. So instead of saturate, we will use the abs function (absolute value):

 - in fragment shader change VNdot to: float VNdot = abs(dot(i.viewDir, i.worldNormal));
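The difference between the two functions on a back face, in Python terms (again a throwaway sketch):

```python
def saturate(x):
    return min(max(x, 0.0), 1.0)

# With Cull Front we now render back faces, whose dot product with the
# view direction is negative.
vn_back = -0.7
print(saturate(vn_back))  # 0.0 -> saturate turns every back face solid red
print(abs(vn_back))       # 0.7 -> abs gives back faces the same gradient as front ones
```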

The effect is shown in Fig. 4. This is the back side of the model. Why we see more planes (triangles) here is a mystery to me; I need to check my model. However, to render both sides on the screen, we need to create two Passes in our shader, one for the back side and one for the front side. To do so, duplicate the whole Pass body, then set Cull Front in the first Pass and Cull Back in the second. We will also make a small change so the model looks nicer. In the first Pass (Cull Front) change the value of the alpha channel:

 - in fragment shader change col.a to: col.a = VNdot * 0.1;

Figure 5. Rendered model with two passes for back and front culling.

Now the planes aligned with the view direction are more opaque than the perpendicular ones. Additionally, the alpha channel is strongly attenuated, just to give a slight visual hint that something is there. The effect of this last change is presented in Fig. 5.

And that is pretty much it for this shader. To finalise, we will make a few improvements, like moving some hard-coded values to properties, so they will be easier to play with in the Unity inspector. In the Cull Front Pass change:

 - in fragment shader change col.a to: col.a = VNdot * _BackIntensity;

_BackIntensity is now a property of the shader and can be accessed from the Unity inspector. In the second pass we will make more changes, as we will separate the main colour and the rim colour. To make life easier, below is the final code of the fragment shader in the second pass (Cull Back):

fixed4 frag(v2f i) : SV_Target
{
   float VNdot = abs(dot(i.viewDir, i.worldNormal));
   float rim = pow(1 - VNdot, _RimIntensity);

   fixed4 col = _MainColor + _RimColor * rim;
   col.a = rim + _FrontIntensity;
   return col;
}

Now the final fragment colour is a mix of _MainColor and the rim light colour. Another modification makes the rim value depend on the dot product through a power function instead of linearly. This mostly affects the colour of planes oriented somewhere between 0 and 90 degrees. I will definitely write a blog entry about different types of mathematical functions and the kinds of effects they can create. For now, I am using the power function to make the rim light affect only the most perpendicular planes.
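To get a feel for what the exponent does, compare the linear falloff (1 - VNdot) with the pow(1 - VNdot, _RimIntensity) falloff for a few dot product values. This is plain Python; the exponent value 4 is just an example, tune _RimIntensity in the inspector:

```python
# Linear vs power rim falloff; rim_intensity plays the role of _RimIntensity.
rim_intensity = 4.0  # hypothetical value, pick your own

for vndot in (0.0, 0.5, 0.9, 1.0):
    linear = 1.0 - vndot
    rim = (1.0 - vndot) ** rim_intensity
    # The power curve stays near zero except at grazing angles (VNdot near 0).
    print(f"VNdot={vndot:.1f}  linear={linear:.2f}  pow={rim:.4f}")
```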

Thanks for reading. Go to GitHub to get the final version of the shader if you want, and watch the video below to see what we have done today.

I decided to start this journey by visiting the normal vector. Just to be sure we are all on the same page, a few words about it: in 3D space, a vector which is orthogonal (perpendicular) to a plane is called the normal of that plane. This means that the normal vector defines the plane's orientation in 3D space. There is a massive number of online resources on this subject, so if you want to learn more, just google it. For now, let's move on to coding.

I will start with my base shader. It is just a very simple vertex and fragment shader. You can download it from my GitHub (Base Shader), or in Unity3D (I have done that for Unity3D 2017.2.1p4) hit Create/Shader/Unlit Shader. Then open this shader and (i) remove the lines related to fog and (ii) remove the shader level-of-detail line (LOD 100). Of course this is a quite boring shader: it will only render a flat white object. However, it is a perfect starting point, and we will build from it.

First we will explore normals in object space and present them with RGB colours. The good news is that each vertex already carries information about its normal vector, so we only need to access it. To do so we will make the following changes in our shader code:

 - in AppData struct add: float3 normal : NORMAL;
 - in v2f struct add: float3 normal : NORMAL;
 - in vertex function add: o.normal = v.normal;

Now we can access the normal data in our fragment shader. To show the normal values, we will draw the planes in RGB colours according to their orientation in space, namely planes oriented along the x-axis in red, the y-axis in green and the z-axis in blue. To make this happen, the frag function needs to change:

 - in fragment shader comment out: the texture sampling line of code
 - in fragment shader add: fixed4 col = fixed4(abs(i.normal.r), abs(i.normal.g), abs(i.normal.b), 1);
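In plain Python the colour mapping looks like this (the helper name is mine; abs is what lets the opposite half-axes share a colour channel):

```python
def normal_to_rgb(normal):
    """Map a unit normal to an opaque RGBA colour; abs() folds the
    negative half-axes onto the same colour channel."""
    r, g, b = (abs(float(c)) for c in normal)
    return (r, g, b, 1.0)

print(normal_to_rgb([1, 0, 0]))   # (1.0, 0.0, 0.0, 1.0): +x face, pure red
print(normal_to_rgb([-1, 0, 0]))  # also red, thanks to abs()
print(normal_to_rgb([0, 0, 1]))   # +z face, pure blue
```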

Figure 1. Normals in Object Space

In Fig. 1 we can see the effect of this shader on a box model. Because the normal values are taken from object space, the colours of the box faces will not change when the box is rotated. Of course, we can also see what the normals of this model look like in world space. To do so we will use a predefined Unity3D function:

 - in v2f struct add: float3 worldNormal : TEXCOORD1;  // field for our new normal data
 - in vertex shader add: o.worldNormal = UnityObjectToWorldNormal(v.normal); // unity helper function to compute world space normal
 - in fragment shader replace normal by worldNormal:
fixed4 col = fixed4(abs(i.worldNormal.r), abs(i.worldNormal.g), abs(i.worldNormal.b), 1);

Figure 2. Normals in World Space

If your shader is attached to an object, you should see the difference immediately. If not, try rotating the object to make sure its object space is not perfectly aligned with world space. The shader's impact on my box model is presented in Fig. 2. Now the colours are related to world space, as expected. Since the object is not perfectly aligned with the world space coordinate system, the RGB channels of the face colours are mixed.

Can we do something fun with this? Of course we can :). Let's use the normal values as a mask. To make it even more fun, I will use one of my B&W photos as a texture (Fig. 3). So change the fragment shader as follows:

 - in fragment shader uncomment: texture sampling line fixed4 col = tex2D(_MainTex, i.uv);
 - in fragment shader calculate mask value: float mask = step(0.5, abs(i.normal.b)); // you can use worldNormal as well. 0.5 is threshold value you can put it to shader properties for easy testing
 - in fragment shader multiply colour by mask: col *= mask;
 - in fragment shader keep alpha channel equal 1: col.a = 1.0;
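HLSL's step is a simple threshold function; here is how the mask behaves in Python terms (a hypothetical sketch of the same logic, with 0.5 as the threshold):

```python
def step(edge, x):
    """HLSL step(edge, x): 0 when x < edge, 1 otherwise."""
    return 1.0 if x >= edge else 0.0

# Faces whose normal's z component (in absolute value) is below the 0.5
# threshold lose their texture colour and are drawn black.
print(step(0.5, abs(0.9)))   # 1.0 -> face keeps the texture
print(step(0.5, abs(-0.9)))  # 1.0 -> back z faces too, thanks to abs()
print(step(0.5, abs(0.1)))   # 0.0 -> face drawn black
```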

Figure 3. Object with Texture

Fig. 3 shows the box model with the effect not applied. The video below shows the result of the shader for both world and object space normals. You can see that the faces of the box which are not aligned along the z-axis are drawn black instead of taking their values from the texture. And that is all for now. Thanks for reading.

Recently I invested a bit of my time in learning how to write shaders for Unity3D. I read books, watched tutorials, took an online course, etc. There are truly amazing teachers out there. But writing shaders is not only about knowledge; it is actually mostly about practice.

So I decided to start a small project where I will experiment with shaders. A sort of travel blog, but instead of exploring the real world, we will wander through the space of 3D computer graphics shaders. Hopefully this will be useful for you. Just to let you know, I am still learning how to program shaders, so please feel free to correct me if you find any mistakes.

I have started a GitHub repository where I will keep the shaders built for this project. In many cases I will not show the full shader code here; instead, I'd like to focus on the parts of the shaders that matter for a given blog post. But I will make everything as easy as possible so you can replicate it.

 

GitHub Shaders Library

English version and some details below the image, keep scrolling!

Another look at my modest collection of tweets from June-August 2017.

The visualization was prepared using graph analysis. Each point represents a Twitter user. The network of connections is generated from hashtags: a user who published a tweet containing a given hashtag is connected to every other user who also published with that hashtag.

As a result, users publishing with many hashtags are connected to many other users. In the illustration above, these users sit in the central part of the graph, where it is very hard to distinguish isolated clusters of points.

At the other end of this spectrum are users who use only a limited set of hashtags in their tweets. In this case, isolated groups of users emerge who communicate only within a given hashtag. In the illustration, the #wtylewizji and #woronicza17 clusters are examples of such groups.

 

Tools used:

  • Python, Numpy, Pandas,
  • Gephi, www.gephi.org

 

English version

The second look at my Twitter data. This time I used graph network analysis to see the connections between Twitter users. Each node in the graph represents a Twitter user; an edge between two users is created if they both publish something using the same hashtag.

Someone who uses many hashtags on a regular basis will be connected to many other Twitter users. This is the case for the nodes (users) in the middle of the presented graph, which are packed together so tightly that it is hard to isolate them from the others.

On the other side, there are users who usually publish under one hashtag or a small number of them. These users form clusters of a sort, since communication happens only between them. See the #woronicza17 and #wtylewizji groups as an example.
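The edge-building rule described above is easy to sketch in a few lines of Python (the user names and hashtag data here are made up for illustration; the real analysis was done with Python/Pandas and Gephi):

```python
from itertools import combinations

# Hypothetical data: which users tweeted with which hashtag.
hashtag_users = {
    "#woronicza17": {"anna", "piotr"},
    "#wtylewizji": {"kasia", "tomek"},
    "#polityka": {"anna", "kasia", "marek"},
}

# Every pair of users sharing a hashtag gets an (undirected) edge.
edges = set()
for users in hashtag_users.values():
    for u, v in combinations(sorted(users), 2):
        edges.add((u, v))

print(sorted(edges))
```

Users active under many hashtags (like "anna" here) end up with edges into several groups, while single-hashtag users stay in their own isolated cluster.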

 

Tools used to get the results:

  • Python, Numpy, Pandas,
  • Gephi, www.gephi.org

English version and some details below the image, keep scrolling!

A list of hashtags contained in tweets collected during the summer holidays (June - August) of 2017. The tweets were filtered by the word 'polityka' ('politics').

Clustering Twitter data is not easy; I know that now :). A single tweet is mostly noise, so in the end you get a sparse corpus matrix with density far below 1%. Simply put: the whole corpus contains around 50,000 words (dimensions), but a single tweet holds only about 10 words (and probably only one or two of them carry meaning). Lots of noise and sparsity. In the end, a clustering method like k-means puts all the tweets into one big cluster.
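The back-of-the-envelope arithmetic behind that density claim (50,000 and 10 are the rough figures from above):

```python
# Rough density of the tweet/term matrix described above.
vocabulary_size = 50_000   # distinct words (dimensions) in the whole corpus
words_per_tweet = 10       # non-zero entries in a single tweet row

density = words_per_tweet / vocabulary_size
print(f"{density:.2%}")  # 0.02%, far below 1%: an extremely sparse matrix
```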

So I decided to change my approach and try to get something different out of the collected data.

My first attempt was to make a word cloud of the hashtags.

To do this I used a nice Python module:

  • website: http://amueller.github.io/word_cloud/
  • blog: http://peekaboo-vision.blogspot.co.uk/2012/11/a-wordcloud-in-python.html

This image was made from Twitter data collected during June, July and August 2017. The data was filtered by the Polish word 'polityka' ('politics').

In short: this post describes the data collection process and gives a bit of insight into my text processing algorithm.

 

A bit of time has elapsed since my last post about Twitter data mining. The reason is quite simple: recently I spent most of my free time working on a prototype of my new game. I am quite excited about it, and hopefully I will announce more in the near future. Anyway, back to Twitter data mining.

Recently I have been collecting Polish tweets which contain the word 'politics' or some declension of it, namely: 'polityka', 'polityce', 'polityk', 'politycy', 'polityke', 'polityczna', 'politologia'.

To access the Twitter API and get tweets to my computer I use tweepy Python library (http://www.tweepy.org/).

The collected tweets are then processed by my Tweet Processor into a pandas data frame (http://pandas.pydata.org/).

I am improving the Tweet Processor gradually, as I learn something new about my data set. At the moment the Processor is quite simple and can perform only really basic things, mainly:

  • soft cleaning of the tweet text – for example removing https links, HTML tags, user tags, etc.,
  • political group name extraction – this maps the declensions of a political group's name onto one name,
  • stop word removal – this is still in progress and my list of stop words is probably too small right now. I will post someday about my Polish stop words,
  • stemming – this is the weakest part of my Tweet Processor so far. At the moment I map just a few words with their declensions. The Polish language is not easy for this task and there is plenty of room for improvement.
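As an illustration of the soft-clean step only, here is a minimal Python sketch (the regular expressions, sample tweet and function name are mine; the real Tweet Processor does more than this):

```python
import re

def soft_clean(text):
    """Minimal sketch of a 'soft clean' pass: strip links, HTML tags
    and @user mentions, then collapse the leftover whitespace."""
    text = re.sub(r"https?://\S+", "", text)  # http/https links
    text = re.sub(r"<[^>]+>", "", text)       # HTML tags
    text = re.sub(r"@\w+", "", text)          # user tags
    return " ".join(text.split())

print(soft_clean("Polityka <b>dziś</b> @user https://t.co/abc wieczorem"))
# -> "Polityka dziś wieczorem"
```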

As most of this is just at the beginning of its development, the output data is quite noisy. However, instead of trying to make it perfect in the first go, I decided to work in a loop:

This will allow me to focus on the most important issues, and I will learn a massive amount about text analysis.

Next time I will describe more about my clustering approach.

Thank you for your time.

English version is a bit lower, keep scrolling!

The first and very basic results of an analysis of tweets about politics. The analysis covers tweets collected over one month. I will write more about the analysis method and further plans in the upcoming posts.

 

English version!

My first and really basic results of a Twitter data mining project. The data set includes one month's collection of tweets. The tweets were filtered by the Polish word for 'politics' and its declensions. In the next couple of posts I will talk more about the technique used to get this data and the future goals of this project.
