
Linköping University | Department of Computer and Information Science
Bachelor's Thesis | Computer Science
Spring Term 2016 | LIU-IDA/LITH-EX-G--16/063--SE

Developing a process for automating

UV mapping and polygon reduction

Julius Willén

Tutor, Ivan Ukhov


Copyright

The publishers will keep this document online on the Internet – or its possible replacement – for a period of 25 years starting from the date of publication barring exceptional circumstances.

The online availability of the document implies permanent permission for anyone to read, to download, or to print out single copies for his/her own use and to use it unchanged for non-commercial research and educational purposes. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional upon the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility.

According to intellectual property law the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement.

For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its www home page: http://www.ep.liu.se/.


Linköping University

Abstract

An exploratory research project was conducted at a company focusing on CAD and its own real-time 3D model viewer. The company needed to convert CAD models for use in Unreal Engine with good visual quality. Before this project, another was conducted to perform the basic conversion of CAD models to the FBX file format, which Unreal uses. As an extension of that previous project, functionality needed to be added for manipulating the models to improve their quality and performance. The tasks were carried out with good results.



Acknowledgements

I would like to thank XperDi and all the people working there, more specifically Mehdi Tarkian, Leon Poot and Manokar Munisamy, along with the students who performed their master's theses during the spring of 2016.


Contents

1 Introduction
  1.1 Background
    1.1.1 Research questions
  1.2 Methodology
    1.2.1 Analysis
    1.2.2 Program design
    1.2.3 Coding
    1.2.4 Testing
  1.3 The file formats
    1.3.1 STEP
    1.3.2 Wavefront's .obj
    1.3.3 FBX

2 UV mapping
  2.1 Prestudy
  2.2 Method
    2.2.1 Blender
  2.3 Implementation
  2.4 Results

3 Polygon reduction
  3.1 Prestudy
  3.2 Theory
    3.2.1 Calculating the cost
    3.2.2 Collapsing the vertices
  3.3 Implementation
  3.4 Results
    3.4.1 Testing method
    3.4.2 The Stanford bunny
    3.4.3 Visual differences
    3.4.4 Data collection

4 Discussion
  4.1 Results
    4.1.1 UV mapping
    4.1.2 Polygon reduction
  4.2 Method
    4.2.1 Replicability, Reliability and Validity
  4.3 The work in a wider context
  4.4 Conclusion
    4.4.1 Answer to the first research question
    4.4.2 Answer to the second research question
    4.4.3 Answer to the third research question
    4.4.4 Future work

List of Figures


Glossary

Blender: An open source 3D editing software and game engine.

C++: A cross-platform programming language developed by Bjarne Stroustrup in 1979.

CAD: A type of computer program that allows for the design of models and their documentation.

face: A number of connected vertices; in this thesis a face is equal to a triangle.

mesh: The outer bounds, or surface, of a 3D model, consisting of vertices, edges and triangles.

Microsoft Visual Studio 2013: An Integrated Development Environment for aiding software developers in their programming.

Notepad++: An advanced text editor used for manipulating and developing code in multiple languages.

polygon: The building block of 3D modeling.

Python: A dynamic programming language developed by Guido van Rossum.

Roadkill UV tool: An open source 3D editing software.

Unreal Engine 4: A version of the game engine called Unreal Engine, developed by Epic Games.

vertex: A geometrical point consisting of the coordinates x, y and z.

vertices: The plural form of vertex.


Chapter 1

Introduction

1.1

Background

XperDi is a start-up company that is in the development phase of two software tools: a CAD configurator and a sales configurator. The CAD configurator is a tool, or plug-in, for already established CAD programs such as Creo or Solidworks, which helps users create extensive models of different types and configurations quickly and efficiently. These models can then be viewed in the sales configurator, which is developed with Unreal Engine 4, in a more photorealistic environment than what is commonly used in CAD programs, with different textures and lighting. Since the standard CAD file types are not supported in Unreal, a conversion is required for the sales configurator to actually be able to show the models that the user has created. This conversion was developed in another bachelor's thesis in the fall of 2015, by Rasmus Siljedahl at Linköping University. The product of his thesis work is a software called StepImporter, a proof of concept showing that the conversion can be automated using different open source tools.

Figure 1.1: StepImporter


StepImporter works as intended, and the models from the CAD configurator can, after being converted, be viewed in the sales configurator. However, the models are missing one important component: a UV map, which is essentially a two-dimensional layout onto which the texture of the model in question is applied. Without a UV map, textures will not display properly. UV maps are generally created manually in 3D editing software.

Since the users of the CAD configurator often create models which include up to hundreds and sometimes thousands of parts, a way to optimize the models for viewing in a real-time environment must be investigated, since it is not currently feasible for a game engine to display models of that size with all the different textures and lighting.

1.1.1

Research questions

In this thesis, I wanted to answer the following questions:

1. How can one automate the process of creating UV maps for an arbitrary number of models?

2. Is there a way to efficiently optimize the models with regard to performance during run-time?

3. Would it be best to use proprietary, open source or self-written software to answer the previous questions?

1.2

Methodology

The methodology was inspired by the waterfall method [9], although some of its steps are unnecessary given the scope of this thesis. I used four general steps from the waterfall method: analysis, program design, coding and testing.

1.2.1

Analysis

Since the project revolved around two general areas of research, the analysis in this thesis consists of two different prestudies, which are described in chapters 2 and 3. In the prestudies, it was essential to have clear requirements of what to look for. There were no revisions of the analysis.

1.2.2

Program design

Given that StepImporter is an already established proof of concept, it would be difficult not to inherit most of the available functions in the software. The task at hand was less about designing functions available to the users than it was about implementing functionality that works "under the hood". It is also important to know that StepImporter was to be used as a testing platform for the implementations I produced during the project, but the company ultimately did not want three different stand-alone tools in its chain, so the functionality of this project was to be integrated as an export option in the CAD configurator. This means that the resulting software of this thesis was to be built into a class library for use in the CAD configurator, and the user interface of the functionality was, in the end, designed by a developer at the company.

1.2.3

Coding

Most of the coding was to be done with either Microsoft Visual Studio 2013 for C++, or Notepad++ for Python. To arrive at a script that would work as a solution, I wanted to find a way to complete the tasks manually before trying to automate the process.

1.2.4

Testing

To test the process, I set up a couple of requirements on the performance and quality of the models that went through the pipeline:

1. The time it takes for a model to be processed should be reasonable

2. The overall process of applying the UV maps and optimizing the models should be automatic in its entirety

3. The visual quality of the models should be higher than with the original StepImporter software

When I mention time, one needs to understand that it must be proportional to the size of the models in question. A large production company that focuses on large vehicles would probably consider a few hours of rendering a complete model acceptable, given that they also possess high-performing workstations on which, presumably, a similar solution would run faster than on the computer I tested the solutions on.

The second requirement is implied by the research question, but should nevertheless be part of the testing. If something hindered the solution from being entirely automatic, I would work to remove that obstacle in the next iteration. While the quality of a model is rather subjective, it is easy to compare the visual quality of two models where one has a UV map and the other does not, so the third requirement would be the easiest of the three to pass, given that an answer to the related research question was found. In contrast to the UV mapping scripts, this requirement proved to be hard to pass for the optimization scripts.

The user in the tests was almost exclusively myself, but it was implied that the company would look to see if the requirements were met, given that they followed my process throughout the project.


1.3

The file formats

While it is not essential for the thesis, a short description of each of the three file formats for the 3D models will clear up a couple of things.

1.3.1

STEP

STEP (.stp) is an ISO 10303 standard format for representing product manufacturing information. The format is widely used in CAD programs, both for importing and exporting models. The .stp format used in StepImporter is of version AP203 and follows the ISO 10303-21 standard. The file includes a header, which contains information about the creator of the model and the directory from which it was exported, and a data section, which contains all the geometrical data and information about the related parts, or products. It is important to know that in this project, no manipulation is done to any .stp files. The conversion from .stp to .obj comes as is from the StepImporter software.

ISO-10303-21;
HEADER;
FILE_DESCRIPTION(('CATIA V5 STEP Exchange'),'2;1');

FILE_NAME('E:\\filer\\KUBCYL1.stp','2016-02-04T15:26:12+00:00',('none'),('none'),
'CATIA Version 5 Release 20 GA (IN-10)','CATIA V5 STEP AP203','none');

FILE_SCHEMA(('CONFIG_CONTROL_DESIGN'));

ENDSEC;
/* file written by CATIA V5R20 */
DATA;
#5=PRODUCT('KUBCYL','','',(#2)) ;
#56=PRODUCT('KUB','','',(#2)) ;
#231=PRODUCT('Cylinder','','',(#2)) ;

Listing 1: STEP file snippet

1.3.2

Wavefront’s .obj

The .obj format is developed by Wavefront Technologies and gives users a simple way of describing and manipulating 3D objects manually. A very simple mesh is represented only by vertices and faces, but the file format allows for many options, such as tessellation, different levels of detail and ray tracing, among others [8]. The manipulation in chapters 2 and 3 is done on the models while they are in this format.

v 10 50 30
v 11 33 22
v 25 37 39

o Mesh_name

f 1 2 3
f 3 1 2
f 2 3 1

Listing 2: .obj file example

1.3.3

FBX

The FBX file format (short for Filmbox) is developed by Autodesk and is currently used in many 3D editing tools, such as Blender, 3ds Max and Maya. When the models in this project are in the .fbx format, they are in their final state, ready to be imported into the sales configurator.

NodeType: SomeProperty0a, SomeProperty0b, ... , {

    NestedNodeType1 : SomeProperty1a, ...
    NestedNodeType2 : SomeProperty2a, ... , {
        ... Sub-scope
    }

    ...
}

Listing 3: .fbx snippet from [10]


Chapter 2

UV mapping

2.1

Prestudy

To investigate the first question in section 1.1.1, I made a brief prestudy to find out how to apply a UV map to a model manually, and which software products were available to do so. The requirements I had for the tools were:

1. The software should be open source

2. There should exist a function within the software that creates a UV map automatically

3. The license of the software should be as free as possible with the BSD license [11] being the best one. Proprietary programs are not interesting at this stage.

4. There should exist an active community around the software

Due to the first requirement, there were not many options at all. Based on it, the candidates were Blender and Roadkill UV tool. However, since I could not find the function mentioned in the second requirement in Roadkill, I chose Blender. Even if there were a "one-click" function for UV mapping in Roadkill, its community is close to non-existent, whereas Blender's community is relatively big and active. While Blender's license is not the BSD license, it is perfectly sufficient (GNU GPL [2, 3]).

2.2

Method

Due to the community of Blender being relatively big and active, it proved easy to find information that was somewhat related to the task. For a more general view of the functionalities, I used a book written by Tony Mullen [7]. The general method of answering the related research question in this chapter was to first find a way to get Blender's own Python interpreter to do what was necessary to complete the task for one model. When this was achieved, it was essential to find a way to use Blender externally, without the need of actually executing a session of the program itself.

2.2.1

Blender

To control Blender with a stand-alone script, it can be built into a Python module. It is important to know that, according to the Blender wiki [5], building Blender as a Python module is only experimental as of now, and is not officially supported.

Figure 2.1: UV mapping menu in Blender

An important feature of Blender is that it works hand in hand with its own Python interpreter, and all the buttons and functions inside the Blender environment have related tooltips on mouse-over, as seen in figure 2.1, which show which function in the Python module is called when, for example, Smart UV Project is pressed.

To build Blender as a Python module, the steps in the Blender wiki [5] were used. In this thesis, the version of Blender was 2.76.


2.3

Implementation

With Blender as a Python module, one can access all the functionalities of the software through simple or complex scripts. The following script is the result of testing Blender’s own Python interpreter together with a model containing a few different parts. The goal of the script is to apply a UV map to all the unique meshes in a single .obj file.

import bpy

def clean_up():
    bpy.ops.object.mode_set(mode='OBJECT')
    bpy.ops.object.select_by_type(type='MESH')
    bpy.ops.object.delete(use_global=False)

    for item in bpy.data.meshes:
        bpy.data.meshes.remove(item)

def uv_map():
    for o in bpy.data.objects:
        if o.type == "MESH":
            bpy.context.scene.objects.active = o
            bpy.ops.mesh.uv_texture_add()
            bpy.ops.object.editmode_toggle()
            bpy.ops.uv.smart_project()

def main():
    # import a mesh file to apply the uv map on
    bpy.ops.import_scene.obj(filepath=path_to_obj)

    # since a new blender session always starts with a cube
    # we have to clean it up by removing the cube before we do
    # anything else
    clean_up()

    # apply the uv map
    uv_map()

    # export the mesh, in this case .fbx is used.
    bpy.ops.export_scene.fbx(filepath=path_to_save)

Listing 4: UV mapping script


The key in listing 4 is bpy.ops.uv.smart_project(), which is a function in Blender that automatically creates and applies a UV map for the target mesh. The script is executed once with exactly one model as input. If, for example, one were to run this script to apply UV maps to a whole car, the script would have to be executed once for each door, tire, seat and so forth.
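Batch processing therefore reduces to a small driver loop around the script. The sketch below is a hypothetical driver, not part of the thesis software: the process_one callback stands in for however the Blender-based script is invoked on one file (for example via a subprocess, or by calling into Blender built as a Python module).

```python
from pathlib import Path

def collect_obj_files(directory):
    """Return the .obj files in `directory`, sorted for a deterministic order."""
    return sorted(Path(directory).glob("*.obj"))

def uv_map_all(directory, process_one):
    """Invoke `process_one` (e.g. the UV-mapping script) once per model."""
    processed = []
    for obj_path in collect_obj_files(directory):
        process_one(obj_path)
        processed.append(obj_path.name)
    return processed
```

This also answers, in miniature, how the per-model script scales to an arbitrary number of models: the iteration lives outside the script itself.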

2.4

Results

The results of running the script on the test models were as desired.

Figure 2.2: 3D model without a UV map

Figure 2.3: 3D model with a UV map

The models in figures 2.2 and 2.3 had the same textures applied to them, and the difference between them is, as expected, extensive.


Chapter 3

Polygon reduction

3.1

Prestudy

The prestudy for this chapter consisted of researching different ways of reducing the polygons in a 3D mesh. I found two studies [4, 6] which were interesting in relation to this topic, but there are countless other studies that touch on the same subject. In the end, I chose to couple the theory in this chapter with the article written by S. Melax, mainly because the approach was easy and quick to understand. The study by H. Hoppe et al. is more complex and takes more time to process, and while it would probably give better results, the execution time of the resulting algorithm would most certainly be longer than that of the one found in [6].

3.2

Theory

Since an .obj file mostly consists of vertices and faces, reading the file and extracting the relevant information into a suitable data structure is an easy task. The faces consist of three vertices v1, v2 and v3, which are three points in R3, with coordinates x, y and z.
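To make concrete how little work that extraction takes, here is a minimal illustrative parser; it is a sketch, not the thesis code (which appears in section 3.3). It reads only the v and f records and converts the 1-based .obj face indices to 0-based ones.

```python
def parse_obj(lines):
    """Extract vertices and triangular faces from .obj-formatted lines."""
    vertices, faces = [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            # a vertex: three coordinates x, y, z
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":
            # a face: three vertex references; .obj indices are 1-based
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces
```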

The algorithm used to perform the optimization was inspired by the one used in [6], and it is important to know that even though the algorithm that Melax developed uses triangles, faces work as well, since the faces in this context are the same as triangles.

The general idea is to first determine the cost of collapsing each vertex in the mesh onto its respective neighbors, and to set the prime candidate for that specific vertex to be the one with the lowest cost. When the cost has been calculated for all vertices in the model, the vertex with the lowest cost is collapsed onto its candidate, and the costs of the remaining neighboring vertices are recalculated. This process is repeated until all vertices with a cost below a certain value, the tolerance, have been removed from the mesh.
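This outer loop can be sketched independently of the cost and collapse details. In the following hypothetical sketch (not the thesis code), costs maps vertex ids to their current collapse costs, and collapse_one is expected to remove the vertex from costs and update the costs of its neighbors:

```python
def reduce_until(costs, tolerance, collapse_one):
    """Collapse the cheapest vertex until none costs less than `tolerance`."""
    removed = []
    while costs:
        vid = min(costs, key=costs.get)  # vertex with the lowest cost
        if costs[vid] >= tolerance:
            break                        # nothing cheap enough remains
        collapse_one(vid, costs)         # must delete vid and update neighbors
        removed.append(vid)
    return removed
```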


3.2.1

Calculating the cost

There are many ways to calculate the cost, or the weight, of each vertex in a mesh. The cost represents, in a sense, how much of the topology will be lost when a certain vertex u is collapsed onto another vertex v. The cost function used in this thesis is based on an equation that S. Melax used in [6]. The general idea is to compare the two vertices in question and look at their respective curvature and distance from each other. This prevents the mesh from losing too much of its topology.

The equation for the cost function, as written in [6], is:

    cost(u, v) = ||u − v|| × max_{f ∈ Fu} { min_{n ∈ Fw} { (1 − f.normal · n.normal) / 2 } }

where f ∈ Fu and n ∈ Fw. The resulting algorithm for calculating the costs of collapsing u onto each of its neighbors is as follows.

Algorithm 1: CollapseCost
Data: Vertex u
Result: Float cost

Fu ← u.faces;
Nu ← u.neighbors;
foreach n ∈ Nu do
    Fn ← n.faces;
    Fw ← Intersection(Fu, Fn);
    length ← (u − n).Norm;
    curvature ← 0.0;
    if size(Fw) == 0 then
        return highCost;
    else
        foreach fi ∈ Fu do
            minCurv ← 1.0;
            foreach fj ∈ Fw do
                dotProduct ← fi.normal · fj.normal;
                minCurv ← Min(minCurv, (1 − dotProduct)/2);
            end
            curvature ← Max(curvature, minCurv);
        end
    end
end
cost ← curvature · length;
return cost;

3.2.2

Collapsing the vertices

When all the costs for the vertices in the mesh are calculated, the next process is to collapse the vertex with the lowest cost onto its candidate. Collapsing a vertex


onto another is a rather delicate matter, and it is important that everything stays intact during the process. As stated by Melax in [6], there are basically three steps involved in collapsing a vertex u onto v. Firstly, the faces that contain the edge uv, typically two faces, have to be removed. Secondly, the references to u need to be changed to v; this means updating the faces that contain u and replacing u with v in the sets of neighbors where v is not present. Lastly, u is deleted from the data set, which also means removing it from the sets that include both u and v, typically the sets of neighbors of the lone unique vertices in the faces that contain both u and v.

Example of collapsing a vertex onto another

Figure 3.1: A K4 graph

The theory behind this process is best explained with an example. In the graph in figure 3.1 there are five vertices, s, t, u, v and w, along with four faces, A, B, C and D. Each vertex has a set N of neighboring vertices and a set F of faces that contain the vertex in question.

Deleting faces

To start the process, the faces A and C are removed. This means that A and C are removed from Fu and Fv, and also from the face sets of the remaining vertices that make up A and C, namely Fw and Ft. When this is done, no links to A and C remain.


Re-referencing vertices

In this case, there are two faces remaining in Fu: B and D. Since u is about to be removed, the links to u in these faces must be replaced with the collapse candidate v. Thus, after the re-reference update, B = (s, v, w) and D = (s, t, v). It is also important to include both B and D in Fv, otherwise the algorithm will not work properly. There is one last step to the re-referencing part: iterating through the neighbors of u and looking at their sets of neighbors N. In the sets that contain u but not v, u is replaced with v (in Nv itself, u is instead replaced by the remaining vertex s). There are two such vertices, s and v, where Ns = (t, u, w) and Nv = (t, u, w), so the updated sets are Ns = (t, v, w) and Nv = (t, s, w).

Deleting the vertex

The very last step of collapsing u onto v is to delete u. This means that u is removed from all remaining sets that contain it, more precisely the neighbor sets of u's neighbors that contain both u and v. So u is removed from Nw = Nt = (s, u, v), with the result Nw = Nt = (s, v). After this removal, it is safe to delete u from the main set of vertices.

Figure 3.2: The result

As seen in figure 3.2, the topology stays the same while the algorithm has removed a total of one vertex, two faces and three edges.
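The three steps can be replayed on the example graph with plain Python dictionaries and sets. This is an illustrative sketch of the bookkeeping only, not the thesis implementation (which appears in section 3.3); the names faces and neighbors mirror the F and N sets of the example.

```python
def collapse(u, v, faces, neighbors):
    """Collapse vertex u onto v: delete shared faces, re-reference, delete u."""
    # 1. delete the faces that contain the edge uv (A and C in the example)
    for f in [f for f, verts in faces.items() if u in verts and v in verts]:
        del faces[f]
    # 2. re-reference: u's remaining faces now use v instead
    for f, verts in faces.items():
        if u in verts:
            faces[f] = tuple(v if x == u else x for x in verts)
    # ... and u is replaced by v in every neighbor set
    for n in neighbors:
        if u in neighbors[n]:
            neighbors[n].discard(u)
            if n != v:
                neighbors[n].add(v)
    # v inherits u's remaining neighbors
    neighbors[v] |= neighbors[u] - {v}
    # 3. delete u itself
    del neighbors[u]

# the example graph: faces A, C contain the edge uv; B, D contain only u
faces = {"A": ("u", "v", "w"), "B": ("s", "u", "w"),
         "C": ("t", "u", "v"), "D": ("s", "t", "u")}
neighbors = {"s": {"t", "u", "w"}, "t": {"s", "u", "v"},
             "u": {"s", "t", "v", "w"}, "v": {"t", "u", "w"},
             "w": {"s", "u", "v"}}
collapse("u", "v", faces, neighbors)
```

After the call, only B = (s, v, w) and D = (s, t, v) remain, matching the sets derived in the text above.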


3.3

Implementation

The two scripts produced by this chapter are as follows:

def vertex_collapse_cost(v):
    global vertices
    if not vertices[v].neighbors:
        return

    for u in vertices[v].neighbors:
        cost = collapse_cost(v, u)
        if cost < vertices[v].cost:
            vertices[v].candidate = u
            vertices[v].cost = cost

def collapse_cost(u, v):
    global faces, vertices
    if u == v:
        return 1000000

    if v not in vertices:
        return 1000000

    u = vertices[u]
    v = vertices[v]
    uv_vector = (v - u)
    length = uv_vector.Norm()
    curvature = 0.0
    fuv = Intersection(u.faces, v.faces)

    if len(fuv) == 0:
        return 1000000

    for face in u.faces:
        min_curv = 1
        for f in fuv:
            dot = faces[face].normal * faces[f].normal
            min_curv = Min(min_curv, (1 - dot) / 2)
        curvature = Max(curvature, min_curv)

    cost = curvature * length
    return cost

Listing 5: Collapse cost script


def collapse_vertex(u_id, v_id):
    global vertices, faces
    if u_id == v_id or v_id not in vertices:
        return

    u_vertex = vertices[u_id]
    v_vertex = vertices[v_id]
    first_neighbors = u_vertex.neighbors
    both = [f for f in u_vertex.faces if f in v_vertex.faces]

    # Remove faces that are on edge uv
    for f_id in both:
        f_face = faces[f_id]
        w_id = f_face.get_unique(u_id, v_id)
        w_vertex = vertices[w_id]

        u_vertex.remove_face(f_id)
        v_vertex.remove_face(f_id)
        w_vertex.remove_face(f_id)

        w_vertex.remove_neighbor(u_id)
        u_vertex.remove_neighbor(w_id)

        del faces[f_id]

    # re-reference
    for f_id in u_vertex.faces:
        faces[f_id].replace_vertex(u_id, v_id)
        u_vertex.remove_face(f_id)
        v_vertex.add_face(f_id)

    for n_id in u_vertex.neighbors:
        n_vertex = vertices[n_id]
        n_vertex.replace_neighbor(u_id, v_id)
        v_vertex.replace_neighbor(u_id, n_id)

    for n_id in first_neighbors:
        vertices[n_id].remove_neighbor(u_id)
        calculate_vertex_cost(n_id)
        if vertices[n_id].candidate == u_id:
            vertices[n_id].reset_candidate()

    remove_from_neighbors(u_id)
    del vertices[u_id]

Listing 6: Vertex collapse script


3.4

Results

In this section I will go through how I tested the algorithm and show the results.

3.4.1

Testing method

To test the algorithm, it was run on a .obj file with 50 different tolerances between 0.10 and 5.00, with a 0.10 increment. The data collected included the tolerance, the run-time, the vertex removal in percent and the total number of vertices left after the polygon reduction. The test computer had a 4-core processor at 2.9 GHz and 8 gigabytes of RAM. To expose variation in the results, the test was repeated a total of five times.
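The sweep itself is straightforward to reproduce. The following is a hypothetical sketch of the measurement loop, not the thesis test code; the reduce callback stands in for running the reduction scripts at one tolerance and returning the number of vertices left.

```python
import time

def sweep(reduce, start=0.10, step=0.10, count=50):
    """Run `reduce(tolerance)` for each tolerance; record run-time and result."""
    rows = []
    for i in range(count):
        tolerance = round(start + step * i, 2)
        t0 = time.perf_counter()
        vertices_left = reduce(tolerance)
        elapsed = time.perf_counter() - t0
        rows.append((tolerance, elapsed, vertices_left))
    return rows
```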

3.4.2

The Stanford bunny

The model used for the tests was the Stanford bunny [1], which originally has 2503 vertices and 4968 faces, and the file itself is 202 kilobytes in size.


3.4.3

Visual differences

After running the tests, two results were chosen to show the reduction in polygons.

Figure 3.4: Reduced mesh with 2.5 tolerance

The mesh in figure 3.4 consists of 1081 vertices and 2134 faces, with a total vertex reduction of 59.8 %. The file size was reduced from 202 to 65 kilobytes. It is clear that the topology of the mesh is intact.

Figure 3.5: Reduced mesh with 5.0 tolerance

The mesh in figure 3.5 is the final version of the bunny, with 680 vertices and 1318 faces, its total number of vertices was reduced by 72.8 % and the file size was reduced to 44 kilobytes. The topology is still somewhat intact, apart from a small section that has been added between the bunny’s back and tail.


3.4.4

Data collection

Figure 3.6: Vertex reduction vs Tolerance

Based on the plot in figure 3.6, it would seem that as the tolerance increases, the vertex reduction approaches a number around 80 %. Only the results from one iteration are shown, because the results did not differ between the iterations, since the costs are calculated in the same order every time.


Figure 3.7: Execution time vs Tolerance

As shown in the plot in figure 3.7, the run-times differ somewhat between the different iterations. The maximal difference between two iterations of the same tolerance is between 1.5 and 2.0 seconds.


Chapter 4

Discussion

This final chapter covers some discussion of the results, the method and what future work should consist of, as well as a closing conclusion. The last research question is also answered, in relation to both chapters 2 and 3.

4.1

Results

4.1.1

UV mapping

As seen in chapter 2, section 2.4, the results were as expected. I was not interested in measuring the execution time of the process because it had to be done either way. The company needs the models to include a UV map; otherwise their entire idea of showing them in a real-time environment would become pointless, since the models would lack appropriate textures.

4.1.2

Polygon reduction

There were many different results in chapter 3, and there is a reason for it: working with small models allows the computations to be affected by multiple factors, including the other processes running on the test system. In fact, the tests were not made in an isolated computer environment; they were instead run on my home computer with several other applications running at the same time, which may have had a part in the differences between the iterations. If the test had included a larger model, something around a couple of hundred megabytes in size, the differences between the iterations might have been proportionally smaller.

At first, the visual results of the models were severely disappointing: the topology was often ruined and one could barely make out a bunny in the model. The solution actually came to me when I wrote the theory part of this thesis: I had forgotten a very important step in the re-referencing part of the collapse algorithm, which meant that the sets of neighbors of the remaining vertices did not include the candidate. The remaining faces did not include the candidate either. After revising the collapse algorithm, this was solved and the models instantly looked much better.

4.2

Method

Since the method of this thesis was somewhat vaguely defined, it was difficult to establish a systematic approach to revisiting certain parts of the development phase. For example, it was clear to me from the beginning that Blender would be my choice of 3D editing software, yet developing a UV mapper from scratch might have been perfectly feasible within the scope of this thesis. It was also difficult to find meaningful resources of real scientific weight for chapter 2. While Mullen [7] has been cited quite a few times, that does not inherently mean the book is appropriate for a thesis.

If I were to redo this project, I probably would not have divided the thesis into two separate sub-projects. This turned out to be confusing in parts, and the two chapters differ a great deal in the scientific weight of their respective references. Hugues Hoppe [4] is a highly acknowledged researcher who has spent a great amount of time on research relevant to 3D editing and rendering, which makes his sources and claims credible as they stand. In turn, this means that the article by Melax [6], which builds on theory from Hoppe, is credible as well.

4.2.1 Replicability, Reliability and Validity

It is my belief that this thesis would be fully replicable using my methods, since everything used is either free or open source. I would not claim my methods to be the best, but they clearly did produce decent results.

4.3 The work in a wider context

The only real ethical aspect of this thesis is the choice of software license. The Blender license [2] allows users to distribute their own work, and XperDi will most likely release the software created in relation to this thesis under the same license as Blender.


4.4 Conclusion

4.4.1 Answer to the first research question

How can one automate the process of creating UV maps for an arbitrary amount of models? In chapter 2, it was shown that it is both possible and feasible to automate the process of creating a UV map for exactly one model, and how this process can be structured. To extend this to an arbitrary amount of models, it is trivial to iterate through a set of models and execute the script once for each of them.
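The iteration itself can be sketched in a few lines. The function `uv_map_model` below is a placeholder for the per-model script from chapter 2, not a name taken from the thesis:

```python
# Hypothetical batch driver: execute the per-model UV mapping step
# once for each model file. uv_map_model stands in for the actual
# Blender-based script described in chapter 2.

def uv_map_all(model_paths, uv_map_model):
    """Run uv_map_model on every .obj file, in a stable order."""
    return [uv_map_model(path)
            for path in sorted(model_paths)
            if path.endswith(".obj")]
```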

4.4.2 Answer to the second research question

Is there a way to efficiently optimize the models with regard to performance during run-time? In chapter 3, it was shown that there is indeed a way to optimize the models such that it is feasible to do so at run-time, using an already established algorithm both for calculating the cost of each reduction step and for the actual deletion of the data.
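The cost heuristic referred to here is Melax's [6]. A minimal sketch under simplified data-structure assumptions (the dictionaries `pos`, `vert_faces` and `normal` are my own illustration, not the thesis script's actual structures):

```python
# Sketch of Melax's edge-collapse cost [6]: the length of edge (u, v)
# scaled by a curvature term that compares the normals of u's faces
# with those of the faces shared by u and v. A flat neighborhood
# costs 0, so coplanar regions are simplified first.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def edge_cost(u, v, pos, vert_faces, normal):
    """pos: vertex -> coordinates; vert_faces: vertex -> set of face
    ids; normal: face id -> unit normal vector."""
    shared = vert_faces[u] & vert_faces[v]
    curvature = 0.0
    for f in vert_faces[u]:
        min_curv = min((1.0 - dot(normal[f], normal[s])) / 2.0
                       for s in shared)
        curvature = max(curvature, min_curv)
    return math.dist(pos[u], pos[v]) * curvature
```

Collapsing always picks the cheapest edge first, which is why the deletion step and the cost calculation together form the whole run-time algorithm.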

4.4.3 Answer to the third research question

UV mapping

Would it be best to use proprietary, open source or self-written software to answer the previous questions? Given that the company is a start-up with many concerns, it would be perfectly fine to use the method described in chapter 2 to automate the UV mapping process. The results are as expected, and they are definitely good enough for a proof of concept. However, I reckon that as the company grows, decoupling from third-party software will become necessary. It would make sense for them to write their own UV mapping algorithm without the use of open source software, just to keep everything in-house.

Polygon reduction

Would it be best to use proprietary, open source or self-written software to answer the previous questions? For chapter 3, this question is a bit more interesting. This thesis showed that it is feasible to write your own software to perform polygon reduction on a number of models in real time, but that does not change the fact that the research community has been working on this problem since 3D graphics first became popular, and as a result there are a great many established tools, under varying licenses, that perform this task. Then again, in light of XperDi being a start-up, the implementation in chapter 3 would suffice for a proof of concept.


4.4.4 Future work

From my point of view, the main issue with the quality of the developed scripts is their execution times. A model of 200 kilobytes is regarded as small in the CAD world, and in the worst case such a model takes more than 11 seconds to optimize. This is not good enough, and the reason, apart from the algorithm itself, is that the scripts were developed without multi-threading in mind. For both the UV mapping and the polygon reduction scripts, multi-threading would most likely cut execution times by a fair amount. The future work of this thesis and the resulting software would be to implement the scripts in a multi-threaded fashion, and to test them in a real-world rendering environment with dedicated rendering systems rather than on an office computer.
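As a sketch of what that future work could look like: since the scripts are CPU-bound Python, worker processes rather than threads are what would actually cut wall-clock time. The names below (`optimize_model`, `optimize_all`) are placeholders of my own; the per-model worker pattern is the point:

```python
# Hypothetical parallel driver for the future work: optimize each
# model in its own worker. For CPU-bound Python code a process pool
# sidesteps the interpreter lock; the executor class is a parameter
# so the same driver also runs on a thread pool.
from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

def optimize_model(path):
    # Placeholder for the polygon reduction script from chapter 3.
    return path

def optimize_all(paths, workers=4, executor_cls=ProcessPoolExecutor):
    with executor_cls(max_workers=workers) as pool:
        return list(pool.map(optimize_model, paths))
```

Because the models are independent of one another, this kind of parallelism requires no changes to the per-model algorithm itself.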


References

[1] Matthew Fisher. The Stanford bunny. https://graphics.stanford.edu/~mdfisher/Data/Meshes/bunny.obj. Accessed 2016-05-17.

[2] Blender Foundation. Blender license. https://www.blender.org/about/license/. Accessed 2016-04-14.

[3] Free Software Foundation. GNU General Public License. http://www.gnu.org/licenses/gpl-3.0.en.html. Accessed 2016-04-14.

[4] Hugues Hoppe et al. “Mesh optimization”. In: Proceedings of the 20th annual conference on Computer graphics and interactive techniques. ACM. 1993, pp. 19–26.

[5] Ideasman42. BlenderAsPythonModule. https://wiki.blender.org/index.php/User:Ideasman42/BlenderAsPyModule. Accessed 2016-02-05.

[6] Stan Melax. “A simple, fast, and effective polygon reduction algorithm”. In: Game Developer 11 (1998). http://dev.gameres.com/program/visual/3d/PolygonReduction.pdf. Accessed 2016-01-29.

[7] Tony Mullen. Mastering Blender. John Wiley & Sons, 2011. ISBN: 978-0-470-40741-7.

[8] Martin Reddy. OBJ Specification. http://www.martinreddy.net/gfx/3d/OBJ.spec. Accessed 2016-02-17.

[9] Winston W. Royce. Managing the development of large software systems. https://www.cs.umd.edu/class/spring2003/cmsc838p/Process/waterfall.pdf. Accessed 2016-04-13.

[10] ton. FBX binary file format specification. https://code.blender.org/2013/08/fbx-binary-file-format-specification/. Accessed 2016-04-10.

[11] Regents of the University of California. BSD License. https://opensource.org/licenses/bsd-license.php. Accessed 2016-04-14.


Code listings

1 STEP file snippet

2 .obj file example

3 .fbx snippet from [10]

4 UV mapping script

5 Cost calculating script

6 Vertex collapse script


List of Figures

1.1 StepImporter

2.1 UV mapping menu in Blender

2.2 3D model without a UV map

2.3 3D model with a UV map

3.1 A K4 graph

3.2 The result

3.3 The original mesh

3.4 Reduced mesh with 2.5 tolerance

3.5 Reduced mesh with 5.0 tolerance

3.6 Vertex reduction vs tolerance
