
Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - Valendian

1
Gaming Discussion / Re: Vagrant Story Graphics Hack Released.
« on: September 14, 2014, 06:34:55 am »
Hmm... Model Swap + Twin Blades + All Debug Rooms

Just a graphics hack you say?

2
In answer to the question cross-posted to the following forum:

http://www.gamefaqs.com/boards/914326-vagrant-story/70000670#8

I thought it best to keep all this in one place.

Quote from: creeperton
Ok, time for an update.

Vehek found the value I need to change to make that one code permanent. He found it in BATTLE.ARC.
http://www.romhacking.net/forum/index.php?topic=18596

It turns out that BATTLE.ARC is compressed. It uses LZS compression, which is handy because there are tools to work with that at Qhimm.
http://forums.qhimm.com/index.php?topic=15521

LZS tools:
http://forums.qhimm.com/index.php?topic=15325.0

An *.ARC file is an uncompressed archive.
http://biolab.warsworldnews.com/viewtopic.php?f=3&t=125

I split BATTLE.ARC into its 2 subfiles, subfile #0 and subfile #1. #0 apparently doesn't use LZS compression - I'll scan it with TrID later. #1 is LZS and it decompresses just fine.
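For reference, the LZS scheme these tools handle is a variant of classic LZSS: a 4-byte little-endian payload length, then flag bytes where each bit selects either a literal byte or a two-byte back-reference into a 4 KiB ring buffer. The header and offset conventions below are assumptions based on the common FF7-style format; check the Qhimm threads for the authoritative details. A minimal sketch:

```python
def lzs_decompress(data: bytes) -> bytes:
    """Decompress an FF7-style LZS stream (assumed layout:
    4-byte little-endian payload length, then LZSS data)."""
    payload_len = int.from_bytes(data[0:4], "little")
    pos, end = 4, 4 + payload_len
    window = bytearray(0x1000)      # 4 KiB ring buffer
    wpos = 0xFEE                    # conventional LZSS start position
    out = bytearray()

    def emit(b: int) -> None:
        nonlocal wpos
        out.append(b)
        window[wpos] = b
        wpos = (wpos + 1) & 0xFFF

    while pos < end:
        flags = data[pos]; pos += 1
        for bit in range(8):
            if pos >= end:
                break
            if flags & (1 << bit):              # bit set: literal byte
                emit(data[pos]); pos += 1
            else:                               # bit clear: back-reference
                b1, b2 = data[pos], data[pos + 1]; pos += 2
                offset = b1 | ((b2 & 0xF0) << 4)
                length = (b2 & 0x0F) + 3
                for i in range(length):
                    emit(window[(offset + i) & 0xFFF])
    return bytes(out)
```

For example, a stream of eight literals followed by one 3-byte back-reference decodes to eleven bytes of output.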

My current questions for you are semi-related to these things.

My spreadsheets don't work too well because I don't have a tool that lets me import multiple files into a disc image all at once. This is a problem because I have a directory of 256 files that I need to import into a disc image. CD Mage doesn't let me do this, nor does CD Tool or cdprog.

The solution is to work directly with the disc image.

When I was making my magic-only mod I needed to calculate the defense values of the armors in the game. This is a pain, because there are 5 bytes which determine these values. They work like this:

base defense = signed byte
defense boost 1 = signed byte
defense boost 2 = signed byte
defense selector 1 = 8 bits, 1 for each element
defense selector 2 = 8 bits, 1 for each element

=IF(defense selector 1 = yes)AND(defense selector 2 = yes)
THEN(effective defense = base defense + defense boost 1 + defense boost 2)

=IF(defense selector 1 = yes)AND(defense selector 2 = no)
THEN(effective defense = base defense + defense boost 1)

=IF(defense selector 1 = no)AND(defense selector 2 = yes)
THEN(effective defense = base defense + defense boost 2)

=IF(defense selector 1 = no)AND(defense selector 2 = no)
THEN(effective defense = base defense)

The actual spreadsheet is more involved than this; this is just a summary.
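In code, the four cases above collapse to a pair of conditional adds. A minimal sketch (the parameter names are mine, matching the summary above, not the game's internal names):

```python
def effective_defense(base: int, boost1: int, boost2: int,
                      selector1: int, selector2: int, element_bit: int) -> int:
    """Compute effective defense for one element.

    base, boost1, boost2 are the three signed bytes; selector1 and
    selector2 are the two 8-bit masks with one bit per element;
    element_bit is the bit index (0-7) of the element being tested.
    """
    defense = base
    if selector1 & (1 << element_bit):   # selector 1 = yes
        defense += boost1
    if selector2 & (1 << element_bit):   # selector 2 = yes
        defense += boost2
    return defense
```

For example, with base 10, boosts 5 and 3, and both selector bits set for element 0, the effective defense is 18.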

My point is that after doing this I realized that I can easily calculate the base address to patch for each file in the disc image, but only if I know exactly how those addresses are laid out in the disc image.

Is there a specification for how *.bin/*.cue disc images are organized? Like, is there a header which lists the starting point of each file? Is there some relationship between the size, name, type, or location of a file in the human-readable disc image you see when you open it in CD Mage, and the location of that file in the human-unfriendly version?

Also, what is error correction data? How can I locate it? Is it located at certain fixed places in a disc image (every 40,000 bytes)?

For example, let's say I have a disc image called "game.bin" with 3 files in it.

file        length (bytes)
game.exe    80000
vid.str     200000
info.arc    40000

How would I find the location and length of each of these files in game.bin? How would I find error correction data?
---
http://biolab.warsworldnews.com/index.php
SaGa Frontier Community Forum

Yes, there are standard specifications, a whole host of them. The physical disc formats are informally known as "The Rainbow Books", and the file system layer is standardised as ISO 9660; the freely available equivalents are ECMA-119 (the file system) and ECMA-130 (the physical format). You don't need to worry about this stuff unless you are building CD authoring software, but I will give you an overview of the topic and you can look further if you wish.

http://www.ecma-international.org/publications/standards/Ecma-119.htm
http://www.ecma-international.org/publications/standards/Ecma-130.htm

ECMA-130 specifies the physical structure of a CD-ROM. It details how the sectors are laid out and the various formats a sector may take. The error correction is found within the sector: each sector contains its own error detection and correction data. There are different sector formats for different purposes. Music CDs don't require strong error correction, since a player can simply interpolate between the last good sample and the next one and no one would be the wiser. Data CDs place a much higher burden on error correction; their sectors sacrifice usable disc space for integrity against defects. It's a compromise between capacity and recoverability.

The error detection and correction is implemented as a Reed-Solomon Product-like Code (RSPC) with two channels. The inner channel (called P parity in ECMA-130) is weaker than the outer channel (Q parity): P parity covers the columns of the sector and Q parity its diagonals, each codeword carrying two check bytes. On its own a codeword can correct a single erroneous byte, but more importantly P parity can flag which bytes are suspect, and Q parity can then use that erasure information to repair them. Taken together, P and Q parity provide almost perfect protection from defects. For your information, the field polynomial used by the parity codes is
 P(x) = x^8 + x^4 + x^3 + x^2 + 1
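Concretely, in a raw .bin image every sector occupies 2352 bytes: 12 sync bytes, a 4-byte header, and (for the Mode 2 Form 1 sectors PSX data uses) an 8-byte subheader before the 2048 bytes of user data, with the EDC/ECC bytes after it. So locating a sector's user data in the image is just arithmetic; a sketch:

```python
SECTOR_RAW = 2352    # raw sector size in a .bin image
USER_DATA = 2048     # user data per Mode 2 Form 1 sector
HEADER_SKIP = 24     # 12 sync + 4 header + 8 subheader bytes

def user_data_offset(lba: int) -> int:
    """Byte offset of a sector's 2048 user-data bytes in a raw .bin."""
    return lba * SECTOR_RAW + HEADER_SKIP
```

This also answers where the error correction lives: the EDC and P/Q parity for each sector sit in the bytes between the user data and the next sector's sync, so they recur every 2352 bytes rather than at one fixed place in the file.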

To understand the virtual file system you will need to read ECMA-119. This will tell you more than you need to know about how files are organised on the physical disc: how sectors are allocated to files, how you can move files to a new range of sectors, how you can resize a file, or even create new files and directories.

The first 16 sectors are reserved for system use. Sony use these sectors to store their license data (which, by the way, has invalid error correction/detection by design). Then sector LBA=16 contains the Primary Volume Descriptor. All PSX discs have only one such descriptor, though there may be multiple descriptors, called Supplementary Volume Descriptors; the PSX doesn't use them. The last volume descriptor is the Volume Descriptor Set Terminator, found at sector LBA=17.

Next follows the path table, which is stored in 4 copies (twice in little endian and twice in big endian, for redundancy). You can use the path tables as a quick way to locate a directory given a file path. The path tables store only references to directories, with no references to the files they contain, so to use the path table method you seek the directory and then scan that directory for the file. It's like a shortcut, but it actually requires that you write more code to make use of it, since scanning directories is all that is actually required and works without the path table.

After the path tables, the next sector begins the root directory record. This contains directory entries for all the subdirectories and files contained within the root directory. Each entry tells you the file name, its size, and its sector number (in the form of an LBA). The only differences between an entry that refers to a subdirectory and an entry that refers to a file are:
[1] a subdirectory has the directory flag set in the flags field;
[2] a file has an extension and a ";1" appended (if there are multiple files with the same name you would have "file.ext;1", then "file.ext;2", "file.ext;3", and so on). The entries are listed in alphabetical order, with no distinction between files and directories.
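The directory records themselves have a fixed layout in ECMA-119: byte 0 is the record length, bytes 2-9 the extent LBA (stored both little- and big-endian), bytes 10-17 the data length, byte 25 the flags, byte 32 the name length, and the name starts at byte 33. A minimal parser for one record, reading from a buffer of sector user data:

```python
def parse_dir_record(buf: bytes, off: int):
    """Parse one ISO 9660 directory record; returns None at padding."""
    rec_len = buf[off]
    if rec_len == 0:                 # zero length marks end-of-sector padding
        return None
    lba = int.from_bytes(buf[off + 2:off + 6], "little")
    size = int.from_bytes(buf[off + 10:off + 14], "little")
    flags = buf[off + 25]
    name_len = buf[off + 32]
    name = buf[off + 33:off + 33 + name_len].decode("ascii")
    return {"name": name, "lba": lba, "size": size,
            "is_dir": bool(flags & 0x02)}
```

The root directory record is itself embedded at offset 156 of the Primary Volume Descriptor, so a directory walk starts by parsing that one record.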

Now, once you get to the stage where you have scanned a directory, found the file you are looking for, and read all the sectors that compose that file into memory, we begin to leave the world of nice specifications and enter the world of custom virtual file systems. This is where a software developer has implemented their own version of a virtual file system that breaks up a huge archive into little bite-sized chunks. This is why many people here will tell you that you don't need to deal directly with the ISO 9660 file system; even so, you may still have to deal directly with sectors and LBAs when manipulating these custom virtual file systems.

How do you find these LBA tables? Well, you need to run pSX with logging enabled and log all CDROM IO. Now just play the game as normal and save the log. Any time you see a line in the log such as

[015b009c] cdrom: setloc 00000000:00000002:00000005

it will be closely followed by lines such as these
[015b00a4] cdrom: read byte 07801800 = 09 (800584dc)
[015b00a4] cdrom: write byte 07801800 = 01 (800584ec)

Those addresses in the brackets,
 (800584dc)
 (800584ec)
are part of the function that issued the setloc. Mark the entry point of that function in your disassembly and place breakpoints at it. Then follow execution to the jr without entering function calls (use F7 to step into; if you reach a jal, use F6 to step over). Once you have reached a range of addresses that you are familiar with, you can reload a previous save state and inspect what is happening there. Where is the game engine obtaining the LBA? That will almost always be a large table of sector addresses.
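Note that the three setloc parameters are a BCD-coded minute:second:frame address; converting one to the LBA you see in image tools means decoding the BCD and subtracting the 150-sector (2 second) pregap. A small helper, assuming the log prints the raw BCD values:

```python
def bcd(b: int) -> int:
    """Decode one binary-coded-decimal byte (e.g. 0x25 -> 25)."""
    return (b >> 4) * 10 + (b & 0x0F)

def setloc_to_lba(mm: int, ss: int, ff: int) -> int:
    """Convert a BCD minute:second:frame setloc target to an LBA."""
    return (bcd(mm) * 60 + bcd(ss)) * 75 + bcd(ff) - 150
```

For example, the setloc 00:02:05 in the log line above targets LBA 5.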


3
Newcomer's Board / Re: Vagrant Story [PSX] SHP Model Format
« on: May 03, 2013, 02:02:19 pm »
The PlayStation lacked a floating point unit, so all PSX games use fixed point in place of floating point. The values are stored as normal signed integers. The Q format used can vary widely, but Q16.16 or Q8.8 are the most logical choices. I just do a straight cast to float, so the floats are integers in the range -65535.0 to +65535.0.
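Decoding such a value is just a sign-extended integer divided by a power of two; which power is the part you have to guess per field. A sketch for 16-bit values (frac_bits is the assumption you tune, e.g. 8 for Q8.8):

```python
def q_to_float(raw: int, frac_bits: int) -> float:
    """Interpret a raw 16-bit value as signed fixed point with
    frac_bits fractional bits (e.g. frac_bits=8 for Q8.8)."""
    if raw & 0x8000:        # sign-extend from 16 bits
        raw -= 0x10000
    return raw / (1 << frac_bits)
```

For example, 0x0180 read as Q8.8 is 1.5, and 0xFF00 is -1.0.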

Another thing to watch out for is that the polygons index the vertices by a sort of compromise shifted offset. It's one shift left to make it a true byte offset, and two shifts right to make it an array index.
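The two shifts are mutually consistent if each vertex record occupies 8 bytes (my assumption here): the stored value is then the byte offset halved, so shifting left once recovers the byte offset and shifting right twice recovers the array index. A sketch:

```python
def vertex_byte_offset(raw: int) -> int:
    """One shift left: the stored value is byte_offset / 2."""
    return raw << 1

def vertex_array_index(raw: int) -> int:
    """Two shifts right: equivalent to byte_offset / 8,
    i.e. an index into 8-byte vertex records (assumed stride)."""
    return raw >> 2
```

For example, a stored value of 8 gives byte offset 16 and array index 2, which lines up if vertices are 8 bytes apart.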

There are duplicate polygons. No idea why, but they are there; if you figure out why some polygons need to be duplicated, please let me know.

Some models still don't parse correctly; perhaps these models were works in progress that got cut from the final production.

Anyway, I will share my SHP/WEP model viewer and its source code via PM. It is not animated, but you can learn from it.

Please keep in touch with me.


4
ROM Hacking Discussion / Re: Replacement for CD Mage in the works
« on: March 04, 2012, 11:51:19 pm »
For most PS1 games there are no such tools. I'm thinking of the future reverser who wants to break new ground and explore games that haven't been touched before. That is a situation where generic tools are the only option. You wouldn't need to resort to such a low-level approach if tools that did the job were already available.

I do understand your point about keeping all the files on disk and keeping the image "clean". But you still have to keep a virgin image somewhere and make copies of it that you can mangle to your heart's content. I do this myself. What I'm worried about is the time wasted extracting a file from the image to disk only to open it in a hex editor, make a few edits, and reinsert it back into the image. This could be so much more productive and user friendly. Just stick a plugin architecture onto the image tool: there could be a hex editor plugin, a disassembler plugin, or a TIM plugin. Who knows what a determined user would like to add to it.

5
ROM Hacking Discussion / Re: Replacement for CD Mage in the works
« on: March 04, 2012, 12:04:27 pm »
@Gemini: It's true that the game may ignore the TOC entirely, but that doesn't mean that it isn't important to maintain the TOC. How would you go about extracting/importing files using standard CD image tools if the TOC is invalid? Does it not make sense to keep the files readily accessible? The modder would need to maintain the internal LBAs and file sizes, which is something you would have to do anyway. But the files would still be visible to general-purpose tools, not just dumped somewhere in the image and out of reach.

6
Personal Projects / Re: Code Naturalizer
« on: October 24, 2010, 10:10:50 pm »
So is this gonna be a tool that generates the type of comments an amateur asm coder would write, or will it actually be more of a decompiler?

If it's gonna be a decompiler, I'd suggest you forget about the natural language interface and just focus on decompilation. The ideal user of such a tool would be someone well versed in coding asm and some form of HLL. Cater to this person's needs; let the person who's only learning asm pick things up in their own good time.

The difference is between a tool that looks at a push opcode and says "a value is being pushed on top of the stack", and a tool that analyzes further and finds that this is a local variable kept on the stack, and that it appears to be an unsigned int, or a pointer to a structure made up of 5 ints and 2 chars. It's this kind of feedback that is needed, IMO.

http://www.backerstreet.com/decompiler/introduction.htm
