 Post subject:
PostPosted: Tue Oct 03, 2006 6:16 pm 
DAPHNE Bronze Donator

Joined: Thu Apr 27, 2006 5:07 am
Posts: 57
Location: Near Atlanta, GA
You can donate HERE.

That link is found HERE.


 Post subject:
PostPosted: Wed Oct 04, 2006 1:13 am 
Grizzled Veteran

Joined: Thu Feb 21, 2002 1:00 am
Posts: 247
I agree with matt btw... interlaced is crap.

Just for the record, too, there is a big difference between interlaced NTSC on a TV and interlaced VGA on an arcade monitor. Matt and Warren would be the experts on this, but it was my understanding that the interlacing was done in the laserdisc player and not through the arcade PCB, which is kind of akin to progressive-scan DVD players today... it doesn't look great on a standard TV, but it is acceptable. On the other hand, on a VGA monitor it flickers so badly that some people get headaches from it.

Now I don't know what the technical difference is between the two connection methods, but I personally can see the difference.


 Post subject:
PostPosted: Wed Oct 04, 2006 3:53 am 
DAPHNE Team

Joined: Thu Feb 08, 2001 1:00 am
Posts: 906
Location: Earth
HowardC wrote:
I agree with matt btw... interlaced is crap.


All standard NTSC TVs need interlaced sources, and all NTSC sources need interlaced displays. NTSC laserdiscs are therefore interlaced, which means they look best when displayed on an interlaced monitor such as a standard television.

Interlacing only looks like "crap" when you are playing it back on a display it wasn't made for. It actually looks quite good when displayed properly.

This is why I have a television in my Daphne cab. No conversions (degradations) are necessary like you would need for an arcade monitor. Even the original Dragon's Lair cabinet had an NTSC decoder in order to use an arcade monitor. I've removed that link in the chain for a superior picture, direct from the interlaced source (laserdisc) to the interlacing display (television).
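To make the frame/field distinction concrete, here is a tiny Python sketch (illustrative only, not from any Daphne code) of how an interlaced display delivers one 480-line frame as two 240-line fields:

```python
# Sketch (not Daphne code): an interlaced display delivers one
# 480-line NTSC frame as two 240-line fields, drawn 1/60 s apart.

def split_into_fields(frame):
    """Split a list of scanlines into (even_field, odd_field)."""
    even_field = frame[0::2]  # scanlines 0, 2, 4, ...
    odd_field = frame[1::2]   # scanlines 1, 3, 5, ...
    return even_field, odd_field

frame = [f"line {n}" for n in range(480)]
even, odd = split_into_fields(frame)
# Each field carries half the vertical resolution of the full frame.
assert len(even) == 240 and len(odd) == 240
```

A progressive display has to weave or blend these two fields back together, which is exactly the conversion step (and potential degradation) being discussed above.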


 Post subject:
PostPosted: Wed Oct 04, 2006 9:35 am 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
Huggybaby wrote:
You can donate HERE.


Thanks for the link, Huggybaby, I appreciate it.


 Post subject: Accidental double post?
PostPosted: Wed Oct 04, 2006 9:45 am 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
:oops:


Last edited by lowlight on Wed Oct 04, 2006 5:20 pm, edited 1 time in total.

 Post subject: Re: Thanks for the clarification...
PostPosted: Wed Oct 04, 2006 9:47 am 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
chip wrote:

all standard NTSC tvs need interlaced sources. All NTSC sources need interlaced displays. NTSC laserdiscs are therefore interlaced, which means they look best when displayed on an interlaced monitor such as a standard television...

This is why I have a television in my daphne cab. No conversions (degradations) are necessary like you would need for an arcade monitor...


Ah, I see. That being the case, what are your thoughts on the interlace switch value that Warren employs in his modeline values? If the original source footage hasn't been de-interlaced (a straight extraction of a laserdisc, for instance), would the resulting image be closer to the authenticity that the original game and a normal television set can provide with such a switch (-interlaced), or do you feel it would have minimal effect? Of course, this question assumes that either an ArcadeVGA card will be used and/or 15 kHz will be used/forced via the cab's JAMMA harness (J-Pac adapter). Granted, once I get everything going, I'll ultimately end up using a modeline with or without the interlace switch and end up being the judge of which looks better, but the added points of view are always welcome and helpful!
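For anyone curious what that switch looks like in practice, here is a rough illustration of an X11 modeline carrying the Interlace flag. The timing numbers below are made up for illustration and are not Warren's actual values; a real 15 kHz mode needs timings matched to your particular monitor.

```
# Illustrative only -- these are NOT Warren's actual timings.
# An interlaced mode in an X11 config carries the "Interlace" flag
# at the end of the ModeLine:
#
#             pixclk  hdisp hss  hse  htot  vdisp vss vse vtot  flags
ModeLine "640x480i"  13.10  640  664  728  840   480  484 490 520  -HSync -VSync Interlace
```

With numbers like these, the horizontal rate works out to 13.10 MHz / 840 ≈ 15.6 kHz, and the Interlace flag splits the 520 total lines into two fields, giving roughly 60 fields per second.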


Last edited by lowlight on Wed Oct 04, 2006 5:23 pm, edited 2 times in total.

 Post subject:
PostPosted: Wed Oct 04, 2006 4:42 pm 
DAPHNE Creator

Joined: Sat Jan 20, 2001 1:00 am
Posts: 2127
Location: Salt Lake City, Utah, USA
chip wrote:
This is why I have a television in my daphne cab. No conversions (degradations) are necessary like you would need for an arcade monitor. Even the original dragon's lair cabinet had an ntsc decoder in order to use an arcade monitor. I've removed that link in the chain for a superior picture direct from the interlaced source (laserdisc) to the interlacing display (television).


Well, one thing I am not sure about: although the laserdisc is interlaced, is the video overlay (for non-Dragon's Lair games) interlaced?

I agree that a TV looks pretty good when using an S-Video cable (assuming you don't get unnatural flickering with the video overlay, which shouldn't be interlaced), but when using a composite cable with the chroma crawl, it looks like schlop (IMO), especially on the video overlay of games like Cliff Hanger ... :)

I personally don't really want my Dragon's Lair to look like it's running on NTSC; I would prefer 24 FPS, high-resolution, non-interlaced, no-chroma-crawl images, such as what Digital Leisure is doing with their new "HD" releases. Hence I want something like the awesome Wells Gardner U3100 (or whatever it is) with direct VGA input... mmm...


 Post subject:
PostPosted: Wed Oct 04, 2006 5:33 pm 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
Matt Ownby wrote:
Well, one thing I am not sure about is although the laserdisc is interlaced, is the video overlay (for non-Dragon's Lair games) interlaced?


That's a great point you bring up, and one I hadn't thought about until you mentioned it (overlay-based games). As I was able to procure an LD copy of MACH3 recently, that too is something I'll have to test out and see. Like I mentioned before, once I get everything going and set up correctly (hopefully), I look forward to documenting the specific differences each type of video scenario would garner when it comes to arcade/VGA monitors.

Matt Ownby wrote:
I personally don't really want my dragon's lair to look like its running on NTSC, I would prefer 24 FPS high resolution non-interlaced no-chroma-crawl images, such as what Digital Leisure is doing with their new "HD" releases. Hence I want something like the awesome Wells Gardner U3100 (or whatever it is) with direct VGA input... mmm...


Hmm, I guess I'm just a bit too old school then, unfortunately, since I'd actually prefer to see it the way I remember seeing it back in the '80s :( . While HD is very nice, for me a game like DL doesn't feel right if it's free of all of the original visual defects that were produced by the hardware; that's what feels authentic to me. It's sort of akin to those who prefer the look of actual pixels compared to soft filtered pixels for a lot of the classic arcade games. But again, that's purely a personal preference and what I'm hoping to achieve without too much degradation.


 Post subject:
PostPosted: Thu Oct 05, 2006 7:43 am 
Grizzled Veteran

Joined: Thu Feb 21, 2002 1:00 am
Posts: 247
chip wrote:
HowardC wrote:
I agree with matt btw... interlaced is crap.


all standard NTSC tvs need interlaced sources. All NTSC sources need interlaced displays. NTSC laserdiscs are therefore interlaced, which means they look best when displayed on an interlaced monitor such as a standard television.

Interlacing only looks like "crap" when you are playing it back on a display it wasn't made for. It actually looks quite good when displayed properly.



Well, with that being said, you can't get a true NTSC signal from a PC without a great deal of effort, because it is using TV-out scan conversion. While technically TV-out is NTSC, the downsampling and overscan "crunching" going on will give you some serious crawl lines.

I use a TV as well and prefer it, but since he isn't using a TV this is a non-issue; like you said, it only looks OK if you are using the intended display.


 Post subject:
PostPosted: Thu Oct 05, 2006 1:12 pm 
DAPHNE Team

Joined: Fri Jul 27, 2001 1:00 am
Posts: 295
Wow, lots of interesting issues here...

HowardC wrote:
Well with that being said, you can't get a true ntsc signal from a pc without a great deal of effort because it is using tv-out scan conversion. While technically tv-out is ntsc the downsampling and overscan "crunching" going on will give you some serious crawl lines.


You can get proper NTSC through the TV out if you use 640x480 with full overscan - no downsampling, no crunching. If there's a 'flicker filter' setting, you'd probably want that off, unless you want a little blurring.

Not all card/OS/driver combinations make this possible, though. I had to use an older nVidia driver for Linux when I set up tv out on a Gf4 -- the newest version broke the overscan option.

If you're using the RGB outputs at 15 KHz, then the rescaling issues go away entirely. You might still want to tweak the custom timing values to get more or less overscan, or adjust the H/V widths on the monitor, but that would be about it.

Matt Ownby wrote:
Well, one thing I am not sure about is although the laserdisc is interlaced, is the video overlay (for non-Dragon's Lair games) interlaced?


Yes and no. :) Technically, the overlay is interlaced, as it becomes part of the one video signal going to the monitor. However, most laser games don't change the overlay during a frame, so both interlaced fields of the frame have the same graphics. You can see a small amount of flicker (not much since the graphics are the same in both fields), but no extra resolution.

I think a few games (Astron Belt?) may update the display on every field, though, making it a 'proper' interlaced output. This doesn't add more resolution; it just makes motion a bit smoother.

(Side note: there are a few non-laser games that use interlace - Spy Hunter and Rampage come to mind. The display is more detailed, but flickers a bit.)

In the case of Daphne, the overlay graphics are updated once per frame, so the interlaced fields will always be the same. I suppose it would be nice to have an option to double the update speed, but this would be a minor difference and cause a big performance hit. Might still be worth a look someday...
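As a rough sketch of that difference (hypothetical Python, not Daphne's actual renderer): with one overlay render per frame, a moving sprite is drawn at the same position on two consecutive fields, while per-field rendering advances it on every field:

```python
# Sketch (not Daphne's actual code): why per-frame overlay rendering
# halves apparent motion.  A sprite that should move 1 px per field
# either gets a fresh position every field, or repeats each position
# on two consecutive fields.

def sprite_positions(fields: int, per_field: bool):
    """X position of the sprite shown on each successive field."""
    return [f if per_field else f // 2 * 2 for f in range(fields)]

# Per-frame update: each position is drawn on two consecutive fields.
assert sprite_positions(6, per_field=False) == [0, 0, 2, 2, 4, 4]
# Per-field update: the sprite advances on every field (smoother motion).
assert sprite_positions(6, per_field=True) == [0, 1, 2, 3, 4, 5]
```

The total distance covered is the same; only the granularity of the motion changes, which is the "minor difference" versus "performance hit" trade-off described above.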

lowlight wrote:
While HD is very nice, for me, a game like DL doesn't feel right if it's free of all of the original visual defects that were produced by the hardware; that's what feels authentic to me.


I'm with you on this one -- cleaned up and/or HD versions of these games look really nice, but I also appreciate seeing the artifacts of the original technology. (I know Matt doesn't like the -seekdelay feature either, but I love it. :P I find that instant seeks throw off the 'rhythm' of the game.)

lowlight wrote:
If the original source footage hasn't been de-interlaced (a straight extraction of a laserdisc, for instance), would the resulting image be closer to the authenticity that the original game and a normal television set can provide with such a switch (-interlaced), or do you feel it would have minimal effect?


If you're using an interlaced display, I'd say use an interlaced mpeg captured from the laserdisc. I'm not 100% sure how libmpeg2 handles 3:2 pulldown for 24fps video streams, so you might not get the exact 3:2 field sequence. I can't say I've ever observed this difference when comparing the two, so it may be a non-issue. Some games have video that doesn't deal with inverse-telecine / deinterlacing well, so you'd definitely want to use an interlaced capture for an interlaced display for them.
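For reference, the 3:2 pulldown mentioned here expands 24 film frames per second into 60 interlaced fields per second by alternating three and two fields per frame. A small Python sketch (illustrative; this is not libmpeg2's actual implementation):

```python
# Sketch of 3:2 pulldown: 24 film frames/s become 60 interlaced
# fields/s by repeating each frame in an alternating 3, 2, 3, 2 pattern.

def pulldown_32(frames):
    """Expand a list of film frames into the 3:2 field sequence."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

assert pulldown_32(["A", "B", "C", "D"]) == \
    ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
# Over one second: 24 frames * 2.5 fields/frame on average = 60 fields.
assert len(pulldown_32(list(range(24)))) == 60
```

An inverse-telecine step has to detect this repeating pattern to recover the original 24 fps frames, which is why footage that breaks the pattern deinterlaces poorly.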

While we're talking about maximum authenticity, I should probably mention that the finer dot pitch on the tri-sync monitor may make the interlace scanlines a bit more apparent than on a plain standard-res monitor or a TV.

On the other hand, a TV will have somewhat different NTSC to RGB decoding than the original game's monitor, so that still looks a bit different. For games with overlay, a TV will be significantly less sharp, or add NTSC decoding artifacts if you use a composite connection.

So what you really need is a standard-res arcade monitor, with the original NTSC decoder for non-overlay games, and a 15KHz RGB connection for overlay games. Sorry, everyone... :o

Hehe

-Warren.


 Post subject:
PostPosted: Thu Oct 05, 2006 7:28 pm 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
Warren Ondras wrote:
Wow, lots of interesting issues here...



Thanks for taking the time to address each one, I appreciate the knowledge.

Warren Ondras wrote:
If you're using the RGB outputs at 15 KHz, then the rescaling issues go away entirely. You might still want to tweak the custom timing values to get more or less overscan, or adjust the H/V widths on the monitor, but that would be about it.


Not a problem (adjusting for H/V). The rest of the adjustments will come down to trial and error, I'm sure.

Warren Ondras wrote:
Yes and no. :) Technically, the overlay is interlaced, as it becomes part of the one video signal going to the monitor. However, most laser games don't change the overlay during a frame, so both interlaced fields of the frame have the same graphics. You can see a small amount of flicker (not much since the graphics are the same in both fields), but no extra resolution.


Thanks for the clarification, as I was actually confused about this bit, myself.

Warren Ondras wrote:
While we're talking about maximum authenticity, I should probably mention that the finer dot pitch on the tri-sync monitor may make the interlace scanlines a bit more apparent than on a plain standard-res monitor or a TV.


Actually, the scanlines are quite subtle on this Nanao arcade monitor for most standard-resolution games. The only JAMMA games that seem to draw attention to the lines are those by Data East (I'm not quite sure why), and even then it is nowhere near as conspicuous as the faux-scanline feature provided by most general arcade emulators. Unlike the WG-9000 tri-sync monitors many folks are familiar with here in the States, the Nanao monitors that Taito and SEGA use for their cabinets typically isolate the circuitry within the chassis itself. In my case, the 31 kHz signal is processed through a separate set of circuitry within the chassis, which is only available through the installed VGA cable; 15 kHz and medium resolutions get sent to the other side via the JAMMA harness and the RGB block within the cab.


 Post subject:
PostPosted: Fri Oct 06, 2006 9:44 am 
DAPHNE Bronze Donator

Joined: Thu Apr 27, 2006 5:07 am
Posts: 57
Location: Near Atlanta, GA
lowlight, I just rediscovered this link: http://www.daphne-emu.com/donations/.

I was on the list briefly. :wink: :)


 Post subject:
PostPosted: Sat Oct 07, 2006 12:23 am 
DAPHNE Team

Joined: Mon Feb 12, 2001 1:00 am
Posts: 164
Warren Ondras wrote:
I think a few games (Astron Belt?) may update the display on every field, though, making it a 'proper' interlaced output. This doesn't add more resolution, it just makes a motion a bit smoother.


It isn't just Astron Belt, it's all the overlay games in Daphne. Run AB or Bega's Battle in NOLDP mode and you'll see how much more smoothly they run at 60Hz as opposed to 30Hz. Playing them in VLDP mode just doesn't do the originals justice. This is a longstanding issue that should be addressed. CPU speed has caught up enough that speed isn't an issue anymore.

Warren Ondras wrote:
In the case of Daphne, the overlay graphics are updated once per frame, so the interlaced fields will always be the same. I suppose it would be nice to have an option to double the update speed, but this would be a minor difference and cause a big performance hit. Might still be worth a look someday...


I once had a build of Daphne that did exactly that. It was kind of a hack so it was never submitted, but it makes more than a minor difference. Games that extensively use sprites for the gameplay are much more playable (especially Bega's), and everything looks much smoother. If speed is really a concern it could be an option.

Mark


 Post subject: Improving Overlay-Based Game Performance
PostPosted: Sat Oct 07, 2006 8:34 am 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
Mark Broadhead wrote:
I once had a build of Daphne that did exactly that (Doubling the update speed for Overlay Based games). It was kind of a hack so it was never submitted, but it makes more than a minor difference. Games that extensively use sprites for the gameplay are much more playable (especially Bega's), and everything looks much smoother. If speed is really a concern it could be an option.

Mark


Hi Mark, if you happen to still have the build of Daphne you modified to compensate for the overlay speed issue (source code or a compiled Linux version, of course), I'd be very interested in taking a look at it myself, unless of course the upcoming version of Daphne will address this issue. How much system power (CPU/RAM) is required to appreciate the changes you made to the Daphne code?


 Post subject:
PostPosted: Fri Oct 13, 2006 4:27 am 
DAPHNE Creator

Joined: Sat Jan 20, 2001 1:00 am
Posts: 2127
Location: Salt Lake City, Utah, USA
Mark Broadhead wrote:
It isn't just Astron Belt, it's all the overlay games in Daphne. Run AB or Bega's Battle in NOLDP mode and you'll see how much more smoothly they run at 60Hz as opposed to 30Hz. Playing them in VLDP mode just doesn't do the originals justice. This is a longstanding issue that should be addressed. CPU speed has caught up enough that speed isn't an issue anymore.


Mark, check the latest daphne (0.101.16) in OpenGL mode heh heh heh ...


 Post subject: interesting development...
PostPosted: Fri Oct 13, 2006 6:42 am 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
Matt Ownby wrote:
Mark, check the latest daphne (0.101.16) in OpenGL mode heh heh heh ...


A beta, eh? Looks like I may have to head to the chat room....


 Post subject:
PostPosted: Sat Oct 14, 2006 4:49 am 
DAPHNE Team

Joined: Mon Feb 12, 2001 1:00 am
Posts: 164
Matt Ownby wrote:
Mark, check the latest daphne (0.101.16) in OpenGL mode heh heh heh ...


Wow, seeing my spaceship blow up so smoothly in Astron Belt is so cool! Thanks Matt :)


 Post subject:
PostPosted: Sat Oct 14, 2006 2:47 pm 
DAPHNE Team

Joined: Fri Jul 27, 2001 1:00 am
Posts: 295
Matt Ownby wrote:
Mark, check the latest daphne (0.101.16) in OpenGL mode heh heh heh ...


Dangit, my trusty (crusty?) old GeForce3 doesn't do OpenGL 2.0! :(

What's a decent mid-range card that doesn't cost a fortune? I could probably get by with an nVidia 6200 or ATI 9550, but is there a better card for 3D that doesn't cost much more?

/me hates wading through Tom's and Anandtech...


 Post subject:
PostPosted: Sat Oct 14, 2006 3:56 pm 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
Warren Ondras wrote:
Dangit, my trusty (crusty?) old GeForce3 doesn't do OpenGL 2.0! :(

What's a decent mid-range card that doesn't cost a fortune?


Great question; here's another. In addition to what would be a good OGL 2.0 card, would that same card also be able to support 15 kHz with appropriate modeline settings in *NIX? If not, and 31 kHz has to be used, which kind of card with OGL 2.0 support would be best supported in a Debian install?


 Post subject:
PostPosted: Sat Oct 14, 2006 5:43 pm 
DAPHNE Creator

Joined: Sat Jan 20, 2001 1:00 am
Posts: 2127
Location: Salt Lake City, Utah, USA
Warren Ondras wrote:
What's a decent mid-range card that doesn't cost a fortune? I could probably get by with an nVidia 6200 or ATI 9550, but is there a better card for 3D that doesn't cost much more?


I can confirm that the 6200 does work.

I have a 6800 ultra myself.

I don't know what the current prices are right now so that's about as much as I can tell you :)


 Post subject:
PostPosted: Sun Oct 15, 2006 6:42 am 
DAPHNE Gold Donator

Joined: Wed Sep 27, 2006 11:32 am
Posts: 82
Location: California
Matt Ownby wrote:
I can confirm that the 6200 does work.

I have a 6800 ultra myself.

I don't know what the current prices are right now so that's about as much as I can tell you :)


That's cool to know. Perhaps Warren can confirm this, but I just found out that for proper OGL support to work correctly with NVIDIA and 3D apps within X, the initiating user (other than root) should be part of the "video" group. Any thoughts (Matt or Warren)?

