RESET Forums (homeservershow.com)

Ceton Tuner passed through to a VM on Server 2008 R2


Greg Welch

Recommended Posts

Before I go yanking my Ceton out of my HTPC, where it is working great,

 

I'm interested in the possibility of putting it in my 2008 R2 monster server and mapping it to the HTPC VM that I currently use for commercial skipping.

 

I already have the HTPC VM output the XMLs from ShowAnalyzer to the WHS 2011 Recorded TV folder, and the DVRMS add-in runs on the client HTPC. It works great; commercials are skipping away nicely. There are still a few little tweaks to make, but I'm very happy.
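In case it helps anyone set up something similar, here is a rough sketch that lists any recordings that don't yet have a commercial-skip XML sitting next to them. It is just an illustration, not part of ShowAnalyzer or the DVRMS add-in; the share path and the .wtv/.xml naming convention are assumptions, so adjust them to your own setup.

    # Illustrative sketch only: report recordings that don't yet have a
    # commercial-skip XML beside them. The share path and the .wtv/.xml
    # naming convention are assumptions -- adjust them to your own setup.
    from pathlib import Path

    RECORDED_TV = Path(r"\\SERVER\Recorded TV")  # assumed WHS 2011 share path

    def recordings_missing_xml(folder):
        """Yield .wtv recordings that have no matching .xml next to them."""
        for recording in folder.glob("*.wtv"):
            if not recording.with_suffix(".xml").exists():
                yield recording

    if __name__ == "__main__":
        for rec in recordings_missing_xml(RECORDED_TV):
            print("No commercial-skip XML yet for:", rec.name)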

 

 

So I guess the short question is: has anyone passed the Ceton through to a Windows 7 Ultimate VM?


I have not done it, but if you can install the Ceton in the host and let the host control it, I would think it should work fine. Again, I have not tried it; it's just a theory.


Guest no-control

AFAIK, PCIe pass-through is not available in Hyper-V. It's the primary reason I went with the HDHR a few years ago, and it's also the reason I've been holding out for the HDHR Prime. It's taking too long, and I'm strongly considering the Ceton card for my primary HTPC now.


With the Ceton, you don't have to share it via PCIe pass-through; you can actually share the tuners over the network, in a similar manner to how the HDHomeRuns work. All CableCARD tuners are actually NICs in the computer, so you bridge the Ceton "NIC" with one of your host's NICs, and then those tuners are discoverable by any device on the same subnet. Ceton wrote some nice software that automates the whole process through a GUI; you then run the same software on the "guest" PC that doesn't actually have the Ceton installed but can access it over the network.
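If you want a quick sanity check that the bridge is working before you even involve Media Center, you can simply confirm the guest can reach the tuner's built-in web interface across the bridged connection. A minimal sketch, assuming the 192.168.200.1 default address and port 80 (this is not Ceton's software, and those values are assumptions, so check your card's actual settings):

    # Rough illustration only -- not Ceton's tuner-sharing software.
    # The address and port are assumptions; check the card's settings.
    import socket

    CETON_ADDRESS = "192.168.200.1"  # assumed InfiniTV default address
    CETON_PORT = 80                  # assumed port for the built-in web UI

    def tuner_reachable(address, port, timeout=3.0):
        """Return True if a TCP connection to the tuner succeeds."""
        try:
            with socket.create_connection((address, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        if tuner_reachable(CETON_ADDRESS, CETON_PORT):
            print("Tuner web interface is reachable from this machine.")
        else:
            print("Could not reach the tuner -- check the NIC bridge and subnet.")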

 

My only concern is that any PC you share the Ceton tuners to has to pass the Digital Cable Advisor check, and I don't remember whether that looks for hardware video acceleration. Your VM setup won't have good graphics, which I know you don't need in your case, but the tool was built to check for it.

 

I share my Ceton over the network from my HTPC to two other desktops to watch TV in other rooms right now, and it works great. In your case, all you would have to do is run the network sharing wizard on your HTPC and try it out in a VM to see if it works before you go and pull the card completely. Let us know how it goes if you try it.

 

Edit - Here's a great walkthrough from MissingRemote on how it all works: http://www.missingremote.com/guide/ceton-infinitv-network-tuner-wizard-access-infinitv-tuners-multiple-pcs. Ceton actually hired the guy (Mikinho) from MissingRemote to work for them and keep coming up with cool software, so hopefully there's more where this came from.

Edited by wodysweb

Thanks, guys. This is why we post; we get great support. :)

Thoughts:

Moving the Ceton to the server/VM would allow me to put the HTPC to sleep.

Using an HDHR Prime would also allow me to put the HTPC to sleep.

So, for the time being, I will be leaving the Ceton where it is on the HTPC, so I can sleep. :)

P.S. I've shared a tuner to my PC before, but I didn't use it much, so I put it back.

Thanks, guys. Now to focus on fine-tuning the commercial skipping.


Fantastic info. Unfortunately, since I'm in Canada, I can't use it, but I'm sure others will find it very useful. Thanks for posting it.



  • Similar Content

    • E3000
      By E3000
      Hey guys,
       
      Has anyone around here been successful in setting up Proxmox on a Gen8 using an HBA for storage?
    • E3000
      By E3000
      Hey guys, 
       
      Bit of a random question here, and I know it depends on many factors such as what they are doing, etc., but how many VMs have you guys comfortably run on a Gen8 with an E3-1265v2 and 16GB RAM? When did you start to see a performance hit compared to what the Gen8 is generally capable of?
    • E3000
      By E3000
      Hello all,
       
      A few questions for those who use Type-1 Hypervisors on their Gen8 MicroServers...
       
      I am looking to try ESXi or Proxmox and have been reading a lot of the threads on here.
      Hopefully you guys can help with some harder to find answers I have been seeking.
       
      1) Which would be the better way to setup ProxMox:
           a) Hypervisor on Internal MicroSD, VMs installed on SSD in ODD Port, Data on 4x HDDs in bays.
           b) Hypervisor on Internal USB, VMs installed on SSD in ODD Port, Data on 4x HDDs in bays.
           c) Hypervisor and VMs both installed on same SSD (partitioned?) in ODD Port, Data on 4x HDDs in bays.
           d) Hypervisor on SSD using a USB-to-SATA cable on Internal USB, VMs installed on separate SSD in ODD Port, Data on 4x HDDs in bays.
       
      2) Would a 128GB SSD be a ‘waste’ for installing a hypervisor on? How much space is typically needed?
       
      3) How many VMs have you guys run on a Gen8 comfortably without it being sluggish?
       
      4) Everyone seems to be going RAID crazy these days. Is there any reason to use it if high availability is not that necessary and a good backup plan is in place? What is wrong with separate disks (or single RAID 0s)?
       
      5) Does using Type-1 hypervisors have any effect on the internal fan speed/noise? Is it possible to have 3-5 VMs running and still have the fan speed at ~8%, as it was when I was running 2 nested (Type-2) VMs?
       
      Sorry in advance if some of these questions are silly, common knowledge, or “depends on what you are doing in the VMs!” 😆
       
      Thanks in advance to all those that help!
    • Stampede
      By Stampede
      New to the forum, and really enjoying all the discussions. 
       
      I would love to get everyone's opinion on this. I'm looking to build my first home lab for working on virtualization and a number of work-related and personal projects. I definitely see a lot of VMs in the future, not super high performance, but maybe 10-20. There are a lot of things I'm looking to experiment with and learn, so finding the most flexible base platform is my goal. This is something I'm going to be working with daily for as long as I can possibly keep it alive.
       
      I'm going to run a full vCenter setup, preferably off an SSD, and then have a 1TB RAID 1 for my VM datastores. The big question is: is there a significant enough difference between the ML10 and ML10v2 to justify spending the extra $200? My budget is tight, since I just graduated school, and I see that $200 as RAM and SSD money I could be spending. Also, to be specific, I am comparing the Xeon E3-1220v2 to the Xeon E3-1220v3.
       
       
      I'm trying to keep the total cap on spending under $500; I'm amazed that this is even possible at this price point, to be honest. I'm glad I found this forum and the ML10s, because before this I was looking at getting an ancient Intel server with dual Xeons that would do a great job of turning my electricity into heat and noise.
       
      Let me know what you guys think. Can't wait to share my progress with everyone.
      Peter
    • cocksy
      By cocksy
      Hi All,
       
      I'm just about to start playing with virtualization for the first time on my new TS140 server and was after some advice. I currently have W10 installed on an SSD, activated and running fine. I want to put Hyper-V Server 2012 R2 on a separate SSD and then convert the physical installation of W10 to a VM using Disk2vhd.
       
      So my question is: if I convert W10 to a VM and use that VM on the same physical hardware, will the existing activation still work? Or will it throw up issues?
       
      Thanks!