TechNet Top Questions - October 10, 2000

On This Page

Conundrum: Deploying Data Warehouse Project

"Permissions"

Just How Many User Accounts can I Squeeze into One Windows NT 4 Domain?

Two Networks, Two Cities: Migrate from Workgroup to Domain & Other Stuff

Analyzing Internet Information Server Statistics

SQL Server Load Balancing

Conundrum: Deploying Data Warehouse Project

Q: We needed to add new modules to our projects that use data warehousing. I implemented the modules and integrated them with our software products; however, I am facing a problem and hope you can assist me.

How can we deploy and distribute OLAP projects? All of our testing is done on our testing server, and we need to distribute the same structure (measures, dimensions, cubes) to all our customers, packaged with the software.

Also, is there a way to generate scripts, like the table-schema script generation in SQL Server 7?

I will be very grateful if you could help me. Thanking you in advance.

Best regards,
Tommy

A: Tommy,

Our tool provides all the features you need.

- Ermanno Bonifazi

Another opinion:

Tommy,

There are several options:

  1. You can archive and restore the processed OLAP database(s) and include the archive, with or without the relational data, in your application.

  2. Another option would be to copy/paste the OLAP database(s) (this option will copy only the definitions--no data) and include this along with the relational data in your application. End users will have to process the cubes themselves.

  3. Lastly, you could include the Access repository (msmdrep.rep) file along with the relational data with your application.

There are custom tools out there that script cube/dimension/database definitions. Hopefully, someone on this alias can help you with this one.

Here is a posting from Ramukar on this alias recently. He developed this and it looks to be a very good tool. Here's what Ramukar says:

"I have posted a utility that does this. Please visit:

https://beta.communities.msn.com/MicrosoftOLAPServicesUsersCommunity

and look for the message in general section."

Regards,
Robert

OK! Excellent, excellent!! This is a very good (and I believe common) question and thanks a lot to Robert and Ermanno for their suggestions. One thing that I try to do is to save my legions of readers time by providing specific directions to resources that are referenced. If I can save one person 3 minutes, I'm happy. If I can then save 100 IT Pros 300 minutes, then I'm happy as a clam. But when you add up all the readers that need the details, well, that can be a couple of person-years' worth of time that yours truly has put back into the labor pool! Now that makes me happier than a clam at high tide! Just my little contribution to the economy. Enough of that! Here are the details:

From the Web site that Robert referenced (above):

  1. Click on Message Board on the left side nav column

  2. Under Cube Design, look for the entry OLAP ScriptGenerator v1 - Saves cube designs as XML. Click on that.

  3. Therein lies the utility.
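One more detail on the archive-and-restore option Robert mentioned: SQL Server 2000 Analysis Services ships a command-line utility, msmdarch.exe, that lets you script it. The sketch below is hedged -- the server names, data path, database, and file names are all placeholders, and you should verify the exact syntax against your version's documentation (OLAP Services 7.0 exposes archiving through the OLAP Manager UI instead):

```shell
rem Hypothetical deployment sketch using msmdarch.exe (SQL Server 2000
rem Analysis Services). Servers, paths, and file names are placeholders.

rem On the test server: archive the processed OLAP database to a file
msmdarch /a TESTSRV "C:\Program Files\OLAP Services\Data\" Sales C:\Deploy\Sales.cab

rem At the customer site: restore the archive
msmdarch /r CUSTSRV "C:\Program Files\OLAP Services\Data\" C:\Deploy\Sales.cab
```

The restored database still needs its data source repointed at the customer's relational server before the cubes can be reprocessed.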

"Permissions"

Q: Hi All.

I want to create a directory structure with different permissions on each folder. After creating the structure, I want to use this structure like a template for future use. In fact, I need to replicate this tree with the permissions, too.

I know that if I copy a folder, I lose the permissions, but does anyone know a way to replicate the permissions and/or apply permissions to a folder from the command line? That way, I can create a batch file.

Thanks in advance.
Giovanni R.

A: Giovanni,

You might want to look into the Cacls utility (included natively with Windows NT), Windows 2000's XCOPY with the /O switch (which copies file ownership and ACL information), SCOPY (Windows NT 4 Resource Kit), or ROBOCOPY (Windows NT 4 Resource Kit).

-John R Buchan

Giovanni,

Couldn't have said it better myself – thanks, John! Since you didn't mention which version of the operating system you are working with, you get multiple answers! For those readers who aren't familiar with John R. Buchan, he's a very frequent contributor to the Windows NT-related newsgroups. His recommendations, suggestions, and ideas are always very well received and appreciated.
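To make those suggestions concrete, here is a batch-file sketch of Giovanni's template idea. The folder, domain, and group names are made up for illustration, so substitute your own -- and test against a scratch directory first:

```shell
rem Hypothetical template tree -- folder, domain, and group names are placeholders.
md D:\Template\Public
md D:\Template\Finance

rem Replace the ACL on Public (cacls prompts for confirmation, so pipe in a "y");
rem R = read, C = change, F = full control
echo y| cacls D:\Template\Public /G Everyone:R

rem /E edits the existing ACL instead of replacing it; /T recurses into subfolders
cacls D:\Template\Finance /T /E /G "MYDOMAIN\Finance Users":C

rem On Windows 2000, clone the whole tree with ownership and ACLs intact:
rem /E = include subdirectories (even empty ones), /I = treat target as a directory,
rem /O = copy ownership and ACL information, /K = keep file attributes
xcopy D:\Template D:\NewProject /E /I /O /K
```

On Windows NT 4, where XCOPY lacks /O, SCOPY or ROBOCOPY from the Resource Kit fills the same role.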

Just How Many User Accounts can I Squeeze into One Windows NT 4 Domain?

Q: What is the maximum number of users you can have per domain in Windows NT 4.0?

I seem to recall a max of 1000 user accounts?

Mr. Noname

The first answer is pretty brief, and attempts to get right to the point:

A: Hi,

40000 users.

-anonymous

Actually, Noname, you can't really just say that "the maximum number of user accounts that one can have in a Windows NT 4.0 domain is X". It's not as simple as that. (Whatever is??)

There are several factors that determine the maximum number of accounts that can exist. The answer is not as easy as it was when the answer to "Life, the Universe, and Everything" was "42". Not very satisfying.

A good Knowledge Base article that addresses this is "130914 - Number of Users and Groups Affects SAM Size of Domain"... here's the link to the article:

https://support.microsoft.com/default.aspx?scid=kb;en-us;130914&sd=tech

In the article, it states:

The total number of users, groups, and computers in the domain determines the overall size of the security accounts manager (SAM) database. The way groups within a domain are implemented also affects the size of the SAM database.

Because of the way the registry is managed, differences in group membership patterns, and variations in the frequency of SAM operations, it is difficult to provide exact numbers and limits for capacity planning.

Since 1995 when the original tests were made, additional tests using hardware available in the second half of 1998 indicate that Windows NT domain controllers can handle more user logons and larger SAM databases than the original recommendations.

Please review the KB article for more detail. It should help.
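The per-object sizes from that KB article (roughly 1 KB per user account, 0.5 KB per computer account, and 4 KB per group) make for quick back-of-the-envelope arithmetic against the oft-quoted 40 MB SAM guideline. The account counts below are hypothetical -- plug in your own:

```shell
# Back-of-envelope SAM sizing using KB 130914's rough per-object estimates:
# ~1 KB per user, ~0.5 KB per computer account, ~4 KB per group.
users=26000
computers=26000
groups=250

sam_kb=$((users * 1 + computers / 2 + groups * 4))
echo "Estimated SAM size: ${sam_kb} KB"   # 40000 KB, right at the ~40 MB guideline
```

So a domain of 26,000 users, each with a computer account, already brushes the guideline -- which is why "40,000 users" is only true for a domain with almost nothing in it but user accounts.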

Two Networks, Two Cities: Migrate from Workgroup to Domain & Other Stuff

Q: Here is my situation:

Two networks, two different cities, connected by T-1 on each end.

City1 - Workstations authenticate to a PDC. No BDC in place at this time. All internal IP addresses are static, using private IP addressing. Access to the Internet for all internal workstations is provided using NAT; internal workstations share one legal IP for Internet access. Routers, switches, etc., are maintained internally. I have a /26 block of legal IP addresses to work with here and not even close to that many workstations.

City2 - Basically the same scenario; however, there is no DC. Currently configured as a workgroup. Also using private IP addressing with NAT for access to the Internet. Routers and switches are maintained by a third party; we simply lease a T-1 from them and they provide all routing capabilities. I have 5 different legal IP addresses to work with (and 12 workstations).

What I need:

To migrate City2 from a workgroup to a domain, and to allow City2 to access resources on the PDC in City1.

Here is what I am thinking:

Install a BDC in City2 and synchronize with City1; resources can be shared that way. But how will the IP addressing affect this?

OR

Install a PDC in City2. Establish a trust between each City. How will IP addressing be affected?

I hope I have provided enough info. A little help and guidance would be sincerely appreciated.

Jason Hensley

A: Hi Jason,

Sounds to me like you may have to implement PAT (port address translation) on the routers on both ends to associate requests through certain ports to a specific internal IP address (i.e., your servers). This can be done and would be your best option.

My other suggestion to you would be to add a second network card to each server and enable a "legal" IP on the second card of each server. This is, of course, also suggesting that you add a BDC to your second site. Take a small hub and interconnect the router and server's second card to it. Take a second hub/switch and add the workstations to it and the original card that was on the server. Change the gateway address of your workstations to reflect the internal IP on the first network card of the local server.

Now, because you are using the public Internet to transfer data, I would suggest that you implement a VPN. Microsoft's PPTP would suffice. In other words, create a VPN connection between the two (2) servers via the public IP address that you have added to the second card. You now have your communication link between the two.

To enhance security, I would suggest that you implement IP security on your servers to allow communication between your two servers only via that public-addressed card. In other words, Server 2 will only accept inbound communications on its external card on the VPN ports, and vice versa. Also, disable certain services, such as NetBIOS, on the second card of each server, and disable IP forwarding between the two cards.

Keep in mind that when you first install your BDC, it must be able to see the PDC at the other site. I would suggest installing WINS on your PDC; add the second card, give it a public IP, and register it with WINS. Then, when you set up your BDC, be sure to use the WINS setting to point to your PDC.

This is a difficult thing to discuss via newsgroup postings, but I hope it helps in some way. I had no intention of writing such a long-winded response.

Regards,
Paul Foote

To which Jason has the inevitable follow-up question:

Which suggestion would have the best performance?

...and Paul's response:

IP addressing is a connectivity function of the network. If it's working now, don't worry about it. For the domain, I would recommend adding a BDC in City2 and operate a single domain. You should not see any performance difference between the domain scenarios.

Analyzing Internet Information Server Statistics

Q: Hello, Everybody,

Here's a quick question. I have been evaluating the WebTrends Log Analyzer for stats analysis: https://www.webtrends.com/products/log/default.htm

Are there other packages available for log file analysis?

Best regards,
Dale

A: To which John Oliver replies:

Check out "Analog" at https://www.analog.cx/ or "EasyStat" at https://www.easystat.net/. Both are good programs. EasyStat does not use log files, but is a kick butt program for Web site stats.

And Aaron Bertrand tosses in his two cents' worth (remember, readers, that these are the opinions of the newsgroup contributors):

I have to agree with Stuart: LiveStats is much more comprehensive than WebTrends.


And lastly, IT Pro Stuart Bowen offers the following:

I have been evaluating MediaHouse LiveStats 5.0. We currently use WebTrends and have found it can't do everything we need as an ISP/ASP. Go to MediaHouse's website at https://www.mediahouse.com. It is fairly straightforward to use, the support staff have been very helpful in altering the Web front-end, it is accurate up to two minutes ago, and it is about $500 for 500 sites. (It can be installed on multiple servers for load balancing at no extra cost.)

Well, then, Dale, that should give you some things to consider! See how helpful and nice most of the Newsgroup readers and posters are? What a great group of IT Pros!!

SQL Server Load Balancing

Q: Hi all,

We are setting up a web site where there will be a lot of heavy duty database queries by clients, using ActiveX components calling COM+ components on front-end IIS servers. Our database will only be about 4 GB, but I would like to set up 2-3 SQL Servers with fully identical databases so that I can load balance the query activity between them. The problem comes in when clients write to the databases: a client may write to one and then try to read what they have written from a different machine. I am looking into transactional replication but am not sure how fast and reliable it will be under load.

Is there a way of doing what I am trying to do through SQL 7 or 2000? Basically I am trying to achieve the same load balancing and fault tolerance on the back end database servers that I can have on the front end IIS servers. I know that there is partitioning, but if one machine goes down you will lose the entire database.

Thanks,
Don Payton

A: Hi Don,

I was reading your post on the microsoft.public.sqlserver.datawarehouse newsgroup. You pose a very valid question in general. In order to provide any kind of useful response, I would like to point out a few things from your question that I think require quite a bit more detail – which is probably why there weren't any responses.

First: the statements "...there will be a lot of heavy duty database queries by clients..." and "...database will only be about 4GB...".

Comments: The statement "heavy duty database queries" is not that useful in understanding the actual usage impact on the database. There are many, many factors involved in determining the impact of queries on SQL Server (or any relational database, for that matter). Typically, there is a specific set of queries run against a database that covers 80% of all the querying you're going to do. By fully understanding the types of queries that users will be executing, you can design the database for optimal performance by selecting a proper indexing structure and table design, to name just a couple of factors. And while I'm on the topic of queries, another thing that needs to be understood and considered is the transactional load that you anticipate – i.e., how many queries per minute, and what type of queries are they: selects, updates, or deletes?

Another thing related to "heavy duty database queries" is the size of the database. As you mentioned in your question, "Our database will only be about 4 GB." As you indicated, that's not a large database for SQL Server by any means. With an appropriately designed database and a properly configured server (memory, RAID 5 disk arrays, network throughput), you could have a SQL Server machine that easily handles many thousands of transactions per minute. What it boils down to is: "It depends"... on many things.
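One way to replace "heavy duty" with a number is to sketch the expected load before picking hardware. All of the figures below are hypothetical assumptions, not measurements -- substitute numbers from your own application:

```shell
# Hypothetical workload estimate -- every figure here is an assumption.
concurrent_users=500
queries_per_user_per_min=4   # assumed average across all clients
write_pct=20                 # assumed share of queries that are inserts/updates/deletes

total_qpm=$((concurrent_users * queries_per_user_per_min))
writes_qpm=$((total_qpm * write_pct / 100))

echo "${total_qpm} queries/min total, ${writes_qpm} of them writes"
```

At a few thousand queries per minute against a 4 GB database, a single well-configured server may well suffice -- but only a load test with your real query mix will tell you that.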

There are some general technical solutions that you can consider, such as Network Load Balancing (Windows 2000), SQL Server on clusters (Windows NT 4 & Windows 2000), and two-phase commit (SQL Server 7 & 2000). You may very well be able to implement a single back-end SQL Server to handle all your web site's database activity. Of course, it's premature to debate whether you will need one, two, or three (whatever) DB servers to handle your activity; it's time to set up a serious testing lab and run some load simulations to see just what you see. Please consider going to the TechNet Web site and searching for technical papers on the subjects you're interested in. There's lots of information there!